Mar 20 08:22:53 crc systemd[1]: Starting Kubernetes Kubelet...
Mar 20 08:22:53 crc restorecon[4757]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by
admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 20 08:22:53 crc restorecon[4757]: 
/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 20 08:22:53 crc restorecon[4757]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 20 08:22:53 crc restorecon[4757]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 20 08:22:53 crc restorecon[4757]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 20 08:22:54 crc restorecon[4757]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c97,c980 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c377,c642 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 08:22:54 crc restorecon[4757]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 08:22:54 crc restorecon[4757]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c0,c25 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 20 08:22:54 crc restorecon[4757]: 
/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 08:22:54 crc restorecon[4757]: 
/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 08:22:54 crc restorecon[4757]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 08:22:54 crc restorecon[4757]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 20 08:22:54 crc restorecon[4757]: 
/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 
08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:22:54 crc 
restorecon[4757]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 08:22:54 crc restorecon[4757]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Mar 20 08:22:54 crc restorecon[4757]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 20 08:22:54 crc restorecon[4757]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c37,c572 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 08:22:54 crc restorecon[4757]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 
08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 
08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc 
restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c133,c223 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 08:22:54 crc restorecon[4757]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c682,c947 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 20 08:22:54 crc restorecon[4757]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 20 08:22:54 crc restorecon[4757]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Mar 20 08:22:55 crc kubenswrapper[4903]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 20 08:22:55 crc kubenswrapper[4903]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Mar 20 08:22:55 crc kubenswrapper[4903]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 20 08:22:55 crc kubenswrapper[4903]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Mar 20 08:22:55 crc kubenswrapper[4903]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Mar 20 08:22:55 crc kubenswrapper[4903]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.193750 4903 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.200197 4903 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.200220 4903 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.200226 4903 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.200232 4903 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.200238 4903 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.200244 4903 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.200251 4903 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.200257 4903 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.200271 4903 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.200279 4903 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.200285 4903 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.200290 4903 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.200295 4903 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.200301 4903 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.200307 4903 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.200314 4903 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.200321 4903 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.200327 4903 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.200333 4903 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.200339 4903 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.200344 4903 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.200350 4903 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.200355 4903 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.200360 4903 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.200366 4903 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.200371 4903 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.200376 4903 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.200384 4903 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.200391 4903 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.200398 4903 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.200404 4903 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.200411 4903 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.200418 4903 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.200423 4903 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.200429 4903 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.200435 4903 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.200440 4903 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.200446 4903 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.200451 4903 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.200456 4903 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.200462 4903 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.200469 4903 
feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.200474 4903 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.200480 4903 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.200489 4903 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.200495 4903 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.200501 4903 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.200507 4903 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.200512 4903 feature_gate.go:330] unrecognized feature gate: Example Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.200518 4903 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.200523 4903 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.200528 4903 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.200534 4903 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.200539 4903 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.200546 4903 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
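The long run of "unrecognized feature gate" warnings here and below is the kubelet reporting gate names it does not know. The flag dump later in this log shows --feature-gates="" on the command line, so these names most plausibly come from a featureGates stanza in the config file, where cluster-level gate names (ClusterMonitoringConfig, MachineConfigNodes, and so on) sit alongside the upstream Kubernetes ones; the kubelet logs a warning for each name it cannot map and keeps starting, as the rest of this log shows. Purely as an illustration (the values below are made up, not read from this node):

    # Hypothetical featureGates fragment of a KubeletConfiguration (illustration only).
    featureGates:
      CloudDualStackNodeIPs: true       # known to the kubelet (GA) - logged at feature_gate.go:353
      KMSv1: true                       # known but deprecated - logged at feature_gate.go:351
      ValidatingAdmissionPolicy: true   # known to the kubelet (GA)
      ClusterMonitoringConfig: true     # cluster-level gate name - kubelet logs "unrecognized feature gate"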
Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.200552 4903 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.200558 4903 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.200564 4903 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.200569 4903 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.200575 4903 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.200580 4903 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.200585 4903 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.200591 4903 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.200596 4903 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.200602 4903 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.200609 4903 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.200614 4903 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.200619 4903 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.200625 4903 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.200630 4903 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.200635 4903 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.200777 4903 flags.go:64] FLAG: --address="0.0.0.0" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.200792 4903 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.200805 4903 flags.go:64] FLAG: --anonymous-auth="true" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.200814 4903 flags.go:64] FLAG: --application-metrics-count-limit="100" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.200823 4903 flags.go:64] FLAG: --authentication-token-webhook="false" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.200830 4903 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.200839 4903 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.200847 4903 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.200854 4903 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.200861 4903 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.200869 4903 flags.go:64] FLAG: 
--bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.200875 4903 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.200881 4903 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.200888 4903 flags.go:64] FLAG: --cgroup-root="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.200894 4903 flags.go:64] FLAG: --cgroups-per-qos="true" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.200901 4903 flags.go:64] FLAG: --client-ca-file="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.200907 4903 flags.go:64] FLAG: --cloud-config="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.200913 4903 flags.go:64] FLAG: --cloud-provider="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.200919 4903 flags.go:64] FLAG: --cluster-dns="[]" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.200927 4903 flags.go:64] FLAG: --cluster-domain="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.200933 4903 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.200939 4903 flags.go:64] FLAG: --config-dir="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.200945 4903 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.200952 4903 flags.go:64] FLAG: --container-log-max-files="5" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.200961 4903 flags.go:64] FLAG: --container-log-max-size="10Mi" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.200967 4903 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.200974 4903 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.200980 4903 flags.go:64] FLAG: --containerd-namespace="k8s.io" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.200986 4903 flags.go:64] FLAG: --contention-profiling="false" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.200993 4903 flags.go:64] FLAG: --cpu-cfs-quota="true" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.200999 4903 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.201005 4903 flags.go:64] FLAG: --cpu-manager-policy="none" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.201012 4903 flags.go:64] FLAG: --cpu-manager-policy-options="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.201019 4903 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.201026 4903 flags.go:64] FLAG: --enable-controller-attach-detach="true" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.201065 4903 flags.go:64] FLAG: --enable-debugging-handlers="true" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.201072 4903 flags.go:64] FLAG: --enable-load-reader="false" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.201078 4903 flags.go:64] FLAG: --enable-server="true" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.201084 4903 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.201094 4903 flags.go:64] FLAG: --event-burst="100" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.201100 4903 flags.go:64] FLAG: --event-qps="50" 
Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.201107 4903 flags.go:64] FLAG: --event-storage-age-limit="default=0" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.201113 4903 flags.go:64] FLAG: --event-storage-event-limit="default=0" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.201119 4903 flags.go:64] FLAG: --eviction-hard="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.201127 4903 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.201133 4903 flags.go:64] FLAG: --eviction-minimum-reclaim="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.201139 4903 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.201146 4903 flags.go:64] FLAG: --eviction-soft="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.201152 4903 flags.go:64] FLAG: --eviction-soft-grace-period="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.201157 4903 flags.go:64] FLAG: --exit-on-lock-contention="false" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.201164 4903 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.201170 4903 flags.go:64] FLAG: --experimental-mounter-path="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.201176 4903 flags.go:64] FLAG: --fail-cgroupv1="false" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.201182 4903 flags.go:64] FLAG: --fail-swap-on="true" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.201188 4903 flags.go:64] FLAG: --feature-gates="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.201195 4903 flags.go:64] FLAG: --file-check-frequency="20s" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.201202 4903 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.201208 4903 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.201215 4903 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.201221 4903 flags.go:64] FLAG: --healthz-port="10248" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.201227 4903 flags.go:64] FLAG: --help="false" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.201233 4903 flags.go:64] FLAG: --hostname-override="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.201239 4903 flags.go:64] FLAG: --housekeeping-interval="10s" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.201245 4903 flags.go:64] FLAG: --http-check-frequency="20s" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.201252 4903 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.201258 4903 flags.go:64] FLAG: --image-credential-provider-config="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.201267 4903 flags.go:64] FLAG: --image-gc-high-threshold="85" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.201274 4903 flags.go:64] FLAG: --image-gc-low-threshold="80" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.201281 4903 flags.go:64] FLAG: --image-service-endpoint="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.201287 4903 flags.go:64] FLAG: --kernel-memcg-notification="false" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.201293 4903 flags.go:64] FLAG: --kube-api-burst="100" Mar 20 08:22:55 crc 
kubenswrapper[4903]: I0320 08:22:55.201299 4903 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.201305 4903 flags.go:64] FLAG: --kube-api-qps="50" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.201312 4903 flags.go:64] FLAG: --kube-reserved="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.201318 4903 flags.go:64] FLAG: --kube-reserved-cgroup="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.201323 4903 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.201330 4903 flags.go:64] FLAG: --kubelet-cgroups="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.201337 4903 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.201343 4903 flags.go:64] FLAG: --lock-file="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.201349 4903 flags.go:64] FLAG: --log-cadvisor-usage="false" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.201356 4903 flags.go:64] FLAG: --log-flush-frequency="5s" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.201362 4903 flags.go:64] FLAG: --log-json-info-buffer-size="0" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.201371 4903 flags.go:64] FLAG: --log-json-split-stream="false" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.201377 4903 flags.go:64] FLAG: --log-text-info-buffer-size="0" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.201383 4903 flags.go:64] FLAG: --log-text-split-stream="false" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.201389 4903 flags.go:64] FLAG: --logging-format="text" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.201395 4903 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.201427 4903 flags.go:64] FLAG: --make-iptables-util-chains="true" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.201433 4903 flags.go:64] FLAG: --manifest-url="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.201440 4903 flags.go:64] FLAG: --manifest-url-header="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.201448 4903 flags.go:64] FLAG: --max-housekeeping-interval="15s" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.201455 4903 flags.go:64] FLAG: --max-open-files="1000000" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.201462 4903 flags.go:64] FLAG: --max-pods="110" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.201468 4903 flags.go:64] FLAG: --maximum-dead-containers="-1" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.201474 4903 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.201480 4903 flags.go:64] FLAG: --memory-manager-policy="None" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.201486 4903 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.201492 4903 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.201499 4903 flags.go:64] FLAG: --node-ip="192.168.126.11" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.201506 4903 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 
08:22:55.201520 4903 flags.go:64] FLAG: --node-status-max-images="50" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.201526 4903 flags.go:64] FLAG: --node-status-update-frequency="10s" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.201532 4903 flags.go:64] FLAG: --oom-score-adj="-999" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.201538 4903 flags.go:64] FLAG: --pod-cidr="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.201545 4903 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.201565 4903 flags.go:64] FLAG: --pod-manifest-path="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.201570 4903 flags.go:64] FLAG: --pod-max-pids="-1" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.201577 4903 flags.go:64] FLAG: --pods-per-core="0" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.201583 4903 flags.go:64] FLAG: --port="10250" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.201589 4903 flags.go:64] FLAG: --protect-kernel-defaults="false" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.201595 4903 flags.go:64] FLAG: --provider-id="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.201601 4903 flags.go:64] FLAG: --qos-reserved="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.201607 4903 flags.go:64] FLAG: --read-only-port="10255" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.201615 4903 flags.go:64] FLAG: --register-node="true" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.201622 4903 flags.go:64] FLAG: --register-schedulable="true" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.201628 4903 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.201643 4903 flags.go:64] FLAG: --registry-burst="10" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.201649 4903 flags.go:64] FLAG: --registry-qps="5" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.201655 4903 flags.go:64] FLAG: --reserved-cpus="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.201661 4903 flags.go:64] FLAG: --reserved-memory="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.201669 4903 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.201675 4903 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.201681 4903 flags.go:64] FLAG: --rotate-certificates="false" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.201687 4903 flags.go:64] FLAG: --rotate-server-certificates="false" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.201693 4903 flags.go:64] FLAG: --runonce="false" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.201699 4903 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.201705 4903 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.201711 4903 flags.go:64] FLAG: --seccomp-default="false" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.201717 4903 flags.go:64] FLAG: --serialize-image-pulls="true" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.201723 4903 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Mar 20 08:22:55 crc 
kubenswrapper[4903]: I0320 08:22:55.201730 4903 flags.go:64] FLAG: --storage-driver-db="cadvisor" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.201737 4903 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.201744 4903 flags.go:64] FLAG: --storage-driver-password="root" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.201751 4903 flags.go:64] FLAG: --storage-driver-secure="false" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.201757 4903 flags.go:64] FLAG: --storage-driver-table="stats" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.201764 4903 flags.go:64] FLAG: --storage-driver-user="root" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.201771 4903 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.201779 4903 flags.go:64] FLAG: --sync-frequency="1m0s" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.201800 4903 flags.go:64] FLAG: --system-cgroups="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.201829 4903 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.201839 4903 flags.go:64] FLAG: --system-reserved-cgroup="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.201845 4903 flags.go:64] FLAG: --tls-cert-file="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.201850 4903 flags.go:64] FLAG: --tls-cipher-suites="[]" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.201857 4903 flags.go:64] FLAG: --tls-min-version="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.201864 4903 flags.go:64] FLAG: --tls-private-key-file="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.201869 4903 flags.go:64] FLAG: --topology-manager-policy="none" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.201876 4903 flags.go:64] FLAG: --topology-manager-policy-options="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.201882 4903 flags.go:64] FLAG: --topology-manager-scope="container" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.201888 4903 flags.go:64] FLAG: --v="2" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.201898 4903 flags.go:64] FLAG: --version="false" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.201905 4903 flags.go:64] FLAG: --vmodule="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.201913 4903 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.201919 4903 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.202082 4903 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.202090 4903 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.202096 4903 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.202102 4903 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.202108 4903 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.202113 4903 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 
08:22:55.202118 4903 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.202126 4903 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.202134 4903 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.202140 4903 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.202147 4903 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.202152 4903 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.202159 4903 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.202165 4903 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.202171 4903 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.202176 4903 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.202182 4903 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.202187 4903 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.202192 4903 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.202198 4903 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.202203 4903 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.202209 4903 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.202214 4903 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.202219 4903 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.202224 4903 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.202230 4903 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.202235 4903 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.202240 4903 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.202245 4903 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.202251 4903 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.202256 4903 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.202262 4903 feature_gate.go:330] unrecognized feature 
gate: AdminNetworkPolicy Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.202268 4903 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.202273 4903 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.202279 4903 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.202284 4903 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.202289 4903 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.202294 4903 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.202299 4903 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.202304 4903 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.202310 4903 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.202315 4903 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.202321 4903 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.202326 4903 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.202332 4903 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.202337 4903 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.202342 4903 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.202347 4903 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.202353 4903 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.202358 4903 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.202363 4903 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.202368 4903 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.202374 4903 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.202379 4903 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.202386 4903 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.202393 4903 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.202399 4903 feature_gate.go:330] unrecognized feature gate: Example Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.202405 4903 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.202410 4903 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.202416 4903 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.202422 4903 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.202428 4903 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.202434 4903 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.202443 4903 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.202449 4903 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.202455 4903 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.202462 4903 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.202469 4903 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.202476 4903 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.202481 4903 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.202487 4903 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.202496 4903 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.216865 4903 server.go:491] "Kubelet version" kubeletVersion="v1.31.5" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.216949 4903 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.217146 4903 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.217174 4903 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.217186 4903 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.217199 4903 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.217209 4903 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.217218 4903 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.217231 4903 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
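By this point the kubelet has echoed every command-line flag (the flags.go:64 dump above), resolved its effective feature-gate map, and reported itself as v1.31.5. To see the configuration that actually took effect once flags and the config file were merged, one option (assuming working cluster credentials and a reachable API server) is the node's configz endpoint; reading /etc/kubernetes/kubelet.conf directly on the host shows the file-side half:

    # Sketch: inspect the merged kubelet configuration for this node ("crc" per the log prefix).
    # Requires cluster access with oc or kubectl; jq is only for readability.
    oc get --raw "/api/v1/nodes/crc/proxy/configz" | jq .kubeletconfig

    # Or, on the node itself, view the config file the --config flag points at:
    cat /etc/kubernetes/kubelet.conf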
Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.217247 4903 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.217258 4903 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.217268 4903 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.217277 4903 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.217286 4903 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.217295 4903 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.217305 4903 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.217314 4903 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.217324 4903 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.217332 4903 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.217341 4903 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.217351 4903 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.217361 4903 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.217371 4903 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.217380 4903 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.217391 4903 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.217400 4903 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.217448 4903 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.217457 4903 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.217466 4903 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.217474 4903 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.217484 4903 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.217493 4903 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.217502 4903 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.217511 4903 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.217519 4903 
feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.217528 4903 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.217537 4903 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.217545 4903 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.217554 4903 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.217563 4903 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.217571 4903 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.217580 4903 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.217589 4903 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.217598 4903 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.217606 4903 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.217615 4903 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.217625 4903 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.217634 4903 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.217645 4903 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.217656 4903 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.217664 4903 feature_gate.go:330] unrecognized feature gate: Example Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.217674 4903 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.217683 4903 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.217692 4903 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.217704 4903 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.217715 4903 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.217724 4903 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.217734 4903 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.217743 4903 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.217751 4903 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.217760 4903 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.217770 4903 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.217779 4903 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.217789 4903 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.217797 4903 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.217809 4903 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.217820 4903 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.217830 4903 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.217839 4903 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.217848 4903 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.217856 4903 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.217865 4903 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.217874 4903 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.217889 4903 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.218196 4903 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.218216 4903 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.218225 4903 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 
08:22:55.218235 4903 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.218243 4903 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.218253 4903 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.218262 4903 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.218271 4903 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.218280 4903 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.218288 4903 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.218301 4903 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.218310 4903 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.218319 4903 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.218329 4903 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.218339 4903 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.218348 4903 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.218357 4903 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.218366 4903 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.218376 4903 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.218385 4903 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.218393 4903 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.218402 4903 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.218411 4903 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.218422 4903 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.218431 4903 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.218439 4903 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.218450 4903 feature_gate.go:330] unrecognized feature gate: Example Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.218459 4903 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.218468 4903 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 20 08:22:55 crc kubenswrapper[4903]: 
W0320 08:22:55.218478 4903 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.218487 4903 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.218495 4903 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.218504 4903 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.218512 4903 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.218522 4903 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.218530 4903 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.218541 4903 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.218554 4903 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.218566 4903 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.218578 4903 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.218592 4903 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.218603 4903 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.218621 4903 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.218636 4903 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.218647 4903 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.218659 4903 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.218671 4903 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.218682 4903 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.218693 4903 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.218705 4903 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.218719 4903 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.218729 4903 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.218739 4903 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.218748 4903 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.218757 4903 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.218767 4903 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.218777 4903 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.218785 4903 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.218795 4903 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.218804 4903 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.218813 4903 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.218823 4903 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.218832 4903 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.218841 4903 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.218849 4903 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.218858 4903 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.218867 4903 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.218876 4903 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.218884 4903 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.218896 4903 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.218906 4903 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.218921 4903 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.220417 4903 server.go:940] "Client rotation is on, will bootstrap in background" Mar 20 08:22:55 crc kubenswrapper[4903]: E0320 08:22:55.226502 4903 bootstrap.go:266] "Unhandled Error" err="part of the existing bootstrap client certificate in /var/lib/kubelet/kubeconfig is expired: 2026-02-24 05:52:08 +0000 UTC" logger="UnhandledError" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.231428 4903 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.231600 4903 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.233790 4903 server.go:997] "Starting client certificate rotation" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.233854 4903 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.234059 4903 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.261018 4903 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 20 08:22:55 crc kubenswrapper[4903]: E0320 08:22:55.264164 4903 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.164:6443: connect: connection refused" logger="UnhandledError" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.264473 4903 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.285441 4903 log.go:25] "Validated CRI v1 runtime API" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.322677 4903 log.go:25] "Validated CRI v1 image API" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.325807 4903 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.332387 4903 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2026-03-20-08-17-57-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.332470 4903 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} 
/dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}] Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.366366 4903 manager.go:217] Machine: {Timestamp:2026-03-20 08:22:55.362946726 +0000 UTC m=+0.579847111 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2799998 MemoryCapacity:33654124544 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:39716343-11aa-4130-bd5e-584ebc4907c0 BootID:2fafe47f-e5df-46e6-9c53-b5b631ab61f4 Filesystems:[{Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365408768 Type:vfs Inodes:821633 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108169 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827060224 Type:vfs Inodes:4108169 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:80:b5:4c Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:80:b5:4c Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:66:c2:45 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:1a:c6:40 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:1f:11:8d Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:55:47:a7 Speed:-1 Mtu:1496} {Name:ens7.23 MacAddress:52:54:00:76:8d:ef Speed:-1 Mtu:1496} {Name:eth10 MacAddress:ca:1e:5e:15:87:ef Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:7a:cd:d1:bf:08:ba Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654124544 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 
Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.366850 4903 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.367111 4903 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.367683 4903 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.368009 4903 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.368107 4903 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.368592 4903 topology_manager.go:138] "Creating topology manager with none policy" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.368615 4903 container_manager_linux.go:303] "Creating device plugin manager" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.369392 4903 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.369494 4903 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.370475 4903 state_mem.go:36] "Initialized new in-memory state store" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.370658 4903 server.go:1245] "Using root directory" path="/var/lib/kubelet" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.375713 4903 kubelet.go:418] "Attempting to sync node with API server" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.375770 4903 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" 
Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.375818 4903 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.375844 4903 kubelet.go:324] "Adding apiserver pod source" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.375865 4903 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.381283 4903 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.382741 4903 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.383403 4903 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.164:6443: connect: connection refused Mar 20 08:22:55 crc kubenswrapper[4903]: E0320 08:22:55.383545 4903 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.164:6443: connect: connection refused" logger="UnhandledError" Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.383550 4903 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.164:6443: connect: connection refused Mar 20 08:22:55 crc kubenswrapper[4903]: E0320 08:22:55.383665 4903 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.164:6443: connect: connection refused" logger="UnhandledError" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.385146 4903 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.387402 4903 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.387450 4903 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.387466 4903 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.387480 4903 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.387504 4903 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.387519 4903 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.387533 4903 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.387570 4903 plugins.go:603] "Loaded volume plugin" 
pluginName="kubernetes.io/downward-api" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.387595 4903 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.387611 4903 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.387653 4903 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.387667 4903 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.388528 4903 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.389565 4903 server.go:1280] "Started kubelet" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.390427 4903 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.389893 4903 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.391349 4903 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.164:6443: connect: connection refused Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.391962 4903 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Mar 20 08:22:55 crc systemd[1]: Started Kubernetes Kubelet. Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.402916 4903 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.402976 4903 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.403154 4903 volume_manager.go:287] "The desired_state_of_world populator starts" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.403186 4903 volume_manager.go:289] "Starting Kubelet Volume Manager" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.403408 4903 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Mar 20 08:22:55 crc kubenswrapper[4903]: E0320 08:22:55.404179 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.405510 4903 factory.go:55] Registering systemd factory Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.405560 4903 factory.go:221] Registration of the systemd container factory successfully Mar 20 08:22:55 crc kubenswrapper[4903]: E0320 08:22:55.403585 4903 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.164:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.189e7f07f56b25b4 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:22:55.38949266 +0000 UTC m=+0.606393015,LastTimestamp:2026-03-20 08:22:55.38949266 +0000 UTC 
m=+0.606393015,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.406161 4903 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.164:6443: connect: connection refused Mar 20 08:22:55 crc kubenswrapper[4903]: E0320 08:22:55.406329 4903 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.164:6443: connect: connection refused" logger="UnhandledError" Mar 20 08:22:55 crc kubenswrapper[4903]: E0320 08:22:55.409063 4903 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.164:6443: connect: connection refused" interval="200ms" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.409409 4903 factory.go:153] Registering CRI-O factory Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.409496 4903 factory.go:221] Registration of the crio container factory successfully Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.409637 4903 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.409715 4903 factory.go:103] Registering Raw factory Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.409846 4903 manager.go:1196] Started watching for new ooms in manager Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.414184 4903 server.go:460] "Adding debug handlers to kubelet server" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.414467 4903 manager.go:319] Starting recovery of all containers Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.420403 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.420597 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.420719 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.420975 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Mar 20 08:22:55 crc 
kubenswrapper[4903]: I0320 08:22:55.421161 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.421279 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.421390 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.421556 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.421690 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.421813 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.421926 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.422062 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.422199 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.422319 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.422440 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.422553 4903 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.422669 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.422783 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.422894 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.423018 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.423166 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.423279 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.423409 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.423527 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.423658 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.423787 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.423913 4903 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.424068 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.424219 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.424349 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.424466 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.424590 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.424709 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.424826 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.424944 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.425197 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.425323 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.425482 4903 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.425602 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.425731 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.425854 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.426058 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.426188 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.426302 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.426415 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.426553 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.426672 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.426781 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.426900 4903 reconstruct.go:130] "Volume is 
marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.427016 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.427167 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.427299 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.427422 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.427552 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.427671 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.427785 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.427907 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.428024 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.428170 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.428289 4903 reconstruct.go:130] "Volume is marked as uncertain and added 
into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.428403 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.428534 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.428648 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.428758 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.428871 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.428984 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.429145 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.429282 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.429397 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.429530 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.429650 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" 
volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.429761 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.429881 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.430002 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.430142 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.430310 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.430422 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.430563 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.430677 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.430810 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.430945 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.431083 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" 
volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.434378 4903 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.434472 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.434501 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.434518 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.434537 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.434555 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.434574 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.434599 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.434617 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.434668 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.434683 4903 reconstruct.go:130] "Volume is marked as uncertain and 
added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.434709 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.434731 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.434752 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.434774 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.434796 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.434816 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.434837 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.434876 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.434991 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.435013 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.435057 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.435079 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.435114 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.435142 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.435167 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.435192 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.435229 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.435253 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.435278 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.435306 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.435329 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.435352 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.435382 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.435404 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.435434 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.435458 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.435482 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.435503 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.435535 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.435559 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.435581 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.435602 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.435634 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.435656 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.435679 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.435705 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.435727 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.435757 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.435779 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.435801 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.435823 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.435844 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.435873 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.435894 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" 
volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.435923 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.435945 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.435975 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.435996 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.436017 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.436099 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.436123 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.436168 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.436197 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.436216 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.436239 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.436268 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.436289 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.436313 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.436335 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.436357 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.436384 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.436406 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.436425 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.436444 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.436465 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.436484 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" 
volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.436505 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.436529 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.436550 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.436569 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.436590 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.436615 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.436635 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.436655 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.436675 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.436694 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.436714 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" 
volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.436735 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.436757 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.436782 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.436807 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.436841 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.436862 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.436888 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.436909 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.436929 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.436947 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.436966 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" 
volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.436987 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.437123 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.447015 4903 manager.go:324] Recovery completed Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.448005 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.448165 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.448215 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.448241 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.448767 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.448791 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.448862 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.448941 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.448958 4903 reconstruct.go:130] "Volume is marked as uncertain and added into 
the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.448980 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.448996 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.449013 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.449164 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.449181 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.449206 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.449219 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.449240 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.449258 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.449274 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.449301 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.449316 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.449337 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.449355 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.449370 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.449386 4903 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.449399 4903 reconstruct.go:97] "Volume reconstruction finished" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.449408 4903 reconciler.go:26] "Reconciler: start to sync state" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.467921 4903 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.470219 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.470333 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.470366 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.472077 4903 cpu_manager.go:225] "Starting CPU manager" policy="none" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.472100 4903 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.472122 4903 state_mem.go:36] "Initialized new in-memory state store" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.486590 4903 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.489516 4903 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.489559 4903 status_manager.go:217] "Starting to sync pod status with apiserver" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.489595 4903 kubelet.go:2335] "Starting kubelet main sync loop" Mar 20 08:22:55 crc kubenswrapper[4903]: E0320 08:22:55.489804 4903 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.490484 4903 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.164:6443: connect: connection refused Mar 20 08:22:55 crc kubenswrapper[4903]: E0320 08:22:55.490548 4903 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.164:6443: connect: connection refused" logger="UnhandledError" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.500421 4903 policy_none.go:49] "None policy: Start" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.501830 4903 memory_manager.go:170] "Starting memorymanager" policy="None" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.501880 4903 state_mem.go:35] "Initializing new in-memory state store" Mar 20 08:22:55 crc kubenswrapper[4903]: E0320 08:22:55.504322 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.565618 4903 manager.go:334] "Starting Device Plugin manager" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.565714 4903 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.565736 4903 server.go:79] "Starting device plugin registration server" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.566438 4903 eviction_manager.go:189] "Eviction manager: starting control loop" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.566497 4903 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.568484 4903 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.569179 4903 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.569210 4903 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Mar 20 08:22:55 crc kubenswrapper[4903]: E0320 08:22:55.584387 4903 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.590883 4903 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc"] Mar 20 08:22:55 crc kubenswrapper[4903]: 
I0320 08:22:55.591064 4903 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.593124 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.593204 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.593222 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.593556 4903 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.593827 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.593958 4903 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.594899 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.594952 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.594966 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.595233 4903 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.595440 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.595894 4903 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.595643 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.596876 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.596891 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.598114 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.598143 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.598154 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.598383 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.598406 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.598418 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.598521 4903 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.599486 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.599574 4903 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.600491 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.600519 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.600531 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.600644 4903 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.600942 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.601066 4903 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.601421 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.601540 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.601567 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.602176 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.602229 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.602245 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.602536 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.602583 4903 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.603150 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.603198 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.603217 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.604265 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.604305 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.604323 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 08:22:55 crc kubenswrapper[4903]: E0320 08:22:55.611021 4903 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.164:6443: connect: connection refused" interval="400ms" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.652113 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.652206 4903 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.652246 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.652277 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.652312 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.652341 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.652372 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.652404 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.652433 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.652462 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.652493 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.652523 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.652583 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.652612 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.652650 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.666649 4903 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.668517 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.668599 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.668609 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.668661 4903 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 08:22:55 crc kubenswrapper[4903]: E0320 08:22:55.669319 4903 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.164:6443: connect: connection refused" node="crc" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.755075 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.755160 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 08:22:55 crc 
kubenswrapper[4903]: I0320 08:22:55.755204 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.755237 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.755277 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.755309 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.755340 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.755345 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.755445 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.755473 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.755527 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.755458 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: 
\"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.755448 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.755564 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.755378 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.755706 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.755553 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.755762 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.755738 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.755811 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.755851 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 
08:22:55.755856 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.755904 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.755925 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.755948 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.755949 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.755978 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.756004 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.756074 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.756112 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.869537 4903 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.871493 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.871567 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 08:22:55 crc 
kubenswrapper[4903]: I0320 08:22:55.871594 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.871640 4903 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 08:22:55 crc kubenswrapper[4903]: E0320 08:22:55.872226 4903 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.164:6443: connect: connection refused" node="crc" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.925959 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.933004 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.962651 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.985692 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.986936 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-33f1a6190cd256f9acca6f415fa86d33d17ce206690f05ef247d4ec45aafb136 WatchSource:0}: Error finding container 33f1a6190cd256f9acca6f415fa86d33d17ce206690f05ef247d4ec45aafb136: Status 404 returned error can't find the container with id 33f1a6190cd256f9acca6f415fa86d33d17ce206690f05ef247d4ec45aafb136 Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.988769 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-3dc44286156c53ce509f73bd7998f944b0e29c2b748dfcbb0343bc6007d49a5b WatchSource:0}: Error finding container 3dc44286156c53ce509f73bd7998f944b0e29c2b748dfcbb0343bc6007d49a5b: Status 404 returned error can't find the container with id 3dc44286156c53ce509f73bd7998f944b0e29c2b748dfcbb0343bc6007d49a5b Mar 20 08:22:55 crc kubenswrapper[4903]: I0320 08:22:55.993385 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 20 08:22:55 crc kubenswrapper[4903]: W0320 08:22:55.996100 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-dd46bbb6097bb20bbf8fe016c5a8dc9b35afe36db7f017512dced5b0ad4fc448 WatchSource:0}: Error finding container dd46bbb6097bb20bbf8fe016c5a8dc9b35afe36db7f017512dced5b0ad4fc448: Status 404 returned error can't find the container with id dd46bbb6097bb20bbf8fe016c5a8dc9b35afe36db7f017512dced5b0ad4fc448 Mar 20 08:22:56 crc kubenswrapper[4903]: W0320 08:22:56.005228 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-8d5007af89a5510cf984cdd88ea10aaf27e39e9a7b7bc7fe09f893d26f439e50 WatchSource:0}: Error finding container 8d5007af89a5510cf984cdd88ea10aaf27e39e9a7b7bc7fe09f893d26f439e50: Status 404 returned error can't find the container with id 8d5007af89a5510cf984cdd88ea10aaf27e39e9a7b7bc7fe09f893d26f439e50 Mar 20 08:22:56 crc kubenswrapper[4903]: W0320 08:22:56.011200 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-3f55a6dd012138da9513d868ba254dc9f0c252e495d9f7d0ed3adc62744f97b3 WatchSource:0}: Error finding container 3f55a6dd012138da9513d868ba254dc9f0c252e495d9f7d0ed3adc62744f97b3: Status 404 returned error can't find the container with id 3f55a6dd012138da9513d868ba254dc9f0c252e495d9f7d0ed3adc62744f97b3 Mar 20 08:22:56 crc kubenswrapper[4903]: E0320 08:22:56.011868 4903 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.164:6443: connect: connection refused" interval="800ms" Mar 20 08:22:56 crc kubenswrapper[4903]: W0320 08:22:56.212975 4903 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.164:6443: connect: connection refused Mar 20 08:22:56 crc kubenswrapper[4903]: E0320 08:22:56.213136 4903 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.164:6443: connect: connection refused" logger="UnhandledError" Mar 20 08:22:56 crc kubenswrapper[4903]: W0320 08:22:56.218110 4903 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.164:6443: connect: connection refused Mar 20 08:22:56 crc kubenswrapper[4903]: E0320 08:22:56.218188 4903 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.164:6443: connect: connection refused" logger="UnhandledError" Mar 20 08:22:56 crc kubenswrapper[4903]: I0320 08:22:56.272370 4903 
kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 08:22:56 crc kubenswrapper[4903]: I0320 08:22:56.274149 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 08:22:56 crc kubenswrapper[4903]: I0320 08:22:56.274208 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 08:22:56 crc kubenswrapper[4903]: I0320 08:22:56.274227 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 08:22:56 crc kubenswrapper[4903]: I0320 08:22:56.274265 4903 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 08:22:56 crc kubenswrapper[4903]: E0320 08:22:56.274935 4903 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.164:6443: connect: connection refused" node="crc" Mar 20 08:22:56 crc kubenswrapper[4903]: W0320 08:22:56.293639 4903 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.164:6443: connect: connection refused Mar 20 08:22:56 crc kubenswrapper[4903]: E0320 08:22:56.293722 4903 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.164:6443: connect: connection refused" logger="UnhandledError" Mar 20 08:22:56 crc kubenswrapper[4903]: I0320 08:22:56.392932 4903 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.164:6443: connect: connection refused Mar 20 08:22:56 crc kubenswrapper[4903]: I0320 08:22:56.494199 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"33f1a6190cd256f9acca6f415fa86d33d17ce206690f05ef247d4ec45aafb136"} Mar 20 08:22:56 crc kubenswrapper[4903]: I0320 08:22:56.495676 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"3dc44286156c53ce509f73bd7998f944b0e29c2b748dfcbb0343bc6007d49a5b"} Mar 20 08:22:56 crc kubenswrapper[4903]: I0320 08:22:56.496355 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"3f55a6dd012138da9513d868ba254dc9f0c252e495d9f7d0ed3adc62744f97b3"} Mar 20 08:22:56 crc kubenswrapper[4903]: I0320 08:22:56.498884 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"8d5007af89a5510cf984cdd88ea10aaf27e39e9a7b7bc7fe09f893d26f439e50"} Mar 20 08:22:56 crc kubenswrapper[4903]: I0320 08:22:56.499632 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"dd46bbb6097bb20bbf8fe016c5a8dc9b35afe36db7f017512dced5b0ad4fc448"} Mar 20 08:22:56 crc kubenswrapper[4903]: W0320 08:22:56.658624 4903 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.164:6443: connect: connection refused Mar 20 08:22:56 crc kubenswrapper[4903]: E0320 08:22:56.659160 4903 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.164:6443: connect: connection refused" logger="UnhandledError" Mar 20 08:22:56 crc kubenswrapper[4903]: E0320 08:22:56.813067 4903 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.164:6443: connect: connection refused" interval="1.6s" Mar 20 08:22:57 crc kubenswrapper[4903]: I0320 08:22:57.075638 4903 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 08:22:57 crc kubenswrapper[4903]: I0320 08:22:57.077127 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 08:22:57 crc kubenswrapper[4903]: I0320 08:22:57.077193 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 08:22:57 crc kubenswrapper[4903]: I0320 08:22:57.077207 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 08:22:57 crc kubenswrapper[4903]: I0320 08:22:57.077239 4903 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 08:22:57 crc kubenswrapper[4903]: E0320 08:22:57.077841 4903 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.164:6443: connect: connection refused" node="crc" Mar 20 08:22:57 crc kubenswrapper[4903]: I0320 08:22:57.392556 4903 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.164:6443: connect: connection refused Mar 20 08:22:57 crc kubenswrapper[4903]: I0320 08:22:57.461911 4903 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 20 08:22:57 crc kubenswrapper[4903]: E0320 08:22:57.462780 4903 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.164:6443: connect: connection refused" logger="UnhandledError" Mar 20 08:22:57 crc kubenswrapper[4903]: I0320 08:22:57.504936 4903 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="bd9be554cf71c22070cf33a069a61eb89b6e2b34f7b0ddb254af4f5656f2d44f" exitCode=0 Mar 20 08:22:57 crc kubenswrapper[4903]: I0320 08:22:57.505070 
4903 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 08:22:57 crc kubenswrapper[4903]: I0320 08:22:57.505084 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"bd9be554cf71c22070cf33a069a61eb89b6e2b34f7b0ddb254af4f5656f2d44f"} Mar 20 08:22:57 crc kubenswrapper[4903]: I0320 08:22:57.506273 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 08:22:57 crc kubenswrapper[4903]: I0320 08:22:57.506309 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 08:22:57 crc kubenswrapper[4903]: I0320 08:22:57.506321 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 08:22:57 crc kubenswrapper[4903]: I0320 08:22:57.508096 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"05ab2ca588ed5dbc2286c1407a66302b98ec590fbf806657e3212ea1f9772cc0"} Mar 20 08:22:57 crc kubenswrapper[4903]: I0320 08:22:57.508126 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"02e0157fc952540ba5a455bcfdd80df3af92602fdc4f3bce5d3d3bfd2551c582"} Mar 20 08:22:57 crc kubenswrapper[4903]: I0320 08:22:57.508138 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"ae75429275f0159b6c1185aa41107dac0937f87a905e7218b2cbfa9080f692b2"} Mar 20 08:22:57 crc kubenswrapper[4903]: I0320 08:22:57.508150 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"62d8b4b3660cba4aedff2babcee4801e661d002a87053e009eab4d967a7a8746"} Mar 20 08:22:57 crc kubenswrapper[4903]: I0320 08:22:57.508195 4903 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 08:22:57 crc kubenswrapper[4903]: I0320 08:22:57.509157 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 08:22:57 crc kubenswrapper[4903]: I0320 08:22:57.509190 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 08:22:57 crc kubenswrapper[4903]: I0320 08:22:57.509198 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 08:22:57 crc kubenswrapper[4903]: I0320 08:22:57.509501 4903 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="17769cf4064c962bbfd92f2b8e377ba2acb97a93410e58e3e9c07f6aabd1ac41" exitCode=0 Mar 20 08:22:57 crc kubenswrapper[4903]: I0320 08:22:57.509565 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"17769cf4064c962bbfd92f2b8e377ba2acb97a93410e58e3e9c07f6aabd1ac41"} Mar 20 08:22:57 crc 
kubenswrapper[4903]: I0320 08:22:57.509624 4903 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 08:22:57 crc kubenswrapper[4903]: I0320 08:22:57.510441 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 08:22:57 crc kubenswrapper[4903]: I0320 08:22:57.510468 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 08:22:57 crc kubenswrapper[4903]: I0320 08:22:57.510479 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 08:22:57 crc kubenswrapper[4903]: I0320 08:22:57.511769 4903 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 08:22:57 crc kubenswrapper[4903]: I0320 08:22:57.511981 4903 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="ab19cdd5978cdbd6243d7f778b7f495ea55c31d714bff108e464390ac039b71f" exitCode=0 Mar 20 08:22:57 crc kubenswrapper[4903]: I0320 08:22:57.512069 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"ab19cdd5978cdbd6243d7f778b7f495ea55c31d714bff108e464390ac039b71f"} Mar 20 08:22:57 crc kubenswrapper[4903]: I0320 08:22:57.512299 4903 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 08:22:57 crc kubenswrapper[4903]: I0320 08:22:57.512819 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 08:22:57 crc kubenswrapper[4903]: I0320 08:22:57.512866 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 08:22:57 crc kubenswrapper[4903]: I0320 08:22:57.512883 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 08:22:57 crc kubenswrapper[4903]: I0320 08:22:57.514259 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 08:22:57 crc kubenswrapper[4903]: I0320 08:22:57.514295 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 08:22:57 crc kubenswrapper[4903]: I0320 08:22:57.514312 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 08:22:57 crc kubenswrapper[4903]: I0320 08:22:57.514848 4903 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="e5a9253adb2d58c4c6ddbfd9c745bbed8446195574f824697c5f68dbf9f741ba" exitCode=0 Mar 20 08:22:57 crc kubenswrapper[4903]: I0320 08:22:57.514889 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"e5a9253adb2d58c4c6ddbfd9c745bbed8446195574f824697c5f68dbf9f741ba"} Mar 20 08:22:57 crc kubenswrapper[4903]: I0320 08:22:57.514945 4903 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 08:22:57 crc kubenswrapper[4903]: I0320 08:22:57.518902 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 08:22:57 crc kubenswrapper[4903]: I0320 08:22:57.518942 4903 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 08:22:57 crc kubenswrapper[4903]: I0320 08:22:57.518965 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 08:22:58 crc kubenswrapper[4903]: W0320 08:22:58.196929 4903 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.164:6443: connect: connection refused Mar 20 08:22:58 crc kubenswrapper[4903]: E0320 08:22:58.197022 4903 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.164:6443: connect: connection refused" logger="UnhandledError" Mar 20 08:22:58 crc kubenswrapper[4903]: I0320 08:22:58.392682 4903 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.164:6443: connect: connection refused Mar 20 08:22:58 crc kubenswrapper[4903]: E0320 08:22:58.414467 4903 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.164:6443: connect: connection refused" interval="3.2s" Mar 20 08:22:58 crc kubenswrapper[4903]: I0320 08:22:58.522666 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"ff47d5e9db1398c42b35dce1fbcca05073c8e28b5c7187174de7f355065ec374"} Mar 20 08:22:58 crc kubenswrapper[4903]: I0320 08:22:58.522741 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"84dc5fbce1c40b3a5ff4df4082324127ad8c9fb05387581a62eb218551dfdcda"} Mar 20 08:22:58 crc kubenswrapper[4903]: I0320 08:22:58.522753 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"95ed5ac9613b849264d6577a5d37580c9b674adfe07c5d93b5a34251dab97a97"} Mar 20 08:22:58 crc kubenswrapper[4903]: I0320 08:22:58.522765 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"8ef0faf48a64d1f9ab296076561f444dac491f6a937100dc745062799ac14533"} Mar 20 08:22:58 crc kubenswrapper[4903]: I0320 08:22:58.525085 4903 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="9c56deb08228529d767a088c0d207f3cf2e4147f36173fc0c5d6e7a442b1ab7d" exitCode=0 Mar 20 08:22:58 crc kubenswrapper[4903]: I0320 08:22:58.525144 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"9c56deb08228529d767a088c0d207f3cf2e4147f36173fc0c5d6e7a442b1ab7d"} Mar 20 08:22:58 crc kubenswrapper[4903]: I0320 08:22:58.525245 4903 kubelet_node_status.go:401] "Setting 
node annotation to enable volume controller attach/detach" Mar 20 08:22:58 crc kubenswrapper[4903]: I0320 08:22:58.527281 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 08:22:58 crc kubenswrapper[4903]: I0320 08:22:58.527315 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 08:22:58 crc kubenswrapper[4903]: I0320 08:22:58.527326 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 08:22:58 crc kubenswrapper[4903]: I0320 08:22:58.529510 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"701809f6ae0c8e846df9efed4df02616cf4ea3d4b60a3cb364ec6d763b6f4a8f"} Mar 20 08:22:58 crc kubenswrapper[4903]: I0320 08:22:58.529599 4903 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 08:22:58 crc kubenswrapper[4903]: I0320 08:22:58.530825 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 08:22:58 crc kubenswrapper[4903]: I0320 08:22:58.530869 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 08:22:58 crc kubenswrapper[4903]: I0320 08:22:58.530884 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 08:22:58 crc kubenswrapper[4903]: I0320 08:22:58.538628 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"96ed373c63d512ec441c9b7c5b9662187a269cdbeef7db22e0b21e29c36e3279"} Mar 20 08:22:58 crc kubenswrapper[4903]: I0320 08:22:58.538695 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"bad975580554d2af26b8e54e69c48f28927163401943fcbd200b8e3a309e44aa"} Mar 20 08:22:58 crc kubenswrapper[4903]: I0320 08:22:58.538717 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"875da46f15fee87c192d7576889f9d3062bce3de37122c8e328aa008df051270"} Mar 20 08:22:58 crc kubenswrapper[4903]: I0320 08:22:58.538818 4903 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 08:22:58 crc kubenswrapper[4903]: I0320 08:22:58.539104 4903 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 08:22:58 crc kubenswrapper[4903]: I0320 08:22:58.540113 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 08:22:58 crc kubenswrapper[4903]: I0320 08:22:58.540148 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 08:22:58 crc kubenswrapper[4903]: I0320 08:22:58.540160 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 08:22:58 crc kubenswrapper[4903]: I0320 08:22:58.540317 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Mar 20 08:22:58 crc kubenswrapper[4903]: I0320 08:22:58.540566 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 08:22:58 crc kubenswrapper[4903]: I0320 08:22:58.540579 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 08:22:58 crc kubenswrapper[4903]: I0320 08:22:58.678054 4903 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 08:22:58 crc kubenswrapper[4903]: I0320 08:22:58.679407 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 08:22:58 crc kubenswrapper[4903]: I0320 08:22:58.679437 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 08:22:58 crc kubenswrapper[4903]: I0320 08:22:58.679448 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 08:22:58 crc kubenswrapper[4903]: I0320 08:22:58.679476 4903 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 08:22:58 crc kubenswrapper[4903]: E0320 08:22:58.680019 4903 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.164:6443: connect: connection refused" node="crc" Mar 20 08:22:58 crc kubenswrapper[4903]: I0320 08:22:58.826119 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 08:22:58 crc kubenswrapper[4903]: W0320 08:22:58.918797 4903 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.164:6443: connect: connection refused Mar 20 08:22:58 crc kubenswrapper[4903]: E0320 08:22:58.918910 4903 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.164:6443: connect: connection refused" logger="UnhandledError" Mar 20 08:22:58 crc kubenswrapper[4903]: W0320 08:22:58.931982 4903 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.164:6443: connect: connection refused Mar 20 08:22:58 crc kubenswrapper[4903]: E0320 08:22:58.932104 4903 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.164:6443: connect: connection refused" logger="UnhandledError" Mar 20 08:22:59 crc kubenswrapper[4903]: I0320 08:22:59.546744 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"0de344d58e5ff71a0b8108ee9cc34d337f3099d6f4c9bda5431df65fdb335191"} Mar 20 08:22:59 crc kubenswrapper[4903]: I0320 08:22:59.546801 4903 kubelet_node_status.go:401] 
"Setting node annotation to enable volume controller attach/detach" Mar 20 08:22:59 crc kubenswrapper[4903]: I0320 08:22:59.548059 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 08:22:59 crc kubenswrapper[4903]: I0320 08:22:59.548094 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 08:22:59 crc kubenswrapper[4903]: I0320 08:22:59.548106 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 08:22:59 crc kubenswrapper[4903]: I0320 08:22:59.551559 4903 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="f5bfce88577977aa641046715b14b9f4a11ec5eebc47a20089c897ca9a78d002" exitCode=0 Mar 20 08:22:59 crc kubenswrapper[4903]: I0320 08:22:59.551792 4903 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 08:22:59 crc kubenswrapper[4903]: I0320 08:22:59.551844 4903 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 08:22:59 crc kubenswrapper[4903]: I0320 08:22:59.551767 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"f5bfce88577977aa641046715b14b9f4a11ec5eebc47a20089c897ca9a78d002"} Mar 20 08:22:59 crc kubenswrapper[4903]: I0320 08:22:59.551937 4903 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 20 08:22:59 crc kubenswrapper[4903]: I0320 08:22:59.552001 4903 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 08:22:59 crc kubenswrapper[4903]: I0320 08:22:59.551860 4903 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 08:22:59 crc kubenswrapper[4903]: I0320 08:22:59.554087 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 08:22:59 crc kubenswrapper[4903]: I0320 08:22:59.554122 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 08:22:59 crc kubenswrapper[4903]: I0320 08:22:59.554133 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 08:22:59 crc kubenswrapper[4903]: I0320 08:22:59.554231 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 08:22:59 crc kubenswrapper[4903]: I0320 08:22:59.554276 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 08:22:59 crc kubenswrapper[4903]: I0320 08:22:59.554300 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 08:22:59 crc kubenswrapper[4903]: I0320 08:22:59.554349 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 08:22:59 crc kubenswrapper[4903]: I0320 08:22:59.554440 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 08:22:59 crc kubenswrapper[4903]: I0320 08:22:59.554521 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 08:22:59 crc kubenswrapper[4903]: I0320 08:22:59.555017 4903 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 08:22:59 crc kubenswrapper[4903]: I0320 08:22:59.555125 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 08:22:59 crc kubenswrapper[4903]: I0320 08:22:59.555155 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 08:22:59 crc kubenswrapper[4903]: I0320 08:22:59.628571 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 08:22:59 crc kubenswrapper[4903]: I0320 08:22:59.884820 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 08:23:00 crc kubenswrapper[4903]: I0320 08:23:00.559958 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"31bd80902a7a991ade5a4bd9b40d3599f1b3c75c10c8a703c3f092324dca7717"} Mar 20 08:23:00 crc kubenswrapper[4903]: I0320 08:23:00.560024 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"51fdc8a92dbd0647d8dcc5d704c12a8a399161ce69cb4875bcb8562b5c57db48"} Mar 20 08:23:00 crc kubenswrapper[4903]: I0320 08:23:00.560056 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"1c2766100b6ed48e1874905380ce68d442680cc68eb9fbd7cc90cb594590d558"} Mar 20 08:23:00 crc kubenswrapper[4903]: I0320 08:23:00.560070 4903 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 08:23:00 crc kubenswrapper[4903]: I0320 08:23:00.560072 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"91b0746a8cd19beb5c3901a89afda5f4f40732e303cb84a16b15e31b07269c26"} Mar 20 08:23:00 crc kubenswrapper[4903]: I0320 08:23:00.560184 4903 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 08:23:00 crc kubenswrapper[4903]: I0320 08:23:00.560295 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 08:23:00 crc kubenswrapper[4903]: I0320 08:23:00.560914 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 08:23:00 crc kubenswrapper[4903]: I0320 08:23:00.560947 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 08:23:00 crc kubenswrapper[4903]: I0320 08:23:00.560957 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 08:23:00 crc kubenswrapper[4903]: I0320 08:23:00.561828 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 08:23:00 crc kubenswrapper[4903]: I0320 08:23:00.561870 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 08:23:00 crc kubenswrapper[4903]: I0320 08:23:00.561881 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Mar 20 08:23:00 crc kubenswrapper[4903]: I0320 08:23:00.638140 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 08:23:01 crc kubenswrapper[4903]: I0320 08:23:01.577466 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"bd87ba33bb5593b1f240a21ae9b1255ae2042b148141ff2ff41c9070a24b35dc"} Mar 20 08:23:01 crc kubenswrapper[4903]: I0320 08:23:01.577596 4903 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 08:23:01 crc kubenswrapper[4903]: I0320 08:23:01.577720 4903 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 08:23:01 crc kubenswrapper[4903]: I0320 08:23:01.577721 4903 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 08:23:01 crc kubenswrapper[4903]: I0320 08:23:01.579926 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 08:23:01 crc kubenswrapper[4903]: I0320 08:23:01.580013 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 08:23:01 crc kubenswrapper[4903]: I0320 08:23:01.580094 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 08:23:01 crc kubenswrapper[4903]: I0320 08:23:01.581349 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 08:23:01 crc kubenswrapper[4903]: I0320 08:23:01.581403 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 08:23:01 crc kubenswrapper[4903]: I0320 08:23:01.581423 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 08:23:01 crc kubenswrapper[4903]: I0320 08:23:01.582345 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 08:23:01 crc kubenswrapper[4903]: I0320 08:23:01.582471 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 08:23:01 crc kubenswrapper[4903]: I0320 08:23:01.582495 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 08:23:01 crc kubenswrapper[4903]: I0320 08:23:01.853714 4903 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 20 08:23:01 crc kubenswrapper[4903]: I0320 08:23:01.880570 4903 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 08:23:01 crc kubenswrapper[4903]: I0320 08:23:01.882103 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 08:23:01 crc kubenswrapper[4903]: I0320 08:23:01.882190 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 08:23:01 crc kubenswrapper[4903]: I0320 08:23:01.882211 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 08:23:01 crc kubenswrapper[4903]: I0320 08:23:01.882264 4903 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 
08:23:02 crc kubenswrapper[4903]: I0320 08:23:02.580691 4903 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 08:23:02 crc kubenswrapper[4903]: I0320 08:23:02.582422 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 08:23:02 crc kubenswrapper[4903]: I0320 08:23:02.582472 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 08:23:02 crc kubenswrapper[4903]: I0320 08:23:02.582486 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 08:23:02 crc kubenswrapper[4903]: I0320 08:23:02.885779 4903 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 08:23:02 crc kubenswrapper[4903]: I0320 08:23:02.885924 4903 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 20 08:23:03 crc kubenswrapper[4903]: I0320 08:23:03.692594 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 08:23:03 crc kubenswrapper[4903]: I0320 08:23:03.693846 4903 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 08:23:03 crc kubenswrapper[4903]: I0320 08:23:03.695209 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 08:23:03 crc kubenswrapper[4903]: I0320 08:23:03.695252 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 08:23:03 crc kubenswrapper[4903]: I0320 08:23:03.695266 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 08:23:03 crc kubenswrapper[4903]: I0320 08:23:03.702461 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 08:23:03 crc kubenswrapper[4903]: I0320 08:23:03.711679 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Mar 20 08:23:03 crc kubenswrapper[4903]: I0320 08:23:03.711993 4903 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 08:23:03 crc kubenswrapper[4903]: I0320 08:23:03.713949 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 08:23:03 crc kubenswrapper[4903]: I0320 08:23:03.714012 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 08:23:03 crc kubenswrapper[4903]: I0320 08:23:03.714071 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 08:23:04 crc kubenswrapper[4903]: I0320 08:23:04.095896 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 08:23:04 crc kubenswrapper[4903]: I0320 08:23:04.096116 4903 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 08:23:04 crc kubenswrapper[4903]: I0320 08:23:04.097346 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 08:23:04 crc kubenswrapper[4903]: I0320 08:23:04.097385 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 08:23:04 crc kubenswrapper[4903]: I0320 08:23:04.097398 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 08:23:04 crc kubenswrapper[4903]: I0320 08:23:04.587710 4903 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 08:23:04 crc kubenswrapper[4903]: I0320 08:23:04.589277 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 08:23:04 crc kubenswrapper[4903]: I0320 08:23:04.589352 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 08:23:04 crc kubenswrapper[4903]: I0320 08:23:04.589371 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 08:23:04 crc kubenswrapper[4903]: I0320 08:23:04.739375 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 20 08:23:04 crc kubenswrapper[4903]: I0320 08:23:04.739687 4903 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 08:23:04 crc kubenswrapper[4903]: I0320 08:23:04.741907 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 08:23:04 crc kubenswrapper[4903]: I0320 08:23:04.741971 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 08:23:04 crc kubenswrapper[4903]: I0320 08:23:04.741992 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 08:23:05 crc kubenswrapper[4903]: I0320 08:23:05.362815 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Mar 20 08:23:05 crc kubenswrapper[4903]: I0320 08:23:05.363135 4903 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 08:23:05 crc kubenswrapper[4903]: I0320 08:23:05.365148 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 08:23:05 crc kubenswrapper[4903]: I0320 08:23:05.365242 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 08:23:05 crc kubenswrapper[4903]: I0320 08:23:05.365262 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 08:23:05 crc kubenswrapper[4903]: E0320 08:23:05.585187 4903 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 20 08:23:08 crc kubenswrapper[4903]: I0320 08:23:08.831525 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
Mar 20 08:23:08 crc kubenswrapper[4903]: I0320 08:23:08.831708 4903 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 08:23:08 crc kubenswrapper[4903]: I0320 08:23:08.833155 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 08:23:08 crc kubenswrapper[4903]: I0320 08:23:08.833188 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 08:23:08 crc kubenswrapper[4903]: I0320 08:23:08.833199 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 08:23:09 crc kubenswrapper[4903]: I0320 08:23:09.208152 4903 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Liveness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:39018->192.168.126.11:17697: read: connection reset by peer" start-of-body= Mar 20 08:23:09 crc kubenswrapper[4903]: I0320 08:23:09.208230 4903 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:39018->192.168.126.11:17697: read: connection reset by peer" Mar 20 08:23:09 crc kubenswrapper[4903]: I0320 08:23:09.267630 4903 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Mar 20 08:23:09 crc kubenswrapper[4903]: I0320 08:23:09.267738 4903 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Mar 20 08:23:09 crc kubenswrapper[4903]: W0320 08:23:09.276952 4903 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout Mar 20 08:23:09 crc kubenswrapper[4903]: I0320 08:23:09.277185 4903 trace.go:236] Trace[547773494]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (20-Mar-2026 08:22:59.275) (total time: 10002ms): Mar 20 08:23:09 crc kubenswrapper[4903]: Trace[547773494]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (08:23:09.276) Mar 20 08:23:09 crc kubenswrapper[4903]: Trace[547773494]: [10.002102436s] [10.002102436s] END Mar 20 08:23:09 crc kubenswrapper[4903]: E0320 08:23:09.277266 4903 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Mar 20 08:23:09 crc kubenswrapper[4903]: I0320 08:23:09.393545 4903 csi_plugin.go:884] 
Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Mar 20 08:23:09 crc kubenswrapper[4903]: E0320 08:23:09.576748 4903 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T08:23:09Z is after 2026-02-23T05:33:13Z" interval="6.4s" Mar 20 08:23:09 crc kubenswrapper[4903]: W0320 08:23:09.579667 4903 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T08:23:09Z is after 2026-02-23T05:33:13Z Mar 20 08:23:09 crc kubenswrapper[4903]: E0320 08:23:09.579778 4903 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T08:23:09Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 20 08:23:09 crc kubenswrapper[4903]: W0320 08:23:09.582058 4903 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T08:23:09Z is after 2026-02-23T05:33:13Z Mar 20 08:23:09 crc kubenswrapper[4903]: E0320 08:23:09.582109 4903 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T08:23:09Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 20 08:23:09 crc kubenswrapper[4903]: E0320 08:23:09.584720 4903 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T08:23:09Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 20 08:23:09 crc kubenswrapper[4903]: E0320 08:23:09.585686 4903 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T08:23:09Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189e7f07f56b25b4 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:22:55.38949266 +0000 UTC m=+0.606393015,LastTimestamp:2026-03-20 08:22:55.38949266 +0000 UTC m=+0.606393015,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 08:23:09 crc kubenswrapper[4903]: E0320 08:23:09.587618 4903 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T08:23:09Z is after 2026-02-23T05:33:13Z" node="crc" Mar 20 08:23:09 crc kubenswrapper[4903]: W0320 08:23:09.589715 4903 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T08:23:09Z is after 2026-02-23T05:33:13Z Mar 20 08:23:09 crc kubenswrapper[4903]: E0320 08:23:09.589807 4903 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T08:23:09Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 20 08:23:09 crc kubenswrapper[4903]: I0320 08:23:09.594728 4903 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 20 08:23:09 crc kubenswrapper[4903]: I0320 08:23:09.594794 4903 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Mar 20 08:23:09 crc kubenswrapper[4903]: I0320 08:23:09.601185 4903 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 20 08:23:09 crc kubenswrapper[4903]: I0320 08:23:09.601284 4903 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Mar 20 08:23:09 crc kubenswrapper[4903]: I0320 08:23:09.612226 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" 
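Every request to api-int.crc.testing:6443 in the entries above fails the same way: the node clock reads 2026-03-20 while the verifier reports the certificate is only valid until 2026-02-23T05:33:13Z. A minimal sketch of how one might confirm this from the node (an assumption for illustration only, not part of the log: a Go toolchain is available and api-int.crc.testing resolves locally). It dials the endpoint named in the errors and prints the serving certificate's validity window:

package main

import (
	"crypto/tls"
	"fmt"
	"log"
)

func main() {
	// Endpoint taken from the kubelet errors above.
	addr := "api-int.crc.testing:6443"

	// InsecureSkipVerify is deliberate: the goal is to read the (expired)
	// serving certificate, not to validate its chain.
	conn, err := tls.Dial("tcp", addr, &tls.Config{InsecureSkipVerify: true})
	if err != nil {
		log.Fatalf("dial %s: %v", addr, err)
	}
	defer conn.Close()

	// Print NotBefore/NotAfter for every certificate the server presented.
	for _, cert := range conn.ConnectionState().PeerCertificates {
		fmt.Printf("subject=%q notBefore=%s notAfter=%s\n",
			cert.Subject.String(),
			cert.NotBefore.UTC().Format("2006-01-02T15:04:05Z"),
			cert.NotAfter.UTC().Format("2006-01-02T15:04:05Z"))
	}
}

If the notAfter printed matches the 2026-02-23T05:33:13Z value in the log, the repeated lease, node-registration, CSR and informer failures below would point to certificate validity (or a skewed node clock) rather than a network problem, which is consistent with the kubelet retrying the same calls every few seconds.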
Mar 20 08:23:09 crc kubenswrapper[4903]: I0320 08:23:09.614801 4903 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="0de344d58e5ff71a0b8108ee9cc34d337f3099d6f4c9bda5431df65fdb335191" exitCode=255 Mar 20 08:23:09 crc kubenswrapper[4903]: I0320 08:23:09.614860 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"0de344d58e5ff71a0b8108ee9cc34d337f3099d6f4c9bda5431df65fdb335191"} Mar 20 08:23:09 crc kubenswrapper[4903]: I0320 08:23:09.615069 4903 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 08:23:09 crc kubenswrapper[4903]: I0320 08:23:09.616139 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 08:23:09 crc kubenswrapper[4903]: I0320 08:23:09.616194 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 08:23:09 crc kubenswrapper[4903]: I0320 08:23:09.616209 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 08:23:09 crc kubenswrapper[4903]: I0320 08:23:09.617065 4903 scope.go:117] "RemoveContainer" containerID="0de344d58e5ff71a0b8108ee9cc34d337f3099d6f4c9bda5431df65fdb335191" Mar 20 08:23:10 crc kubenswrapper[4903]: I0320 08:23:10.395600 4903 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T08:23:10Z is after 2026-02-23T05:33:13Z Mar 20 08:23:10 crc kubenswrapper[4903]: I0320 08:23:10.634822 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 20 08:23:10 crc kubenswrapper[4903]: I0320 08:23:10.637218 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"ac4dbbefee91f3f8861565b310f4d941712b0f3d7864b79920c4262de58eb346"} Mar 20 08:23:10 crc kubenswrapper[4903]: I0320 08:23:10.637434 4903 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 08:23:10 crc kubenswrapper[4903]: I0320 08:23:10.638756 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 08:23:10 crc kubenswrapper[4903]: I0320 08:23:10.638802 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 08:23:10 crc kubenswrapper[4903]: I0320 08:23:10.638816 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 08:23:11 crc kubenswrapper[4903]: I0320 08:23:11.397083 4903 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T08:23:11Z is after 2026-02-23T05:33:13Z Mar 20 08:23:11 crc kubenswrapper[4903]: I0320 08:23:11.644447 4903 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 20 08:23:11 crc kubenswrapper[4903]: I0320 08:23:11.645276 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 20 08:23:11 crc kubenswrapper[4903]: I0320 08:23:11.648519 4903 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="ac4dbbefee91f3f8861565b310f4d941712b0f3d7864b79920c4262de58eb346" exitCode=255 Mar 20 08:23:11 crc kubenswrapper[4903]: I0320 08:23:11.648595 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"ac4dbbefee91f3f8861565b310f4d941712b0f3d7864b79920c4262de58eb346"} Mar 20 08:23:11 crc kubenswrapper[4903]: I0320 08:23:11.648713 4903 scope.go:117] "RemoveContainer" containerID="0de344d58e5ff71a0b8108ee9cc34d337f3099d6f4c9bda5431df65fdb335191" Mar 20 08:23:11 crc kubenswrapper[4903]: I0320 08:23:11.648949 4903 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 08:23:11 crc kubenswrapper[4903]: I0320 08:23:11.655597 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 08:23:11 crc kubenswrapper[4903]: I0320 08:23:11.655669 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 08:23:11 crc kubenswrapper[4903]: I0320 08:23:11.655686 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 08:23:11 crc kubenswrapper[4903]: I0320 08:23:11.656553 4903 scope.go:117] "RemoveContainer" containerID="ac4dbbefee91f3f8861565b310f4d941712b0f3d7864b79920c4262de58eb346" Mar 20 08:23:11 crc kubenswrapper[4903]: E0320 08:23:11.656788 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 08:23:12 crc kubenswrapper[4903]: I0320 08:23:12.398428 4903 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T08:23:12Z is after 2026-02-23T05:33:13Z Mar 20 08:23:12 crc kubenswrapper[4903]: I0320 08:23:12.654440 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 20 08:23:12 crc kubenswrapper[4903]: I0320 08:23:12.885589 4903 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 08:23:12 crc 
kubenswrapper[4903]: I0320 08:23:12.885691 4903 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 20 08:23:13 crc kubenswrapper[4903]: I0320 08:23:13.396481 4903 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T08:23:13Z is after 2026-02-23T05:33:13Z Mar 20 08:23:14 crc kubenswrapper[4903]: I0320 08:23:14.104702 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 08:23:14 crc kubenswrapper[4903]: I0320 08:23:14.104947 4903 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 08:23:14 crc kubenswrapper[4903]: I0320 08:23:14.107228 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 08:23:14 crc kubenswrapper[4903]: I0320 08:23:14.107295 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 08:23:14 crc kubenswrapper[4903]: I0320 08:23:14.107310 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 08:23:14 crc kubenswrapper[4903]: I0320 08:23:14.108158 4903 scope.go:117] "RemoveContainer" containerID="ac4dbbefee91f3f8861565b310f4d941712b0f3d7864b79920c4262de58eb346" Mar 20 08:23:14 crc kubenswrapper[4903]: E0320 08:23:14.108396 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 08:23:14 crc kubenswrapper[4903]: I0320 08:23:14.111872 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 08:23:14 crc kubenswrapper[4903]: I0320 08:23:14.396545 4903 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T08:23:14Z is after 2026-02-23T05:33:13Z Mar 20 08:23:14 crc kubenswrapper[4903]: I0320 08:23:14.663424 4903 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 08:23:14 crc kubenswrapper[4903]: I0320 08:23:14.664591 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 08:23:14 crc kubenswrapper[4903]: I0320 08:23:14.664658 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 08:23:14 crc kubenswrapper[4903]: I0320 08:23:14.664682 4903 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientPID" Mar 20 08:23:14 crc kubenswrapper[4903]: I0320 08:23:14.665712 4903 scope.go:117] "RemoveContainer" containerID="ac4dbbefee91f3f8861565b310f4d941712b0f3d7864b79920c4262de58eb346" Mar 20 08:23:14 crc kubenswrapper[4903]: E0320 08:23:14.666026 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 08:23:15 crc kubenswrapper[4903]: W0320 08:23:15.244447 4903 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T08:23:15Z is after 2026-02-23T05:33:13Z Mar 20 08:23:15 crc kubenswrapper[4903]: E0320 08:23:15.244548 4903 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T08:23:15Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 20 08:23:15 crc kubenswrapper[4903]: I0320 08:23:15.393655 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Mar 20 08:23:15 crc kubenswrapper[4903]: I0320 08:23:15.394349 4903 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 08:23:15 crc kubenswrapper[4903]: I0320 08:23:15.395900 4903 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T08:23:15Z is after 2026-02-23T05:33:13Z Mar 20 08:23:15 crc kubenswrapper[4903]: I0320 08:23:15.395929 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 08:23:15 crc kubenswrapper[4903]: I0320 08:23:15.395979 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 08:23:15 crc kubenswrapper[4903]: I0320 08:23:15.395995 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 08:23:15 crc kubenswrapper[4903]: I0320 08:23:15.407448 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Mar 20 08:23:15 crc kubenswrapper[4903]: E0320 08:23:15.585320 4903 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 20 08:23:15 crc kubenswrapper[4903]: I0320 08:23:15.667115 4903 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 08:23:15 crc kubenswrapper[4903]: I0320 08:23:15.668398 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 
20 08:23:15 crc kubenswrapper[4903]: I0320 08:23:15.668464 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 08:23:15 crc kubenswrapper[4903]: I0320 08:23:15.668490 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 08:23:15 crc kubenswrapper[4903]: E0320 08:23:15.980370 4903 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T08:23:15Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 20 08:23:15 crc kubenswrapper[4903]: I0320 08:23:15.988498 4903 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 08:23:15 crc kubenswrapper[4903]: I0320 08:23:15.990609 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 08:23:15 crc kubenswrapper[4903]: I0320 08:23:15.990664 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 08:23:15 crc kubenswrapper[4903]: I0320 08:23:15.990677 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 08:23:15 crc kubenswrapper[4903]: I0320 08:23:15.990711 4903 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 08:23:15 crc kubenswrapper[4903]: E0320 08:23:15.994246 4903 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T08:23:15Z is after 2026-02-23T05:33:13Z" node="crc" Mar 20 08:23:16 crc kubenswrapper[4903]: I0320 08:23:16.398525 4903 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T08:23:16Z is after 2026-02-23T05:33:13Z Mar 20 08:23:16 crc kubenswrapper[4903]: W0320 08:23:16.605752 4903 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T08:23:16Z is after 2026-02-23T05:33:13Z Mar 20 08:23:16 crc kubenswrapper[4903]: E0320 08:23:16.606156 4903 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T08:23:16Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 20 08:23:17 crc kubenswrapper[4903]: I0320 08:23:17.395737 4903 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T08:23:17Z is after 2026-02-23T05:33:13Z Mar 20 08:23:17 crc kubenswrapper[4903]: I0320 08:23:17.646893 4903 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 20 08:23:17 crc kubenswrapper[4903]: E0320 08:23:17.651964 4903 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T08:23:17Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 20 08:23:17 crc kubenswrapper[4903]: W0320 08:23:17.850452 4903 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T08:23:17Z is after 2026-02-23T05:33:13Z Mar 20 08:23:17 crc kubenswrapper[4903]: E0320 08:23:17.850577 4903 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T08:23:17Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 20 08:23:18 crc kubenswrapper[4903]: I0320 08:23:18.010287 4903 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 08:23:18 crc kubenswrapper[4903]: I0320 08:23:18.010607 4903 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 08:23:18 crc kubenswrapper[4903]: I0320 08:23:18.012306 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 08:23:18 crc kubenswrapper[4903]: I0320 08:23:18.012365 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 08:23:18 crc kubenswrapper[4903]: I0320 08:23:18.012388 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 08:23:18 crc kubenswrapper[4903]: I0320 08:23:18.013300 4903 scope.go:117] "RemoveContainer" containerID="ac4dbbefee91f3f8861565b310f4d941712b0f3d7864b79920c4262de58eb346" Mar 20 08:23:18 crc kubenswrapper[4903]: E0320 08:23:18.013538 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 08:23:18 crc kubenswrapper[4903]: I0320 08:23:18.396170 4903 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-03-20T08:23:18Z is after 2026-02-23T05:33:13Z Mar 20 08:23:19 crc kubenswrapper[4903]: I0320 08:23:19.267328 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 08:23:19 crc kubenswrapper[4903]: I0320 08:23:19.267517 4903 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 08:23:19 crc kubenswrapper[4903]: I0320 08:23:19.268808 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 08:23:19 crc kubenswrapper[4903]: I0320 08:23:19.268835 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 08:23:19 crc kubenswrapper[4903]: I0320 08:23:19.268843 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 08:23:19 crc kubenswrapper[4903]: I0320 08:23:19.269369 4903 scope.go:117] "RemoveContainer" containerID="ac4dbbefee91f3f8861565b310f4d941712b0f3d7864b79920c4262de58eb346" Mar 20 08:23:19 crc kubenswrapper[4903]: E0320 08:23:19.269516 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 08:23:19 crc kubenswrapper[4903]: I0320 08:23:19.395320 4903 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T08:23:19Z is after 2026-02-23T05:33:13Z Mar 20 08:23:19 crc kubenswrapper[4903]: E0320 08:23:19.589138 4903 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T08:23:19Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189e7f07f56b25b4 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:22:55.38949266 +0000 UTC m=+0.606393015,LastTimestamp:2026-03-20 08:22:55.38949266 +0000 UTC m=+0.606393015,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 08:23:20 crc kubenswrapper[4903]: I0320 08:23:20.394962 4903 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T08:23:20Z is after 2026-02-23T05:33:13Z Mar 20 08:23:20 crc kubenswrapper[4903]: W0320 08:23:20.398143 4903 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get 
"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T08:23:20Z is after 2026-02-23T05:33:13Z Mar 20 08:23:20 crc kubenswrapper[4903]: E0320 08:23:20.398296 4903 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T08:23:20Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 20 08:23:21 crc kubenswrapper[4903]: I0320 08:23:21.395550 4903 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T08:23:21Z is after 2026-02-23T05:33:13Z Mar 20 08:23:22 crc kubenswrapper[4903]: I0320 08:23:22.397880 4903 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T08:23:22Z is after 2026-02-23T05:33:13Z Mar 20 08:23:22 crc kubenswrapper[4903]: I0320 08:23:22.886147 4903 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 08:23:22 crc kubenswrapper[4903]: I0320 08:23:22.886333 4903 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 20 08:23:22 crc kubenswrapper[4903]: I0320 08:23:22.886425 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 08:23:22 crc kubenswrapper[4903]: I0320 08:23:22.886628 4903 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 08:23:22 crc kubenswrapper[4903]: I0320 08:23:22.888238 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 08:23:22 crc kubenswrapper[4903]: I0320 08:23:22.888280 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 08:23:22 crc kubenswrapper[4903]: I0320 08:23:22.888294 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 08:23:22 crc kubenswrapper[4903]: I0320 08:23:22.888966 4903 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" 
containerStatusID={"Type":"cri-o","ID":"ae75429275f0159b6c1185aa41107dac0937f87a905e7218b2cbfa9080f692b2"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted" Mar 20 08:23:22 crc kubenswrapper[4903]: I0320 08:23:22.889279 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" containerID="cri-o://ae75429275f0159b6c1185aa41107dac0937f87a905e7218b2cbfa9080f692b2" gracePeriod=30 Mar 20 08:23:22 crc kubenswrapper[4903]: E0320 08:23:22.983757 4903 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T08:23:22Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 20 08:23:22 crc kubenswrapper[4903]: I0320 08:23:22.995133 4903 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 08:23:22 crc kubenswrapper[4903]: I0320 08:23:22.997019 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 08:23:22 crc kubenswrapper[4903]: I0320 08:23:22.997100 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 08:23:22 crc kubenswrapper[4903]: I0320 08:23:22.997120 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 08:23:22 crc kubenswrapper[4903]: I0320 08:23:22.997164 4903 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 08:23:23 crc kubenswrapper[4903]: E0320 08:23:23.003131 4903 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T08:23:23Z is after 2026-02-23T05:33:13Z" node="crc" Mar 20 08:23:23 crc kubenswrapper[4903]: I0320 08:23:23.395707 4903 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T08:23:23Z is after 2026-02-23T05:33:13Z Mar 20 08:23:23 crc kubenswrapper[4903]: I0320 08:23:23.696627 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 20 08:23:23 crc kubenswrapper[4903]: I0320 08:23:23.697223 4903 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="ae75429275f0159b6c1185aa41107dac0937f87a905e7218b2cbfa9080f692b2" exitCode=255 Mar 20 08:23:23 crc kubenswrapper[4903]: I0320 08:23:23.697278 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"ae75429275f0159b6c1185aa41107dac0937f87a905e7218b2cbfa9080f692b2"} Mar 20 08:23:23 crc kubenswrapper[4903]: I0320 
08:23:23.697363 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"c1bee1ec27b18be528bfc7b577684c9d6e17f7cf57e902ff00c9f94f1f3a609e"} Mar 20 08:23:23 crc kubenswrapper[4903]: I0320 08:23:23.697480 4903 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 08:23:23 crc kubenswrapper[4903]: I0320 08:23:23.698793 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 08:23:23 crc kubenswrapper[4903]: I0320 08:23:23.698823 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 08:23:23 crc kubenswrapper[4903]: I0320 08:23:23.698834 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 08:23:24 crc kubenswrapper[4903]: I0320 08:23:24.394342 4903 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T08:23:24Z is after 2026-02-23T05:33:13Z Mar 20 08:23:25 crc kubenswrapper[4903]: I0320 08:23:25.396251 4903 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T08:23:25Z is after 2026-02-23T05:33:13Z Mar 20 08:23:25 crc kubenswrapper[4903]: E0320 08:23:25.585969 4903 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 20 08:23:26 crc kubenswrapper[4903]: I0320 08:23:26.395415 4903 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T08:23:26Z is after 2026-02-23T05:33:13Z Mar 20 08:23:26 crc kubenswrapper[4903]: W0320 08:23:26.598246 4903 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T08:23:26Z is after 2026-02-23T05:33:13Z Mar 20 08:23:26 crc kubenswrapper[4903]: E0320 08:23:26.598350 4903 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T08:23:26Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 20 08:23:27 crc kubenswrapper[4903]: I0320 08:23:27.395296 4903 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-03-20T08:23:27Z is after 2026-02-23T05:33:13Z Mar 20 08:23:28 crc kubenswrapper[4903]: I0320 08:23:28.395169 4903 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T08:23:28Z is after 2026-02-23T05:33:13Z Mar 20 08:23:29 crc kubenswrapper[4903]: I0320 08:23:29.394702 4903 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T08:23:29Z is after 2026-02-23T05:33:13Z Mar 20 08:23:29 crc kubenswrapper[4903]: W0320 08:23:29.481103 4903 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T08:23:29Z is after 2026-02-23T05:33:13Z Mar 20 08:23:29 crc kubenswrapper[4903]: E0320 08:23:29.481239 4903 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T08:23:29Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 20 08:23:29 crc kubenswrapper[4903]: E0320 08:23:29.595157 4903 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T08:23:29Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189e7f07f56b25b4 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:22:55.38949266 +0000 UTC m=+0.606393015,LastTimestamp:2026-03-20 08:22:55.38949266 +0000 UTC m=+0.606393015,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 08:23:29 crc kubenswrapper[4903]: I0320 08:23:29.885370 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 08:23:29 crc kubenswrapper[4903]: I0320 08:23:29.886214 4903 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 08:23:29 crc kubenswrapper[4903]: I0320 08:23:29.888286 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 08:23:29 crc kubenswrapper[4903]: I0320 08:23:29.888378 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 08:23:29 crc kubenswrapper[4903]: I0320 08:23:29.888398 
4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 08:23:29 crc kubenswrapper[4903]: E0320 08:23:29.990101 4903 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T08:23:29Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 20 08:23:30 crc kubenswrapper[4903]: I0320 08:23:30.004258 4903 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 08:23:30 crc kubenswrapper[4903]: I0320 08:23:30.005793 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 08:23:30 crc kubenswrapper[4903]: I0320 08:23:30.005840 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 08:23:30 crc kubenswrapper[4903]: I0320 08:23:30.005858 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 08:23:30 crc kubenswrapper[4903]: I0320 08:23:30.005897 4903 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 08:23:30 crc kubenswrapper[4903]: E0320 08:23:30.008842 4903 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T08:23:30Z is after 2026-02-23T05:33:13Z" node="crc" Mar 20 08:23:30 crc kubenswrapper[4903]: I0320 08:23:30.394797 4903 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T08:23:30Z is after 2026-02-23T05:33:13Z Mar 20 08:23:30 crc kubenswrapper[4903]: I0320 08:23:30.637717 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 08:23:30 crc kubenswrapper[4903]: I0320 08:23:30.716398 4903 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 08:23:30 crc kubenswrapper[4903]: I0320 08:23:30.717427 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 08:23:30 crc kubenswrapper[4903]: I0320 08:23:30.717490 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 08:23:30 crc kubenswrapper[4903]: I0320 08:23:30.717505 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 08:23:31 crc kubenswrapper[4903]: I0320 08:23:31.394414 4903 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T08:23:31Z is after 2026-02-23T05:33:13Z Mar 20 08:23:32 crc kubenswrapper[4903]: I0320 08:23:32.396430 4903 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T08:23:32Z is after 2026-02-23T05:33:13Z Mar 20 08:23:32 crc kubenswrapper[4903]: I0320 08:23:32.885786 4903 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 08:23:32 crc kubenswrapper[4903]: I0320 08:23:32.885872 4903 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 20 08:23:33 crc kubenswrapper[4903]: W0320 08:23:33.344632 4903 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T08:23:33Z is after 2026-02-23T05:33:13Z Mar 20 08:23:33 crc kubenswrapper[4903]: E0320 08:23:33.344729 4903 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T08:23:33Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 20 08:23:33 crc kubenswrapper[4903]: I0320 08:23:33.395193 4903 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T08:23:33Z is after 2026-02-23T05:33:13Z Mar 20 08:23:33 crc kubenswrapper[4903]: I0320 08:23:33.490627 4903 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 08:23:33 crc kubenswrapper[4903]: I0320 08:23:33.492157 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 08:23:33 crc kubenswrapper[4903]: I0320 08:23:33.492222 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 08:23:33 crc kubenswrapper[4903]: I0320 08:23:33.492237 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 08:23:33 crc kubenswrapper[4903]: I0320 08:23:33.492867 4903 scope.go:117] "RemoveContainer" containerID="ac4dbbefee91f3f8861565b310f4d941712b0f3d7864b79920c4262de58eb346" Mar 20 08:23:33 crc kubenswrapper[4903]: I0320 08:23:33.730016 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 20 08:23:33 crc 
kubenswrapper[4903]: I0320 08:23:33.732754 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"d03e0f1a252388ab5c9eceeca9ba587a37408ac375fd5c1f8d2aa550e63af428"} Mar 20 08:23:33 crc kubenswrapper[4903]: I0320 08:23:33.732923 4903 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 08:23:33 crc kubenswrapper[4903]: I0320 08:23:33.733777 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 08:23:33 crc kubenswrapper[4903]: I0320 08:23:33.733812 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 08:23:33 crc kubenswrapper[4903]: I0320 08:23:33.733823 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 08:23:34 crc kubenswrapper[4903]: W0320 08:23:34.080196 4903 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T08:23:34Z is after 2026-02-23T05:33:13Z Mar 20 08:23:34 crc kubenswrapper[4903]: E0320 08:23:34.080284 4903 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T08:23:34Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 20 08:23:34 crc kubenswrapper[4903]: I0320 08:23:34.394376 4903 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T08:23:34Z is after 2026-02-23T05:33:13Z Mar 20 08:23:34 crc kubenswrapper[4903]: I0320 08:23:34.738233 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 20 08:23:34 crc kubenswrapper[4903]: I0320 08:23:34.738820 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 20 08:23:34 crc kubenswrapper[4903]: I0320 08:23:34.741022 4903 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="d03e0f1a252388ab5c9eceeca9ba587a37408ac375fd5c1f8d2aa550e63af428" exitCode=255 Mar 20 08:23:34 crc kubenswrapper[4903]: I0320 08:23:34.741110 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"d03e0f1a252388ab5c9eceeca9ba587a37408ac375fd5c1f8d2aa550e63af428"} Mar 20 08:23:34 crc kubenswrapper[4903]: I0320 08:23:34.741194 4903 scope.go:117] "RemoveContainer" containerID="ac4dbbefee91f3f8861565b310f4d941712b0f3d7864b79920c4262de58eb346" Mar 20 
08:23:34 crc kubenswrapper[4903]: I0320 08:23:34.741422 4903 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 08:23:34 crc kubenswrapper[4903]: I0320 08:23:34.742822 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 08:23:34 crc kubenswrapper[4903]: I0320 08:23:34.742843 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 08:23:34 crc kubenswrapper[4903]: I0320 08:23:34.742852 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 08:23:34 crc kubenswrapper[4903]: I0320 08:23:34.743454 4903 scope.go:117] "RemoveContainer" containerID="d03e0f1a252388ab5c9eceeca9ba587a37408ac375fd5c1f8d2aa550e63af428" Mar 20 08:23:34 crc kubenswrapper[4903]: E0320 08:23:34.743619 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 08:23:35 crc kubenswrapper[4903]: I0320 08:23:35.106569 4903 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 20 08:23:35 crc kubenswrapper[4903]: E0320 08:23:35.110979 4903 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T08:23:35Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 20 08:23:35 crc kubenswrapper[4903]: E0320 08:23:35.112235 4903 certificate_manager.go:440] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Reached backoff limit, still unable to rotate certs: timed out waiting for the condition" logger="UnhandledError" Mar 20 08:23:35 crc kubenswrapper[4903]: I0320 08:23:35.395143 4903 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T08:23:35Z is after 2026-02-23T05:33:13Z Mar 20 08:23:35 crc kubenswrapper[4903]: E0320 08:23:35.586945 4903 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 20 08:23:35 crc kubenswrapper[4903]: I0320 08:23:35.746483 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 20 08:23:36 crc kubenswrapper[4903]: I0320 08:23:36.395267 4903 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T08:23:36Z is after 2026-02-23T05:33:13Z Mar 20 
08:23:36 crc kubenswrapper[4903]: E0320 08:23:36.995743 4903 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T08:23:36Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 20 08:23:37 crc kubenswrapper[4903]: I0320 08:23:37.008962 4903 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 08:23:37 crc kubenswrapper[4903]: I0320 08:23:37.010948 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 08:23:37 crc kubenswrapper[4903]: I0320 08:23:37.011006 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 08:23:37 crc kubenswrapper[4903]: I0320 08:23:37.011017 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 08:23:37 crc kubenswrapper[4903]: I0320 08:23:37.011066 4903 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 08:23:37 crc kubenswrapper[4903]: E0320 08:23:37.014218 4903 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T08:23:37Z is after 2026-02-23T05:33:13Z" node="crc" Mar 20 08:23:37 crc kubenswrapper[4903]: I0320 08:23:37.397847 4903 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T08:23:37Z is after 2026-02-23T05:33:13Z Mar 20 08:23:38 crc kubenswrapper[4903]: I0320 08:23:38.009783 4903 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 08:23:38 crc kubenswrapper[4903]: I0320 08:23:38.009967 4903 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 08:23:38 crc kubenswrapper[4903]: I0320 08:23:38.011910 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 08:23:38 crc kubenswrapper[4903]: I0320 08:23:38.011993 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 08:23:38 crc kubenswrapper[4903]: I0320 08:23:38.012015 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 08:23:38 crc kubenswrapper[4903]: I0320 08:23:38.012945 4903 scope.go:117] "RemoveContainer" containerID="d03e0f1a252388ab5c9eceeca9ba587a37408ac375fd5c1f8d2aa550e63af428" Mar 20 08:23:38 crc kubenswrapper[4903]: E0320 08:23:38.013336 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 08:23:38 crc kubenswrapper[4903]: I0320 
08:23:38.395526 4903 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T08:23:38Z is after 2026-02-23T05:33:13Z Mar 20 08:23:39 crc kubenswrapper[4903]: I0320 08:23:39.267316 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 08:23:39 crc kubenswrapper[4903]: I0320 08:23:39.267564 4903 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 08:23:39 crc kubenswrapper[4903]: I0320 08:23:39.269246 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 08:23:39 crc kubenswrapper[4903]: I0320 08:23:39.269313 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 08:23:39 crc kubenswrapper[4903]: I0320 08:23:39.269336 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 08:23:39 crc kubenswrapper[4903]: I0320 08:23:39.270640 4903 scope.go:117] "RemoveContainer" containerID="d03e0f1a252388ab5c9eceeca9ba587a37408ac375fd5c1f8d2aa550e63af428" Mar 20 08:23:39 crc kubenswrapper[4903]: E0320 08:23:39.270957 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 08:23:39 crc kubenswrapper[4903]: I0320 08:23:39.398429 4903 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T08:23:39Z is after 2026-02-23T05:33:13Z Mar 20 08:23:39 crc kubenswrapper[4903]: E0320 08:23:39.601192 4903 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T08:23:39Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189e7f07f56b25b4 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:22:55.38949266 +0000 UTC m=+0.606393015,LastTimestamp:2026-03-20 08:22:55.38949266 +0000 UTC m=+0.606393015,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 08:23:40 crc kubenswrapper[4903]: I0320 08:23:40.397618 4903 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-03-20T08:23:40Z is after 2026-02-23T05:33:13Z Mar 20 08:23:41 crc kubenswrapper[4903]: I0320 08:23:41.395006 4903 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T08:23:41Z is after 2026-02-23T05:33:13Z Mar 20 08:23:42 crc kubenswrapper[4903]: I0320 08:23:42.395094 4903 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T08:23:42Z is after 2026-02-23T05:33:13Z Mar 20 08:23:42 crc kubenswrapper[4903]: I0320 08:23:42.886139 4903 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 08:23:42 crc kubenswrapper[4903]: I0320 08:23:42.886220 4903 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 20 08:23:43 crc kubenswrapper[4903]: I0320 08:23:43.395643 4903 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T08:23:43Z is after 2026-02-23T05:33:13Z Mar 20 08:23:43 crc kubenswrapper[4903]: E0320 08:23:43.999053 4903 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T08:23:43Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 20 08:23:44 crc kubenswrapper[4903]: I0320 08:23:44.014948 4903 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 08:23:44 crc kubenswrapper[4903]: I0320 08:23:44.016686 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 08:23:44 crc kubenswrapper[4903]: I0320 08:23:44.016749 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 08:23:44 crc kubenswrapper[4903]: I0320 08:23:44.016763 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 08:23:44 crc kubenswrapper[4903]: I0320 08:23:44.016800 4903 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 08:23:44 crc kubenswrapper[4903]: E0320 08:23:44.019498 4903 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": 
tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T08:23:44Z is after 2026-02-23T05:33:13Z" node="crc" Mar 20 08:23:44 crc kubenswrapper[4903]: I0320 08:23:44.394990 4903 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T08:23:44Z is after 2026-02-23T05:33:13Z Mar 20 08:23:44 crc kubenswrapper[4903]: I0320 08:23:44.745112 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 20 08:23:44 crc kubenswrapper[4903]: I0320 08:23:44.745294 4903 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 08:23:44 crc kubenswrapper[4903]: I0320 08:23:44.746699 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 08:23:44 crc kubenswrapper[4903]: I0320 08:23:44.746753 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 08:23:44 crc kubenswrapper[4903]: I0320 08:23:44.746768 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 08:23:45 crc kubenswrapper[4903]: I0320 08:23:45.399435 4903 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 08:23:45 crc kubenswrapper[4903]: E0320 08:23:45.587117 4903 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 20 08:23:46 crc kubenswrapper[4903]: I0320 08:23:46.399360 4903 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 08:23:47 crc kubenswrapper[4903]: I0320 08:23:47.396848 4903 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 08:23:48 crc kubenswrapper[4903]: I0320 08:23:48.397918 4903 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 08:23:49 crc kubenswrapper[4903]: I0320 08:23:49.400366 4903 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 08:23:49 crc kubenswrapper[4903]: E0320 08:23:49.609541 4903 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e7f07f56b25b4 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:22:55.38949266 +0000 UTC m=+0.606393015,LastTimestamp:2026-03-20 08:22:55.38949266 +0000 UTC m=+0.606393015,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 08:23:49 crc kubenswrapper[4903]: E0320 08:23:49.616224 4903 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e7f07fa3bab11 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:22:55.470267153 +0000 UTC m=+0.687167508,LastTimestamp:2026-03-20 08:22:55.470267153 +0000 UTC m=+0.687167508,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 08:23:49 crc kubenswrapper[4903]: E0320 08:23:49.624628 4903 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e7f07fa3d0fcb default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:22:55.470358475 +0000 UTC m=+0.687258830,LastTimestamp:2026-03-20 08:22:55.470358475 +0000 UTC m=+0.687258830,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 08:23:49 crc kubenswrapper[4903]: E0320 08:23:49.632407 4903 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e7f07fa3d5d10 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:22:55.470378256 +0000 UTC m=+0.687278611,LastTimestamp:2026-03-20 08:22:55.470378256 +0000 UTC m=+0.687278611,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 08:23:49 crc kubenswrapper[4903]: E0320 08:23:49.637978 4903 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e7f080030ab81 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeAllocatableEnforced,Message:Updated Node Allocatable limit across pods,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:22:55.570209665 +0000 UTC m=+0.787110000,LastTimestamp:2026-03-20 08:22:55.570209665 +0000 UTC m=+0.787110000,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 08:23:49 crc kubenswrapper[4903]: E0320 08:23:49.644059 4903 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e7f07fa3bab11\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e7f07fa3bab11 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:22:55.470267153 +0000 UTC m=+0.687167508,LastTimestamp:2026-03-20 08:22:55.593155229 +0000 UTC m=+0.810055554,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 08:23:49 crc kubenswrapper[4903]: E0320 08:23:49.650716 4903 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e7f07fa3d0fcb\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e7f07fa3d0fcb default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:22:55.470358475 +0000 UTC m=+0.687258830,LastTimestamp:2026-03-20 08:22:55.593215982 +0000 UTC m=+0.810116307,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 08:23:49 crc kubenswrapper[4903]: E0320 08:23:49.655862 4903 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e7f07fa3d5d10\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e7f07fa3d5d10 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:22:55.470378256 +0000 UTC m=+0.687278611,LastTimestamp:2026-03-20 08:22:55.593229622 +0000 UTC m=+0.810129947,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 08:23:49 crc kubenswrapper[4903]: E0320 08:23:49.663562 4903 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e7f07fa3bab11\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace 
\"default\"" event="&Event{ObjectMeta:{crc.189e7f07fa3bab11 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:22:55.470267153 +0000 UTC m=+0.687167508,LastTimestamp:2026-03-20 08:22:55.594941659 +0000 UTC m=+0.811841994,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 08:23:49 crc kubenswrapper[4903]: E0320 08:23:49.670990 4903 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e7f07fa3d0fcb\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e7f07fa3d0fcb default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:22:55.470358475 +0000 UTC m=+0.687258830,LastTimestamp:2026-03-20 08:22:55.59495974 +0000 UTC m=+0.811860065,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 08:23:49 crc kubenswrapper[4903]: E0320 08:23:49.678822 4903 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e7f07fa3d5d10\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e7f07fa3d5d10 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:22:55.470378256 +0000 UTC m=+0.687278611,LastTimestamp:2026-03-20 08:22:55.59497385 +0000 UTC m=+0.811874175,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 08:23:49 crc kubenswrapper[4903]: E0320 08:23:49.686391 4903 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e7f07fa3bab11\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e7f07fa3bab11 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:22:55.470267153 +0000 UTC m=+0.687167508,LastTimestamp:2026-03-20 08:22:55.596865482 +0000 UTC m=+0.813765817,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 08:23:49 crc kubenswrapper[4903]: E0320 08:23:49.693817 4903 event.go:359] "Server rejected event (will not retry!)" err="events 
\"crc.189e7f07fa3d0fcb\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e7f07fa3d0fcb default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:22:55.470358475 +0000 UTC m=+0.687258830,LastTimestamp:2026-03-20 08:22:55.596886563 +0000 UTC m=+0.813786898,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 08:23:49 crc kubenswrapper[4903]: E0320 08:23:49.700966 4903 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e7f07fa3d5d10\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e7f07fa3d5d10 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:22:55.470378256 +0000 UTC m=+0.687278611,LastTimestamp:2026-03-20 08:22:55.596898873 +0000 UTC m=+0.813799198,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 08:23:49 crc kubenswrapper[4903]: E0320 08:23:49.708113 4903 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e7f07fa3bab11\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e7f07fa3bab11 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:22:55.470267153 +0000 UTC m=+0.687167508,LastTimestamp:2026-03-20 08:22:55.598133557 +0000 UTC m=+0.815033892,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 08:23:49 crc kubenswrapper[4903]: E0320 08:23:49.715816 4903 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e7f07fa3d0fcb\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e7f07fa3d0fcb default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:22:55.470358475 +0000 UTC m=+0.687258830,LastTimestamp:2026-03-20 08:22:55.598149838 +0000 UTC m=+0.815050163,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 08:23:49 crc 
kubenswrapper[4903]: E0320 08:23:49.721555 4903 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e7f07fa3d5d10\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e7f07fa3d5d10 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:22:55.470378256 +0000 UTC m=+0.687278611,LastTimestamp:2026-03-20 08:22:55.598161468 +0000 UTC m=+0.815061793,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 08:23:49 crc kubenswrapper[4903]: E0320 08:23:49.726753 4903 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e7f07fa3bab11\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e7f07fa3bab11 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:22:55.470267153 +0000 UTC m=+0.687167508,LastTimestamp:2026-03-20 08:22:55.598400415 +0000 UTC m=+0.815300750,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 08:23:49 crc kubenswrapper[4903]: E0320 08:23:49.733920 4903 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e7f07fa3d0fcb\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e7f07fa3d0fcb default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:22:55.470358475 +0000 UTC m=+0.687258830,LastTimestamp:2026-03-20 08:22:55.598413315 +0000 UTC m=+0.815313640,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 08:23:49 crc kubenswrapper[4903]: E0320 08:23:49.740434 4903 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e7f07fa3d5d10\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e7f07fa3d5d10 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:22:55.470378256 +0000 UTC m=+0.687278611,LastTimestamp:2026-03-20 08:22:55.598424885 +0000 UTC m=+0.815325210,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 
+0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 08:23:49 crc kubenswrapper[4903]: E0320 08:23:49.745277 4903 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e7f07fa3bab11\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e7f07fa3bab11 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:22:55.470267153 +0000 UTC m=+0.687167508,LastTimestamp:2026-03-20 08:22:55.600513694 +0000 UTC m=+0.817414029,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 08:23:49 crc kubenswrapper[4903]: E0320 08:23:49.751612 4903 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e7f07fa3d0fcb\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e7f07fa3d0fcb default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:22:55.470358475 +0000 UTC m=+0.687258830,LastTimestamp:2026-03-20 08:22:55.600527234 +0000 UTC m=+0.817427559,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 08:23:49 crc kubenswrapper[4903]: E0320 08:23:49.758403 4903 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e7f07fa3d5d10\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e7f07fa3d5d10 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:22:55.470378256 +0000 UTC m=+0.687278611,LastTimestamp:2026-03-20 08:22:55.600538324 +0000 UTC m=+0.817438659,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 08:23:49 crc kubenswrapper[4903]: E0320 08:23:49.765283 4903 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e7f07fa3bab11\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e7f07fa3bab11 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:22:55.470267153 +0000 UTC 
m=+0.687167508,LastTimestamp:2026-03-20 08:22:55.601466219 +0000 UTC m=+0.818366584,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 08:23:49 crc kubenswrapper[4903]: E0320 08:23:49.772477 4903 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e7f07fa3d0fcb\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e7f07fa3d0fcb default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:22:55.470358475 +0000 UTC m=+0.687258830,LastTimestamp:2026-03-20 08:22:55.601556742 +0000 UTC m=+0.818457107,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 08:23:49 crc kubenswrapper[4903]: E0320 08:23:49.779986 4903 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189e7f081997028d openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:22:55.996347021 +0000 UTC m=+1.213247376,LastTimestamp:2026-03-20 08:22:55.996347021 +0000 UTC m=+1.213247376,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 08:23:49 crc kubenswrapper[4903]: E0320 08:23:49.784583 4903 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e7f08199827ae openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:22:55.996422062 +0000 UTC m=+1.213322397,LastTimestamp:2026-03-20 08:22:55.996422062 +0000 UTC m=+1.213322397,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 08:23:49 crc kubenswrapper[4903]: E0320 08:23:49.789456 4903 event.go:359] "Server rejected event (will not retry!)" err="events is 
forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e7f081a0be7f0 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:22:56.00400792 +0000 UTC m=+1.220908275,LastTimestamp:2026-03-20 08:22:56.00400792 +0000 UTC m=+1.220908275,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 08:23:49 crc kubenswrapper[4903]: E0320 08:23:49.793923 4903 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e7f081ab99b81 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:22:56.015391617 +0000 UTC m=+1.232291952,LastTimestamp:2026-03-20 08:22:56.015391617 +0000 UTC m=+1.232291952,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 08:23:49 crc kubenswrapper[4903]: E0320 08:23:49.799427 4903 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e7f081acb6f32 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:22:56.016559922 +0000 UTC m=+1.233460247,LastTimestamp:2026-03-20 08:22:56.016559922 +0000 UTC m=+1.233460247,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 08:23:49 crc kubenswrapper[4903]: E0320 08:23:49.804817 4903 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" 
cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e7f08406895e2 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Created,Message:Created container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:22:56.64761597 +0000 UTC m=+1.864516295,LastTimestamp:2026-03-20 08:22:56.64761597 +0000 UTC m=+1.864516295,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 08:23:49 crc kubenswrapper[4903]: E0320 08:23:49.808862 4903 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e7f08407029c0 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:22:56.648112576 +0000 UTC m=+1.865012901,LastTimestamp:2026-03-20 08:22:56.648112576 +0000 UTC m=+1.865012901,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 08:23:49 crc kubenswrapper[4903]: E0320 08:23:49.812847 4903 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e7f0840761a47 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:22:56.648501831 +0000 UTC m=+1.865402146,LastTimestamp:2026-03-20 08:22:56.648501831 +0000 UTC m=+1.865402146,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 08:23:49 crc kubenswrapper[4903]: E0320 08:23:49.818209 4903 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e7f0840a47df0 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Created,Message:Created container 
wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:22:56.651542 +0000 UTC m=+1.868442315,LastTimestamp:2026-03-20 08:22:56.651542 +0000 UTC m=+1.868442315,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 08:23:49 crc kubenswrapper[4903]: E0320 08:23:49.823078 4903 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e7f0840e9c6bb openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Started,Message:Started container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:22:56.656082619 +0000 UTC m=+1.872982934,LastTimestamp:2026-03-20 08:22:56.656082619 +0000 UTC m=+1.872982934,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 08:23:49 crc kubenswrapper[4903]: E0320 08:23:49.827068 4903 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e7f0840ffcde8 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:22:56.657526248 +0000 UTC m=+1.874426563,LastTimestamp:2026-03-20 08:22:56.657526248 +0000 UTC m=+1.874426563,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 08:23:49 crc kubenswrapper[4903]: E0320 08:23:49.830570 4903 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e7f084109acb1 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:22:56.658173105 +0000 UTC m=+1.875073420,LastTimestamp:2026-03-20 08:22:56.658173105 +0000 UTC m=+1.875073420,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 08:23:49 crc kubenswrapper[4903]: E0320 08:23:49.834657 4903 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189e7f08411f43ae openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:22:56.659588014 +0000 UTC m=+1.876488329,LastTimestamp:2026-03-20 08:22:56.659588014 +0000 UTC m=+1.876488329,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 08:23:49 crc kubenswrapper[4903]: E0320 08:23:49.838561 4903 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e7f084120f335 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:22:56.659698485 +0000 UTC m=+1.876598800,LastTimestamp:2026-03-20 08:22:56.659698485 +0000 UTC m=+1.876598800,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 08:23:49 crc kubenswrapper[4903]: E0320 08:23:49.845615 4903 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e7f0841239fbf openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Started,Message:Started container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:22:56.659873727 +0000 UTC m=+1.876774082,LastTimestamp:2026-03-20 08:22:56.659873727 +0000 UTC m=+1.876774082,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 08:23:49 crc kubenswrapper[4903]: E0320 08:23:49.849999 4903 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189e7f08424bef06 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] 
[] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:22:56.679292678 +0000 UTC m=+1.896192993,LastTimestamp:2026-03-20 08:22:56.679292678 +0000 UTC m=+1.896192993,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 08:23:49 crc kubenswrapper[4903]: E0320 08:23:49.854185 4903 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e7f0851b3805b openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:22:56.937738331 +0000 UTC m=+2.154638646,LastTimestamp:2026-03-20 08:22:56.937738331 +0000 UTC m=+2.154638646,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 08:23:49 crc kubenswrapper[4903]: E0320 08:23:49.857774 4903 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e7f0852355ecc openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:22:56.94624942 +0000 UTC m=+2.163149735,LastTimestamp:2026-03-20 08:22:56.94624942 +0000 UTC m=+2.163149735,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 08:23:49 crc kubenswrapper[4903]: E0320 08:23:49.862293 4903 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e7f0852425885 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Pulled,Message:Container image 
\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:22:56.947099781 +0000 UTC m=+2.164000096,LastTimestamp:2026-03-20 08:22:56.947099781 +0000 UTC m=+2.164000096,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 08:23:49 crc kubenswrapper[4903]: E0320 08:23:49.866852 4903 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e7f085b7eb5db openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Created,Message:Created container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:22:57.102050779 +0000 UTC m=+2.318951104,LastTimestamp:2026-03-20 08:22:57.102050779 +0000 UTC m=+2.318951104,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 08:23:49 crc kubenswrapper[4903]: E0320 08:23:49.872152 4903 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e7f085c0d9d42 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Started,Message:Started container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:22:57.11141613 +0000 UTC m=+2.328316445,LastTimestamp:2026-03-20 08:22:57.11141613 +0000 UTC m=+2.328316445,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 08:23:49 crc kubenswrapper[4903]: E0320 08:23:49.876816 4903 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e7f085c1d1690 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Pulled,Message:Container image 
\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:22:57.112430224 +0000 UTC m=+2.329330539,LastTimestamp:2026-03-20 08:22:57.112430224 +0000 UTC m=+2.329330539,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 08:23:49 crc kubenswrapper[4903]: E0320 08:23:49.881577 4903 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e7f08668a84ec openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Created,Message:Created container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:22:57.28737406 +0000 UTC m=+2.504274375,LastTimestamp:2026-03-20 08:22:57.28737406 +0000 UTC m=+2.504274375,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 08:23:49 crc kubenswrapper[4903]: E0320 08:23:49.886741 4903 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e7f0866f89c86 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Started,Message:Started container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:22:57.294589062 +0000 UTC m=+2.511489377,LastTimestamp:2026-03-20 08:22:57.294589062 +0000 UTC m=+2.511489377,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 08:23:49 crc kubenswrapper[4903]: E0320 08:23:49.891568 4903 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e7f0873b23ab1 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already 
present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:22:57.508080305 +0000 UTC m=+2.724980640,LastTimestamp:2026-03-20 08:22:57.508080305 +0000 UTC m=+2.724980640,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 08:23:49 crc kubenswrapper[4903]: E0320 08:23:49.896434 4903 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e7f0873e8234d openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:22:57.511613261 +0000 UTC m=+2.728513576,LastTimestamp:2026-03-20 08:22:57.511613261 +0000 UTC m=+2.728513576,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 08:23:49 crc kubenswrapper[4903]: E0320 08:23:49.901406 4903 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e7f0874783c90 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:22:57.521056912 +0000 UTC m=+2.737957247,LastTimestamp:2026-03-20 08:22:57.521056912 +0000 UTC m=+2.737957247,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 08:23:49 crc kubenswrapper[4903]: E0320 08:23:49.905707 4903 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189e7f08749bb4ea openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:22:57.523381482 
+0000 UTC m=+2.740281797,LastTimestamp:2026-03-20 08:22:57.523381482 +0000 UTC m=+2.740281797,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 08:23:49 crc kubenswrapper[4903]: E0320 08:23:49.910419 4903 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e7f0882f1f371 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Created,Message:Created container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:22:57.763914609 +0000 UTC m=+2.980814924,LastTimestamp:2026-03-20 08:22:57.763914609 +0000 UTC m=+2.980814924,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 08:23:49 crc kubenswrapper[4903]: E0320 08:23:49.915661 4903 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e7f08830e0c68 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Created,Message:Created container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:22:57.765756008 +0000 UTC m=+2.982656323,LastTimestamp:2026-03-20 08:22:57.765756008 +0000 UTC m=+2.982656323,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 08:23:49 crc kubenswrapper[4903]: E0320 08:23:49.921719 4903 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189e7f08831562bb openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Created,Message:Created container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:22:57.766236859 +0000 UTC m=+2.983137174,LastTimestamp:2026-03-20 08:22:57.766236859 +0000 UTC m=+2.983137174,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 08:23:49 crc kubenswrapper[4903]: E0320 08:23:49.926950 4903 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource 
\"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e7f0883286f8d openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Created,Message:Created container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:22:57.767485325 +0000 UTC m=+2.984385640,LastTimestamp:2026-03-20 08:22:57.767485325 +0000 UTC m=+2.984385640,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 08:23:49 crc kubenswrapper[4903]: E0320 08:23:49.931358 4903 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e7f0883ab6aac openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Started,Message:Started container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:22:57.776069292 +0000 UTC m=+2.992969607,LastTimestamp:2026-03-20 08:22:57.776069292 +0000 UTC m=+2.992969607,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 08:23:49 crc kubenswrapper[4903]: E0320 08:23:49.936263 4903 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e7f0883c4a928 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:22:57.777723688 +0000 UTC m=+2.994623993,LastTimestamp:2026-03-20 08:22:57.777723688 +0000 UTC m=+2.994623993,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 08:23:49 crc kubenswrapper[4903]: E0320 08:23:49.940751 4903 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e7f0883c6750c openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Started,Message:Started container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:22:57.77784142 +0000 UTC m=+2.994741735,LastTimestamp:2026-03-20 08:22:57.77784142 +0000 UTC m=+2.994741735,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 08:23:49 crc kubenswrapper[4903]: E0320 08:23:49.948191 4903 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e7f0883d51a6c openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:22:57.77880126 +0000 UTC m=+2.995701575,LastTimestamp:2026-03-20 08:22:57.77880126 +0000 UTC m=+2.995701575,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 08:23:49 crc kubenswrapper[4903]: E0320 08:23:49.955411 4903 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189e7f08840ae59f openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Started,Message:Started container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:22:57.782326687 +0000 UTC m=+2.999227002,LastTimestamp:2026-03-20 08:22:57.782326687 +0000 UTC m=+2.999227002,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 08:23:49 crc kubenswrapper[4903]: E0320 08:23:49.959099 4903 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e7f0884235b8e openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Started,Message:Started container 
etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:22:57.783929742 +0000 UTC m=+3.000830057,LastTimestamp:2026-03-20 08:22:57.783929742 +0000 UTC m=+3.000830057,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 08:23:49 crc kubenswrapper[4903]: E0320 08:23:49.962923 4903 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e7f088fa3820a openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Created,Message:Created container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:22:57.976877578 +0000 UTC m=+3.193777883,LastTimestamp:2026-03-20 08:22:57.976877578 +0000 UTC m=+3.193777883,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 08:23:49 crc kubenswrapper[4903]: E0320 08:23:49.966190 4903 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e7f088fc68fc0 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Created,Message:Created container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:22:57.979174848 +0000 UTC m=+3.196075163,LastTimestamp:2026-03-20 08:22:57.979174848 +0000 UTC m=+3.196075163,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 08:23:49 crc kubenswrapper[4903]: E0320 08:23:49.969830 4903 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e7f0890374f1c openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Started,Message:Started container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:22:57.986563868 +0000 UTC m=+3.203464183,LastTimestamp:2026-03-20 08:22:57.986563868 +0000 UTC m=+3.203464183,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 08:23:49 crc kubenswrapper[4903]: E0320 
08:23:49.972360 4903 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e7f089054fb4a openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:22:57.98850849 +0000 UTC m=+3.205408815,LastTimestamp:2026-03-20 08:22:57.98850849 +0000 UTC m=+3.205408815,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 08:23:49 crc kubenswrapper[4903]: E0320 08:23:49.975910 4903 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e7f0890862d1e openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Started,Message:Started container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:22:57.99173251 +0000 UTC m=+3.208632825,LastTimestamp:2026-03-20 08:22:57.99173251 +0000 UTC m=+3.208632825,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 08:23:49 crc kubenswrapper[4903]: E0320 08:23:49.979508 4903 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e7f0890b084d8 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:22:57.99450748 +0000 UTC m=+3.211407795,LastTimestamp:2026-03-20 08:22:57.99450748 +0000 UTC m=+3.211407795,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 08:23:49 crc kubenswrapper[4903]: E0320 08:23:49.983184 4903 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create 
resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e7f089c2da266 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Created,Message:Created container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:22:58.187256422 +0000 UTC m=+3.404156737,LastTimestamp:2026-03-20 08:22:58.187256422 +0000 UTC m=+3.404156737,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 08:23:49 crc kubenswrapper[4903]: E0320 08:23:49.986562 4903 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e7f089c4e9e73 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Created,Message:Created container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:22:58.189418099 +0000 UTC m=+3.406318404,LastTimestamp:2026-03-20 08:22:58.189418099 +0000 UTC m=+3.406318404,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 08:23:49 crc kubenswrapper[4903]: E0320 08:23:49.990424 4903 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e7f089cf4d429 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Started,Message:Started container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:22:58.200310825 +0000 UTC m=+3.417211140,LastTimestamp:2026-03-20 08:22:58.200310825 +0000 UTC m=+3.417211140,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 08:23:49 crc kubenswrapper[4903]: E0320 08:23:49.994906 4903 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e7f089d077ea9 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:22:58.201534121 +0000 UTC m=+3.418434436,LastTimestamp:2026-03-20 08:22:58.201534121 +0000 UTC m=+3.418434436,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 08:23:49 crc kubenswrapper[4903]: E0320 08:23:49.999020 4903 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e7f089d2843f1 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Started,Message:Started container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:22:58.203681777 +0000 UTC m=+3.420582092,LastTimestamp:2026-03-20 08:22:58.203681777 +0000 UTC m=+3.420582092,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 08:23:50 crc kubenswrapper[4903]: E0320 08:23:50.004013 4903 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e7f08a834d9af openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Created,Message:Created container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:22:58.389055919 +0000 UTC m=+3.605956234,LastTimestamp:2026-03-20 08:22:58.389055919 +0000 UTC m=+3.605956234,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 08:23:50 crc kubenswrapper[4903]: E0320 08:23:50.008792 4903 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e7f08a8f3e36f openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Started,Message:Started container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:22:58.401575791 +0000 UTC m=+3.618476106,LastTimestamp:2026-03-20 08:22:58.401575791 +0000 UTC m=+3.618476106,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 08:23:50 crc kubenswrapper[4903]: E0320 08:23:50.012614 4903 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e7f08a906d629 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:22:58.402817577 +0000 UTC m=+3.619717902,LastTimestamp:2026-03-20 08:22:58.402817577 +0000 UTC m=+3.619717902,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 08:23:50 crc kubenswrapper[4903]: E0320 08:23:50.017646 4903 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e7f08b08871bf openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:22:58.528752063 +0000 UTC m=+3.745652388,LastTimestamp:2026-03-20 08:22:58.528752063 +0000 UTC m=+3.745652388,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 08:23:50 crc kubenswrapper[4903]: E0320 08:23:50.022126 4903 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e7f08b63b4c9f openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:22:58.624359583 +0000 UTC m=+3.841259898,LastTimestamp:2026-03-20 08:22:58.624359583 +0000 UTC m=+3.841259898,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 08:23:50 crc kubenswrapper[4903]: E0320 08:23:50.025949 4903 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e7f08b77251f4 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:22:58.644742644 +0000 UTC m=+3.861642959,LastTimestamp:2026-03-20 08:22:58.644742644 +0000 UTC m=+3.861642959,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 08:23:50 crc kubenswrapper[4903]: E0320 08:23:50.030251 4903 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e7f08bd68ee8d openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Created,Message:Created container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:22:58.744790669 +0000 UTC m=+3.961690984,LastTimestamp:2026-03-20 08:22:58.744790669 +0000 UTC m=+3.961690984,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 08:23:50 crc kubenswrapper[4903]: E0320 08:23:50.033973 4903 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e7f08be432130 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Started,Message:Started container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:22:58.75909048 +0000 UTC m=+3.975990795,LastTimestamp:2026-03-20 08:22:58.75909048 +0000 UTC m=+3.975990795,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 08:23:50 crc kubenswrapper[4903]: E0320 08:23:50.038387 4903 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e7f08eddffc37 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:22:59.557899319 +0000 UTC m=+4.774799674,LastTimestamp:2026-03-20 08:22:59.557899319 +0000 UTC m=+4.774799674,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 08:23:50 crc kubenswrapper[4903]: E0320 08:23:50.041476 4903 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e7f08f9febbee openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Created,Message:Created container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:22:59.76124107 +0000 UTC m=+4.978141385,LastTimestamp:2026-03-20 08:22:59.76124107 +0000 UTC m=+4.978141385,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 08:23:50 crc kubenswrapper[4903]: E0320 08:23:50.044734 4903 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e7f08fa8b2c14 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Started,Message:Started container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:22:59.77044482 +0000 UTC m=+4.987345135,LastTimestamp:2026-03-20 08:22:59.77044482 +0000 UTC m=+4.987345135,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 08:23:50 crc kubenswrapper[4903]: E0320 08:23:50.047846 4903 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e7f08fa992aee openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:22:59.77136203 +0000 UTC m=+4.988262345,LastTimestamp:2026-03-20 08:22:59.77136203 +0000 UTC m=+4.988262345,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 08:23:50 crc kubenswrapper[4903]: E0320 08:23:50.050988 4903 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e7f090735f41c openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Created,Message:Created container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:22:59.98296374 +0000 UTC m=+5.199864055,LastTimestamp:2026-03-20 08:22:59.98296374 +0000 UTC m=+5.199864055,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 08:23:50 crc kubenswrapper[4903]: E0320 08:23:50.054371 4903 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e7f0907e045b8 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Started,Message:Started container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:22:59.994125752 +0000 UTC m=+5.211026067,LastTimestamp:2026-03-20 08:22:59.994125752 +0000 UTC m=+5.211026067,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 08:23:50 crc kubenswrapper[4903]: E0320 08:23:50.057559 4903 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e7f0907f2700e openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:22:59.995316238 +0000 UTC m=+5.212216593,LastTimestamp:2026-03-20 08:22:59.995316238 +0000 UTC 
m=+5.212216593,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 08:23:50 crc kubenswrapper[4903]: E0320 08:23:50.060993 4903 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e7f0913f4b6dc openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Created,Message:Created container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:23:00.196792028 +0000 UTC m=+5.413692343,LastTimestamp:2026-03-20 08:23:00.196792028 +0000 UTC m=+5.413692343,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 08:23:50 crc kubenswrapper[4903]: E0320 08:23:50.064151 4903 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e7f0914985fc8 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Started,Message:Started container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:23:00.20751764 +0000 UTC m=+5.424417955,LastTimestamp:2026-03-20 08:23:00.20751764 +0000 UTC m=+5.424417955,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 08:23:50 crc kubenswrapper[4903]: E0320 08:23:50.067285 4903 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e7f0914c0615f openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:23:00.210139487 +0000 UTC m=+5.427040032,LastTimestamp:2026-03-20 08:23:00.210139487 +0000 UTC m=+5.427040032,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 08:23:50 crc kubenswrapper[4903]: E0320 08:23:50.070587 4903 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e7f09226b5417 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Created,Message:Created container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:23:00.439446551 +0000 UTC m=+5.656346866,LastTimestamp:2026-03-20 08:23:00.439446551 +0000 UTC m=+5.656346866,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 08:23:50 crc kubenswrapper[4903]: E0320 08:23:50.073673 4903 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e7f0923656d75 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Started,Message:Started container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:23:00.455837045 +0000 UTC m=+5.672737360,LastTimestamp:2026-03-20 08:23:00.455837045 +0000 UTC m=+5.672737360,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 08:23:50 crc kubenswrapper[4903]: E0320 08:23:50.077322 4903 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e7f09237d55fc openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:23:00.4574039 +0000 UTC m=+5.674304215,LastTimestamp:2026-03-20 08:23:00.4574039 +0000 UTC m=+5.674304215,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 08:23:50 crc kubenswrapper[4903]: E0320 08:23:50.080862 4903 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e7f0930f3a93c openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Created,Message:Created container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:23:00.683262268 +0000 UTC m=+5.900162583,LastTimestamp:2026-03-20 08:23:00.683262268 +0000 UTC m=+5.900162583,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 08:23:50 crc kubenswrapper[4903]: E0320 08:23:50.084408 4903 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e7f09319684b9 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Started,Message:Started container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:23:00.693935289 +0000 UTC m=+5.910835604,LastTimestamp:2026-03-20 08:23:00.693935289 +0000 UTC m=+5.910835604,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 08:23:50 crc kubenswrapper[4903]: E0320 08:23:50.088499 4903 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 20 08:23:50 crc kubenswrapper[4903]: &Event{ObjectMeta:{kube-controller-manager-crc.189e7f09b43ce992 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": context deadline exceeded (Client.Timeout exceeded while awaiting headers) Mar 20 08:23:50 crc kubenswrapper[4903]: body: Mar 20 08:23:50 crc kubenswrapper[4903]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:23:02.885878162 +0000 UTC m=+8.102778517,LastTimestamp:2026-03-20 08:23:02.885878162 +0000 UTC m=+8.102778517,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 20 08:23:50 crc kubenswrapper[4903]: > Mar 20 08:23:50 crc kubenswrapper[4903]: E0320 08:23:50.092376 4903 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e7f09b43e9097 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:23:02.885986455 +0000 UTC m=+8.102886810,LastTimestamp:2026-03-20 08:23:02.885986455 +0000 UTC m=+8.102886810,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 08:23:50 crc kubenswrapper[4903]: E0320 08:23:50.096914 4903 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 20 08:23:50 crc kubenswrapper[4903]: &Event{ObjectMeta:{kube-apiserver-crc.189e7f0b2d140407 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:ProbeError,Message:Liveness probe error: Get "https://192.168.126.11:17697/healthz": read tcp 192.168.126.11:39018->192.168.126.11:17697: read: connection reset by peer Mar 20 08:23:50 crc kubenswrapper[4903]: body: Mar 20 08:23:50 crc kubenswrapper[4903]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:23:09.208208391 +0000 UTC m=+14.425108706,LastTimestamp:2026-03-20 08:23:09.208208391 +0000 UTC m=+14.425108706,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 20 08:23:50 crc kubenswrapper[4903]: > Mar 20 08:23:50 crc kubenswrapper[4903]: E0320 08:23:50.100504 4903 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e7f0b2d14c55a openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Unhealthy,Message:Liveness probe failed: Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:39018->192.168.126.11:17697: read: connection reset by peer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:23:09.208257882 +0000 UTC m=+14.425158197,LastTimestamp:2026-03-20 08:23:09.208257882 +0000 UTC m=+14.425158197,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 08:23:50 crc kubenswrapper[4903]: E0320 08:23:50.104848 4903 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 20 08:23:50 crc kubenswrapper[4903]: &Event{ObjectMeta:{kube-apiserver-crc.189e7f0b309fd38c openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:ProbeError,Message:Readiness probe error: Get "https://192.168.126.11:17697/healthz": dial tcp 192.168.126.11:17697: connect: connection refused Mar 20 08:23:50 crc kubenswrapper[4903]: body: Mar 20 08:23:50 crc kubenswrapper[4903]: 
,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:23:09.267702668 +0000 UTC m=+14.484603023,LastTimestamp:2026-03-20 08:23:09.267702668 +0000 UTC m=+14.484603023,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 20 08:23:50 crc kubenswrapper[4903]: > Mar 20 08:23:50 crc kubenswrapper[4903]: E0320 08:23:50.108946 4903 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e7f0b30a10bd2 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Unhealthy,Message:Readiness probe failed: Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:23:09.26778261 +0000 UTC m=+14.484682955,LastTimestamp:2026-03-20 08:23:09.26778261 +0000 UTC m=+14.484682955,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 08:23:50 crc kubenswrapper[4903]: E0320 08:23:50.112937 4903 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 20 08:23:50 crc kubenswrapper[4903]: &Event{ObjectMeta:{kube-apiserver-crc.189e7f0b441e92b8 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Mar 20 08:23:50 crc kubenswrapper[4903]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 20 08:23:50 crc kubenswrapper[4903]: Mar 20 08:23:50 crc kubenswrapper[4903]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:23:09.594776248 +0000 UTC m=+14.811676573,LastTimestamp:2026-03-20 08:23:09.594776248 +0000 UTC m=+14.811676573,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 20 08:23:50 crc kubenswrapper[4903]: > Mar 20 08:23:50 crc kubenswrapper[4903]: E0320 08:23:50.116180 4903 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e7f0b441f5109 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:23:09.594824969 +0000 UTC m=+14.811725294,LastTimestamp:2026-03-20 08:23:09.594824969 +0000 UTC m=+14.811725294,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 08:23:50 crc kubenswrapper[4903]: E0320 08:23:50.120013 4903 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189e7f0b441e92b8\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 20 08:23:50 crc kubenswrapper[4903]: &Event{ObjectMeta:{kube-apiserver-crc.189e7f0b441e92b8 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Mar 20 08:23:50 crc kubenswrapper[4903]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 20 08:23:50 crc kubenswrapper[4903]: Mar 20 08:23:50 crc kubenswrapper[4903]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:23:09.594776248 +0000 UTC m=+14.811676573,LastTimestamp:2026-03-20 08:23:09.601251877 +0000 UTC m=+14.818152192,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 20 08:23:50 crc kubenswrapper[4903]: > Mar 20 08:23:50 crc kubenswrapper[4903]: E0320 08:23:50.124098 4903 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 20 08:23:50 crc kubenswrapper[4903]: &Event{ObjectMeta:{kube-controller-manager-crc.189e7f0c0845843a openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 20 08:23:50 crc kubenswrapper[4903]: body: Mar 20 08:23:50 crc kubenswrapper[4903]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:23:12.885662778 +0000 UTC m=+18.102563093,LastTimestamp:2026-03-20 08:23:12.885662778 +0000 UTC m=+18.102563093,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 20 08:23:50 crc kubenswrapper[4903]: > Mar 20 08:23:50 crc kubenswrapper[4903]: 
E0320 08:23:50.127481 4903 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e7f0c08466711 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:23:12.885720849 +0000 UTC m=+18.102621164,LastTimestamp:2026-03-20 08:23:12.885720849 +0000 UTC m=+18.102621164,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 08:23:50 crc kubenswrapper[4903]: E0320 08:23:50.132048 4903 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e7f0c0845843a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 20 08:23:50 crc kubenswrapper[4903]: &Event{ObjectMeta:{kube-controller-manager-crc.189e7f0c0845843a openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 20 08:23:50 crc kubenswrapper[4903]: body: Mar 20 08:23:50 crc kubenswrapper[4903]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:23:12.885662778 +0000 UTC m=+18.102563093,LastTimestamp:2026-03-20 08:23:22.886296004 +0000 UTC m=+28.103196359,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 20 08:23:50 crc kubenswrapper[4903]: > Mar 20 08:23:50 crc kubenswrapper[4903]: E0320 08:23:50.135689 4903 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e7f0c08466711\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e7f0c08466711 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting 
headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:23:12.885720849 +0000 UTC m=+18.102621164,LastTimestamp:2026-03-20 08:23:22.886378666 +0000 UTC m=+28.103278991,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 08:23:50 crc kubenswrapper[4903]: E0320 08:23:50.139375 4903 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e7f0e5c883624 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Killing,Message:Container cluster-policy-controller failed startup probe, will be restarted,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:23:22.889254436 +0000 UTC m=+28.106154791,LastTimestamp:2026-03-20 08:23:22.889254436 +0000 UTC m=+28.106154791,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 08:23:50 crc kubenswrapper[4903]: E0320 08:23:50.142951 4903 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e7f0840ffcde8\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e7f0840ffcde8 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:22:56.657526248 +0000 UTC m=+1.874426563,LastTimestamp:2026-03-20 08:23:23.008899384 +0000 UTC m=+28.225799739,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 08:23:50 crc kubenswrapper[4903]: E0320 08:23:50.146460 4903 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e7f0851b3805b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e7f0851b3805b openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container 
cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:22:56.937738331 +0000 UTC m=+2.154638646,LastTimestamp:2026-03-20 08:23:23.220579611 +0000 UTC m=+28.437479966,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 08:23:50 crc kubenswrapper[4903]: E0320 08:23:50.150506 4903 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e7f0852355ecc\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e7f0852355ecc openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:22:56.94624942 +0000 UTC m=+2.163149735,LastTimestamp:2026-03-20 08:23:23.248068377 +0000 UTC m=+28.464968692,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 08:23:50 crc kubenswrapper[4903]: E0320 08:23:50.155238 4903 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e7f0c0845843a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 20 08:23:50 crc kubenswrapper[4903]: &Event{ObjectMeta:{kube-controller-manager-crc.189e7f0c0845843a openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 20 08:23:50 crc kubenswrapper[4903]: body: Mar 20 08:23:50 crc kubenswrapper[4903]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:23:12.885662778 +0000 UTC m=+18.102563093,LastTimestamp:2026-03-20 08:23:32.885854886 +0000 UTC m=+38.102755201,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 20 08:23:50 crc kubenswrapper[4903]: > Mar 20 08:23:50 crc kubenswrapper[4903]: E0320 08:23:50.159359 4903 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e7f0c08466711\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e7f0c08466711 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:23:12.885720849 +0000 UTC m=+18.102621164,LastTimestamp:2026-03-20 08:23:32.885898438 +0000 UTC m=+38.102798753,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 08:23:50 crc kubenswrapper[4903]: E0320 08:23:50.165593 4903 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e7f0c0845843a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 20 08:23:50 crc kubenswrapper[4903]: &Event{ObjectMeta:{kube-controller-manager-crc.189e7f0c0845843a openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 20 08:23:50 crc kubenswrapper[4903]: body: Mar 20 08:23:50 crc kubenswrapper[4903]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:23:12.885662778 +0000 UTC m=+18.102563093,LastTimestamp:2026-03-20 08:23:42.886202982 +0000 UTC m=+48.103103297,Count:4,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 20 08:23:50 crc kubenswrapper[4903]: > Mar 20 08:23:50 crc kubenswrapper[4903]: W0320 08:23:50.309462 4903 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:anonymous" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope Mar 20 08:23:50 crc kubenswrapper[4903]: E0320 08:23:50.309548 4903 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" Mar 20 08:23:50 crc kubenswrapper[4903]: I0320 08:23:50.395455 4903 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 08:23:50 crc kubenswrapper[4903]: I0320 08:23:50.490069 4903 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 08:23:50 crc kubenswrapper[4903]: I0320 08:23:50.491376 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" 
Mar 20 08:23:50 crc kubenswrapper[4903]: I0320 08:23:50.491422 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 08:23:50 crc kubenswrapper[4903]: I0320 08:23:50.491433 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 08:23:50 crc kubenswrapper[4903]: I0320 08:23:50.492057 4903 scope.go:117] "RemoveContainer" containerID="d03e0f1a252388ab5c9eceeca9ba587a37408ac375fd5c1f8d2aa550e63af428" Mar 20 08:23:50 crc kubenswrapper[4903]: E0320 08:23:50.492224 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 08:23:51 crc kubenswrapper[4903]: E0320 08:23:51.007363 4903 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 20 08:23:51 crc kubenswrapper[4903]: I0320 08:23:51.019783 4903 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 08:23:51 crc kubenswrapper[4903]: I0320 08:23:51.021222 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 08:23:51 crc kubenswrapper[4903]: I0320 08:23:51.021272 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 08:23:51 crc kubenswrapper[4903]: I0320 08:23:51.021293 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 08:23:51 crc kubenswrapper[4903]: I0320 08:23:51.021324 4903 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 08:23:51 crc kubenswrapper[4903]: E0320 08:23:51.022476 4903 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 20 08:23:51 crc kubenswrapper[4903]: I0320 08:23:51.397703 4903 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 08:23:52 crc kubenswrapper[4903]: I0320 08:23:52.396760 4903 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 08:23:52 crc kubenswrapper[4903]: I0320 08:23:52.885931 4903 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 08:23:52 crc kubenswrapper[4903]: I0320 08:23:52.886056 4903 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 20 08:23:52 crc kubenswrapper[4903]: I0320 08:23:52.886126 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 08:23:52 crc kubenswrapper[4903]: I0320 08:23:52.886333 4903 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 08:23:52 crc kubenswrapper[4903]: I0320 08:23:52.887969 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 08:23:52 crc kubenswrapper[4903]: I0320 08:23:52.888061 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 08:23:52 crc kubenswrapper[4903]: I0320 08:23:52.888077 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 08:23:52 crc kubenswrapper[4903]: I0320 08:23:52.888596 4903 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"c1bee1ec27b18be528bfc7b577684c9d6e17f7cf57e902ff00c9f94f1f3a609e"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted" Mar 20 08:23:52 crc kubenswrapper[4903]: I0320 08:23:52.888705 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" containerID="cri-o://c1bee1ec27b18be528bfc7b577684c9d6e17f7cf57e902ff00c9f94f1f3a609e" gracePeriod=30 Mar 20 08:23:53 crc kubenswrapper[4903]: I0320 08:23:53.397898 4903 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 08:23:53 crc kubenswrapper[4903]: I0320 08:23:53.801723 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log" Mar 20 08:23:53 crc kubenswrapper[4903]: I0320 08:23:53.803180 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 20 08:23:53 crc kubenswrapper[4903]: I0320 08:23:53.803697 4903 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="c1bee1ec27b18be528bfc7b577684c9d6e17f7cf57e902ff00c9f94f1f3a609e" exitCode=255 Mar 20 08:23:53 crc kubenswrapper[4903]: I0320 08:23:53.803745 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"c1bee1ec27b18be528bfc7b577684c9d6e17f7cf57e902ff00c9f94f1f3a609e"} Mar 20 08:23:53 crc kubenswrapper[4903]: I0320 08:23:53.803794 4903 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"244973be84b59058f68be9755f109463af6a77c3bf56445ac005260ddc5aa794"} Mar 20 08:23:53 crc kubenswrapper[4903]: I0320 08:23:53.803812 4903 scope.go:117] "RemoveContainer" containerID="ae75429275f0159b6c1185aa41107dac0937f87a905e7218b2cbfa9080f692b2" Mar 20 08:23:53 crc kubenswrapper[4903]: I0320 08:23:53.803973 4903 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 08:23:53 crc kubenswrapper[4903]: I0320 08:23:53.805486 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 08:23:53 crc kubenswrapper[4903]: I0320 08:23:53.805522 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 08:23:53 crc kubenswrapper[4903]: I0320 08:23:53.805530 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 08:23:54 crc kubenswrapper[4903]: I0320 08:23:54.398170 4903 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 08:23:54 crc kubenswrapper[4903]: I0320 08:23:54.809578 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log" Mar 20 08:23:55 crc kubenswrapper[4903]: I0320 08:23:55.399153 4903 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 08:23:55 crc kubenswrapper[4903]: E0320 08:23:55.587342 4903 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 20 08:23:56 crc kubenswrapper[4903]: I0320 08:23:56.396559 4903 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 08:23:57 crc kubenswrapper[4903]: I0320 08:23:57.396572 4903 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 08:23:58 crc kubenswrapper[4903]: E0320 08:23:58.015277 4903 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 20 08:23:58 crc kubenswrapper[4903]: I0320 08:23:58.023369 4903 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 08:23:58 crc kubenswrapper[4903]: I0320 08:23:58.025240 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 08:23:58 crc kubenswrapper[4903]: I0320 08:23:58.025307 4903 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasNoDiskPressure" Mar 20 08:23:58 crc kubenswrapper[4903]: I0320 08:23:58.025321 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 08:23:58 crc kubenswrapper[4903]: I0320 08:23:58.025363 4903 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 08:23:58 crc kubenswrapper[4903]: E0320 08:23:58.032313 4903 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 20 08:23:58 crc kubenswrapper[4903]: I0320 08:23:58.397340 4903 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 08:23:59 crc kubenswrapper[4903]: I0320 08:23:59.397733 4903 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 08:23:59 crc kubenswrapper[4903]: I0320 08:23:59.884845 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 08:23:59 crc kubenswrapper[4903]: I0320 08:23:59.885096 4903 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 08:23:59 crc kubenswrapper[4903]: I0320 08:23:59.887735 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 08:23:59 crc kubenswrapper[4903]: I0320 08:23:59.887793 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 08:23:59 crc kubenswrapper[4903]: I0320 08:23:59.887804 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 08:23:59 crc kubenswrapper[4903]: I0320 08:23:59.889056 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 08:24:00 crc kubenswrapper[4903]: I0320 08:24:00.395424 4903 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 08:24:00 crc kubenswrapper[4903]: I0320 08:24:00.638281 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 08:24:00 crc kubenswrapper[4903]: I0320 08:24:00.827842 4903 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 08:24:00 crc kubenswrapper[4903]: I0320 08:24:00.828849 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 08:24:00 crc kubenswrapper[4903]: I0320 08:24:00.828879 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 08:24:00 crc kubenswrapper[4903]: I0320 08:24:00.828889 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 08:24:01 crc kubenswrapper[4903]: I0320 
08:24:01.395318 4903 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 08:24:01 crc kubenswrapper[4903]: I0320 08:24:01.830578 4903 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 08:24:01 crc kubenswrapper[4903]: I0320 08:24:01.831736 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 08:24:01 crc kubenswrapper[4903]: I0320 08:24:01.831777 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 08:24:01 crc kubenswrapper[4903]: I0320 08:24:01.831786 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 08:24:02 crc kubenswrapper[4903]: I0320 08:24:02.398506 4903 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 08:24:02 crc kubenswrapper[4903]: W0320 08:24:02.668666 4903 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: nodes "crc" is forbidden: User "system:anonymous" cannot list resource "nodes" in API group "" at the cluster scope Mar 20 08:24:02 crc kubenswrapper[4903]: E0320 08:24:02.668727 4903 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: nodes \"crc\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" Mar 20 08:24:03 crc kubenswrapper[4903]: W0320 08:24:03.164957 4903 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User "system:anonymous" cannot list resource "runtimeclasses" in API group "node.k8s.io" at the cluster scope Mar 20 08:24:03 crc kubenswrapper[4903]: E0320 08:24:03.165101 4903 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"runtimeclasses\" in API group \"node.k8s.io\" at the cluster scope" logger="UnhandledError" Mar 20 08:24:03 crc kubenswrapper[4903]: I0320 08:24:03.398167 4903 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 08:24:03 crc kubenswrapper[4903]: I0320 08:24:03.490340 4903 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 08:24:03 crc kubenswrapper[4903]: I0320 08:24:03.491691 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 08:24:03 crc kubenswrapper[4903]: I0320 08:24:03.491764 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 08:24:03 crc kubenswrapper[4903]: I0320 08:24:03.491781 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Mar 20 08:24:03 crc kubenswrapper[4903]: I0320 08:24:03.492804 4903 scope.go:117] "RemoveContainer" containerID="d03e0f1a252388ab5c9eceeca9ba587a37408ac375fd5c1f8d2aa550e63af428" Mar 20 08:24:03 crc kubenswrapper[4903]: I0320 08:24:03.837752 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 20 08:24:03 crc kubenswrapper[4903]: I0320 08:24:03.840143 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"18e2da8edfca6135b75a943a7fe0a7954decfbea090fffa730262edb294ae4fa"} Mar 20 08:24:03 crc kubenswrapper[4903]: I0320 08:24:03.840302 4903 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 08:24:03 crc kubenswrapper[4903]: I0320 08:24:03.841678 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 08:24:03 crc kubenswrapper[4903]: I0320 08:24:03.841724 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 08:24:03 crc kubenswrapper[4903]: I0320 08:24:03.841738 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 08:24:04 crc kubenswrapper[4903]: I0320 08:24:04.399462 4903 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 08:24:04 crc kubenswrapper[4903]: I0320 08:24:04.846841 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 20 08:24:04 crc kubenswrapper[4903]: I0320 08:24:04.847813 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 20 08:24:04 crc kubenswrapper[4903]: I0320 08:24:04.849931 4903 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="18e2da8edfca6135b75a943a7fe0a7954decfbea090fffa730262edb294ae4fa" exitCode=255 Mar 20 08:24:04 crc kubenswrapper[4903]: I0320 08:24:04.849971 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"18e2da8edfca6135b75a943a7fe0a7954decfbea090fffa730262edb294ae4fa"} Mar 20 08:24:04 crc kubenswrapper[4903]: I0320 08:24:04.850007 4903 scope.go:117] "RemoveContainer" containerID="d03e0f1a252388ab5c9eceeca9ba587a37408ac375fd5c1f8d2aa550e63af428" Mar 20 08:24:04 crc kubenswrapper[4903]: I0320 08:24:04.850183 4903 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 08:24:04 crc kubenswrapper[4903]: I0320 08:24:04.851256 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 08:24:04 crc kubenswrapper[4903]: I0320 08:24:04.851286 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 08:24:04 crc kubenswrapper[4903]: I0320 08:24:04.851297 
4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 08:24:04 crc kubenswrapper[4903]: I0320 08:24:04.851840 4903 scope.go:117] "RemoveContainer" containerID="18e2da8edfca6135b75a943a7fe0a7954decfbea090fffa730262edb294ae4fa" Mar 20 08:24:04 crc kubenswrapper[4903]: E0320 08:24:04.852014 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 08:24:05 crc kubenswrapper[4903]: E0320 08:24:05.020409 4903 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 20 08:24:05 crc kubenswrapper[4903]: I0320 08:24:05.033431 4903 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 08:24:05 crc kubenswrapper[4903]: I0320 08:24:05.034921 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 08:24:05 crc kubenswrapper[4903]: I0320 08:24:05.034986 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 08:24:05 crc kubenswrapper[4903]: I0320 08:24:05.035003 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 08:24:05 crc kubenswrapper[4903]: I0320 08:24:05.035064 4903 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 08:24:05 crc kubenswrapper[4903]: E0320 08:24:05.042195 4903 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 20 08:24:05 crc kubenswrapper[4903]: I0320 08:24:05.396286 4903 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 08:24:05 crc kubenswrapper[4903]: E0320 08:24:05.587513 4903 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 20 08:24:05 crc kubenswrapper[4903]: I0320 08:24:05.855011 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 20 08:24:06 crc kubenswrapper[4903]: I0320 08:24:06.400261 4903 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 08:24:07 crc kubenswrapper[4903]: I0320 08:24:07.114198 4903 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 20 08:24:07 crc kubenswrapper[4903]: I0320 08:24:07.131376 4903 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from 
k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 20 08:24:07 crc kubenswrapper[4903]: I0320 08:24:07.401257 4903 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 08:24:08 crc kubenswrapper[4903]: I0320 08:24:08.009777 4903 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 08:24:08 crc kubenswrapper[4903]: I0320 08:24:08.009989 4903 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 08:24:08 crc kubenswrapper[4903]: I0320 08:24:08.011554 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 08:24:08 crc kubenswrapper[4903]: I0320 08:24:08.011622 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 08:24:08 crc kubenswrapper[4903]: I0320 08:24:08.011643 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 08:24:08 crc kubenswrapper[4903]: I0320 08:24:08.012562 4903 scope.go:117] "RemoveContainer" containerID="18e2da8edfca6135b75a943a7fe0a7954decfbea090fffa730262edb294ae4fa" Mar 20 08:24:08 crc kubenswrapper[4903]: E0320 08:24:08.012875 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 08:24:08 crc kubenswrapper[4903]: I0320 08:24:08.399304 4903 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 08:24:09 crc kubenswrapper[4903]: I0320 08:24:09.267776 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 08:24:09 crc kubenswrapper[4903]: I0320 08:24:09.268125 4903 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 08:24:09 crc kubenswrapper[4903]: I0320 08:24:09.269623 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 08:24:09 crc kubenswrapper[4903]: I0320 08:24:09.269684 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 08:24:09 crc kubenswrapper[4903]: I0320 08:24:09.269700 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 08:24:09 crc kubenswrapper[4903]: I0320 08:24:09.270476 4903 scope.go:117] "RemoveContainer" containerID="18e2da8edfca6135b75a943a7fe0a7954decfbea090fffa730262edb294ae4fa" Mar 20 08:24:09 crc kubenswrapper[4903]: E0320 08:24:09.270984 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 08:24:09 crc kubenswrapper[4903]: I0320 08:24:09.400137 4903 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 08:24:10 crc kubenswrapper[4903]: I0320 08:24:10.395512 4903 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 08:24:10 crc kubenswrapper[4903]: I0320 08:24:10.472643 4903 csr.go:261] certificate signing request csr-bsc5h is approved, waiting to be issued Mar 20 08:24:10 crc kubenswrapper[4903]: I0320 08:24:10.483503 4903 csr.go:257] certificate signing request csr-bsc5h is issued Mar 20 08:24:10 crc kubenswrapper[4903]: I0320 08:24:10.489852 4903 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 08:24:10 crc kubenswrapper[4903]: I0320 08:24:10.491417 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 08:24:10 crc kubenswrapper[4903]: I0320 08:24:10.491530 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 08:24:10 crc kubenswrapper[4903]: I0320 08:24:10.491553 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 08:24:10 crc kubenswrapper[4903]: I0320 08:24:10.502317 4903 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Mar 20 08:24:10 crc kubenswrapper[4903]: I0320 08:24:10.641206 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 08:24:10 crc kubenswrapper[4903]: I0320 08:24:10.641346 4903 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 08:24:10 crc kubenswrapper[4903]: I0320 08:24:10.642446 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 08:24:10 crc kubenswrapper[4903]: I0320 08:24:10.642477 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 08:24:10 crc kubenswrapper[4903]: I0320 08:24:10.642486 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 08:24:11 crc kubenswrapper[4903]: I0320 08:24:11.233587 4903 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Mar 20 08:24:11 crc kubenswrapper[4903]: I0320 08:24:11.485608 4903 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-12-03 17:43:55.27573676 +0000 UTC Mar 20 08:24:11 crc kubenswrapper[4903]: I0320 08:24:11.485685 4903 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 6201h19m43.790058765s for next certificate rotation Mar 20 08:24:12 crc kubenswrapper[4903]: I0320 08:24:12.042983 4903 kubelet_node_status.go:401] "Setting node annotation to enable volume 
controller attach/detach" Mar 20 08:24:12 crc kubenswrapper[4903]: I0320 08:24:12.045125 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 08:24:12 crc kubenswrapper[4903]: I0320 08:24:12.045187 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 08:24:12 crc kubenswrapper[4903]: I0320 08:24:12.045206 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 08:24:12 crc kubenswrapper[4903]: I0320 08:24:12.045385 4903 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 08:24:12 crc kubenswrapper[4903]: I0320 08:24:12.056356 4903 kubelet_node_status.go:115] "Node was previously registered" node="crc" Mar 20 08:24:12 crc kubenswrapper[4903]: I0320 08:24:12.056735 4903 kubelet_node_status.go:79] "Successfully registered node" node="crc" Mar 20 08:24:12 crc kubenswrapper[4903]: E0320 08:24:12.056769 4903 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Mar 20 08:24:12 crc kubenswrapper[4903]: I0320 08:24:12.061186 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 08:24:12 crc kubenswrapper[4903]: I0320 08:24:12.061234 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 08:24:12 crc kubenswrapper[4903]: I0320 08:24:12.061254 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 08:24:12 crc kubenswrapper[4903]: I0320 08:24:12.061284 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 08:24:12 crc kubenswrapper[4903]: I0320 08:24:12.061308 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T08:24:12Z","lastTransitionTime":"2026-03-20T08:24:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 08:24:12 crc kubenswrapper[4903]: E0320 08:24:12.082494 4903 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T08:24:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T08:24:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T08:24:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T08:24:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2fafe47f-e5df-46e6-9c53-b5b631ab61f4\\\",\\\"systemUUID\\\":\\\"39716343-11aa-4130-bd5e-584ebc4907c0\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 08:24:12 crc kubenswrapper[4903]: I0320 08:24:12.091541 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 08:24:12 crc kubenswrapper[4903]: I0320 08:24:12.091592 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 08:24:12 crc kubenswrapper[4903]: I0320 08:24:12.091609 4903 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 08:24:12 crc kubenswrapper[4903]: I0320 08:24:12.091632 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 08:24:12 crc kubenswrapper[4903]: I0320 08:24:12.091649 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T08:24:12Z","lastTransitionTime":"2026-03-20T08:24:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 08:24:12 crc kubenswrapper[4903]: E0320 08:24:12.108373 4903 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T08:24:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T08:24:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T08:24:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T08:24:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2fafe47f-e5df-46e6-9c53-b5b631ab61f4\\\",\\\"systemUUID\\\":\\\"39716343-11aa-4130-bd5e-584ebc4907c0\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 08:24:12 crc kubenswrapper[4903]: I0320 08:24:12.120575 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 08:24:12 crc kubenswrapper[4903]: I0320 08:24:12.120628 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 08:24:12 crc kubenswrapper[4903]: I0320 08:24:12.120647 4903 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 08:24:12 crc kubenswrapper[4903]: I0320 08:24:12.120672 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 08:24:12 crc kubenswrapper[4903]: I0320 08:24:12.120690 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T08:24:12Z","lastTransitionTime":"2026-03-20T08:24:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 08:24:12 crc kubenswrapper[4903]: E0320 08:24:12.139117 4903 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T08:24:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T08:24:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T08:24:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T08:24:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2fafe47f-e5df-46e6-9c53-b5b631ab61f4\\\",\\\"systemUUID\\\":\\\"39716343-11aa-4130-bd5e-584ebc4907c0\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 08:24:12 crc kubenswrapper[4903]: I0320 08:24:12.150674 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 08:24:12 crc kubenswrapper[4903]: I0320 08:24:12.150742 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 08:24:12 crc kubenswrapper[4903]: I0320 08:24:12.150767 4903 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 08:24:12 crc kubenswrapper[4903]: I0320 08:24:12.150796 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 08:24:12 crc kubenswrapper[4903]: I0320 08:24:12.150821 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T08:24:12Z","lastTransitionTime":"2026-03-20T08:24:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 08:24:12 crc kubenswrapper[4903]: E0320 08:24:12.166602 4903 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T08:24:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T08:24:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T08:24:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T08:24:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2fafe47f-e5df-46e6-9c53-b5b631ab61f4\\\",\\\"systemUUID\\\":\\\"39716343-11aa-4130-bd5e-584ebc4907c0\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 08:24:12 crc kubenswrapper[4903]: E0320 08:24:12.166824 4903 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 08:24:12 crc kubenswrapper[4903]: E0320 08:24:12.166874 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:12 crc kubenswrapper[4903]: E0320 08:24:12.267559 4903 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:12 crc kubenswrapper[4903]: E0320 08:24:12.368719 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:12 crc kubenswrapper[4903]: E0320 08:24:12.469251 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:12 crc kubenswrapper[4903]: E0320 08:24:12.570095 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:12 crc kubenswrapper[4903]: E0320 08:24:12.671148 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:12 crc kubenswrapper[4903]: E0320 08:24:12.772293 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:12 crc kubenswrapper[4903]: E0320 08:24:12.872957 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:12 crc kubenswrapper[4903]: E0320 08:24:12.973180 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:13 crc kubenswrapper[4903]: I0320 08:24:13.020914 4903 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 20 08:24:13 crc kubenswrapper[4903]: E0320 08:24:13.073549 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:13 crc kubenswrapper[4903]: E0320 08:24:13.174721 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:13 crc kubenswrapper[4903]: E0320 08:24:13.275915 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:13 crc kubenswrapper[4903]: E0320 08:24:13.376898 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:13 crc kubenswrapper[4903]: E0320 08:24:13.477696 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:13 crc kubenswrapper[4903]: E0320 08:24:13.578630 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:13 crc kubenswrapper[4903]: E0320 08:24:13.679685 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:13 crc kubenswrapper[4903]: E0320 08:24:13.780901 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:13 crc kubenswrapper[4903]: E0320 08:24:13.881019 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:13 crc kubenswrapper[4903]: E0320 08:24:13.981661 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:14 crc kubenswrapper[4903]: E0320 08:24:14.082628 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:14 crc kubenswrapper[4903]: E0320 08:24:14.183228 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:14 crc kubenswrapper[4903]: E0320 
08:24:14.283699 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:14 crc kubenswrapper[4903]: E0320 08:24:14.383828 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:14 crc kubenswrapper[4903]: E0320 08:24:14.484011 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:14 crc kubenswrapper[4903]: E0320 08:24:14.584225 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:14 crc kubenswrapper[4903]: E0320 08:24:14.685203 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:14 crc kubenswrapper[4903]: E0320 08:24:14.785550 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:14 crc kubenswrapper[4903]: E0320 08:24:14.886695 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:14 crc kubenswrapper[4903]: E0320 08:24:14.987720 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:15 crc kubenswrapper[4903]: E0320 08:24:15.088277 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:15 crc kubenswrapper[4903]: E0320 08:24:15.189315 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:15 crc kubenswrapper[4903]: E0320 08:24:15.289957 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:15 crc kubenswrapper[4903]: E0320 08:24:15.390086 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:15 crc kubenswrapper[4903]: E0320 08:24:15.490737 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:15 crc kubenswrapper[4903]: E0320 08:24:15.587624 4903 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 20 08:24:15 crc kubenswrapper[4903]: E0320 08:24:15.590844 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:15 crc kubenswrapper[4903]: E0320 08:24:15.691509 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:15 crc kubenswrapper[4903]: E0320 08:24:15.792206 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:15 crc kubenswrapper[4903]: E0320 08:24:15.892944 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:15 crc kubenswrapper[4903]: E0320 08:24:15.993548 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:16 crc kubenswrapper[4903]: E0320 08:24:16.094501 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:16 crc kubenswrapper[4903]: E0320 08:24:16.195615 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 
08:24:16 crc kubenswrapper[4903]: E0320 08:24:16.295998 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:16 crc kubenswrapper[4903]: E0320 08:24:16.396754 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:16 crc kubenswrapper[4903]: E0320 08:24:16.497667 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:16 crc kubenswrapper[4903]: E0320 08:24:16.598463 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:16 crc kubenswrapper[4903]: E0320 08:24:16.698771 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:16 crc kubenswrapper[4903]: E0320 08:24:16.799352 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:16 crc kubenswrapper[4903]: E0320 08:24:16.899926 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:17 crc kubenswrapper[4903]: E0320 08:24:17.001112 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:17 crc kubenswrapper[4903]: E0320 08:24:17.101726 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:17 crc kubenswrapper[4903]: E0320 08:24:17.202131 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:17 crc kubenswrapper[4903]: E0320 08:24:17.302290 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:17 crc kubenswrapper[4903]: E0320 08:24:17.403333 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:17 crc kubenswrapper[4903]: E0320 08:24:17.504150 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:17 crc kubenswrapper[4903]: E0320 08:24:17.604900 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:17 crc kubenswrapper[4903]: E0320 08:24:17.705133 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:17 crc kubenswrapper[4903]: E0320 08:24:17.805964 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:17 crc kubenswrapper[4903]: E0320 08:24:17.907203 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:18 crc kubenswrapper[4903]: E0320 08:24:18.008120 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:18 crc kubenswrapper[4903]: E0320 08:24:18.108301 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:18 crc kubenswrapper[4903]: E0320 08:24:18.209379 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:18 crc kubenswrapper[4903]: E0320 08:24:18.309925 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" 
not found" Mar 20 08:24:18 crc kubenswrapper[4903]: E0320 08:24:18.410095 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:18 crc kubenswrapper[4903]: E0320 08:24:18.510750 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:18 crc kubenswrapper[4903]: E0320 08:24:18.610971 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:18 crc kubenswrapper[4903]: E0320 08:24:18.711742 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:18 crc kubenswrapper[4903]: E0320 08:24:18.812396 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:18 crc kubenswrapper[4903]: E0320 08:24:18.913456 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:19 crc kubenswrapper[4903]: E0320 08:24:19.014405 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:19 crc kubenswrapper[4903]: E0320 08:24:19.114573 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:19 crc kubenswrapper[4903]: E0320 08:24:19.215030 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:19 crc kubenswrapper[4903]: E0320 08:24:19.315940 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:19 crc kubenswrapper[4903]: E0320 08:24:19.416866 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:19 crc kubenswrapper[4903]: E0320 08:24:19.518108 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:19 crc kubenswrapper[4903]: E0320 08:24:19.618427 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:19 crc kubenswrapper[4903]: E0320 08:24:19.719138 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:19 crc kubenswrapper[4903]: E0320 08:24:19.820309 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:19 crc kubenswrapper[4903]: E0320 08:24:19.921094 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:20 crc kubenswrapper[4903]: E0320 08:24:20.021937 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:20 crc kubenswrapper[4903]: E0320 08:24:20.122301 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:20 crc kubenswrapper[4903]: E0320 08:24:20.222444 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:20 crc kubenswrapper[4903]: E0320 08:24:20.323570 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:20 crc kubenswrapper[4903]: E0320 08:24:20.424582 4903 kubelet_node_status.go:503] "Error getting the current node from lister" 
err="node \"crc\" not found" Mar 20 08:24:20 crc kubenswrapper[4903]: E0320 08:24:20.525568 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:20 crc kubenswrapper[4903]: E0320 08:24:20.626552 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:20 crc kubenswrapper[4903]: E0320 08:24:20.727106 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:20 crc kubenswrapper[4903]: E0320 08:24:20.827855 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:20 crc kubenswrapper[4903]: E0320 08:24:20.928927 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:21 crc kubenswrapper[4903]: E0320 08:24:21.030141 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:21 crc kubenswrapper[4903]: E0320 08:24:21.131427 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:21 crc kubenswrapper[4903]: E0320 08:24:21.232675 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:21 crc kubenswrapper[4903]: E0320 08:24:21.333255 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:21 crc kubenswrapper[4903]: E0320 08:24:21.434091 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:21 crc kubenswrapper[4903]: E0320 08:24:21.534609 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:21 crc kubenswrapper[4903]: E0320 08:24:21.636063 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:21 crc kubenswrapper[4903]: E0320 08:24:21.737231 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:21 crc kubenswrapper[4903]: E0320 08:24:21.838542 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:21 crc kubenswrapper[4903]: E0320 08:24:21.938996 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:22 crc kubenswrapper[4903]: E0320 08:24:22.039959 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:22 crc kubenswrapper[4903]: E0320 08:24:22.140951 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:22 crc kubenswrapper[4903]: E0320 08:24:22.241449 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:22 crc kubenswrapper[4903]: E0320 08:24:22.342120 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:22 crc kubenswrapper[4903]: E0320 08:24:22.408789 4903 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Mar 20 08:24:22 crc kubenswrapper[4903]: I0320 08:24:22.417402 4903 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 08:24:22 crc kubenswrapper[4903]: I0320 08:24:22.417650 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 08:24:22 crc kubenswrapper[4903]: I0320 08:24:22.417895 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 08:24:22 crc kubenswrapper[4903]: I0320 08:24:22.418146 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 08:24:22 crc kubenswrapper[4903]: I0320 08:24:22.418663 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T08:24:22Z","lastTransitionTime":"2026-03-20T08:24:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 08:24:22 crc kubenswrapper[4903]: E0320 08:24:22.435276 4903 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T08:24:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T08:24:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T08:24:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T08:24:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2fafe47f-e5df-46e6-9c53-b5b631ab61f4\\\",\\\"systemUUID\\\":\\\"39716343-11aa-4130-bd5e-584ebc4907c0\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 08:24:22 crc kubenswrapper[4903]: I0320 08:24:22.441660 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 08:24:22 crc kubenswrapper[4903]: I0320 08:24:22.441891 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 08:24:22 crc kubenswrapper[4903]: I0320 08:24:22.442095 4903 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 08:24:22 crc kubenswrapper[4903]: I0320 08:24:22.442251 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 08:24:22 crc kubenswrapper[4903]: I0320 08:24:22.442399 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T08:24:22Z","lastTransitionTime":"2026-03-20T08:24:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 08:24:22 crc kubenswrapper[4903]: E0320 08:24:22.460133 4903 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T08:24:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T08:24:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T08:24:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T08:24:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2fafe47f-e5df-46e6-9c53-b5b631ab61f4\\\",\\\"systemUUID\\\":\\\"39716343-11aa-4130-bd5e-584ebc4907c0\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 08:24:22 crc kubenswrapper[4903]: I0320 08:24:22.466457 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 08:24:22 crc kubenswrapper[4903]: I0320 08:24:22.466543 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 08:24:22 crc kubenswrapper[4903]: I0320 08:24:22.466571 4903 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 08:24:22 crc kubenswrapper[4903]: I0320 08:24:22.466611 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 08:24:22 crc kubenswrapper[4903]: I0320 08:24:22.466636 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T08:24:22Z","lastTransitionTime":"2026-03-20T08:24:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 08:24:22 crc kubenswrapper[4903]: E0320 08:24:22.480402 4903 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T08:24:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T08:24:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T08:24:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T08:24:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2fafe47f-e5df-46e6-9c53-b5b631ab61f4\\\",\\\"systemUUID\\\":\\\"39716343-11aa-4130-bd5e-584ebc4907c0\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 08:24:22 crc kubenswrapper[4903]: I0320 08:24:22.486498 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 08:24:22 crc kubenswrapper[4903]: I0320 08:24:22.486586 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 08:24:22 crc kubenswrapper[4903]: I0320 08:24:22.486624 4903 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 08:24:22 crc kubenswrapper[4903]: I0320 08:24:22.486665 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 08:24:22 crc kubenswrapper[4903]: I0320 08:24:22.486694 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T08:24:22Z","lastTransitionTime":"2026-03-20T08:24:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 08:24:22 crc kubenswrapper[4903]: I0320 08:24:22.490292 4903 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 08:24:22 crc kubenswrapper[4903]: I0320 08:24:22.492436 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 08:24:22 crc kubenswrapper[4903]: I0320 08:24:22.492527 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 08:24:22 crc kubenswrapper[4903]: I0320 08:24:22.492582 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 08:24:22 crc kubenswrapper[4903]: I0320 08:24:22.493466 4903 scope.go:117] "RemoveContainer" containerID="18e2da8edfca6135b75a943a7fe0a7954decfbea090fffa730262edb294ae4fa" Mar 20 08:24:22 crc kubenswrapper[4903]: E0320 08:24:22.493728 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 08:24:22 crc kubenswrapper[4903]: E0320 08:24:22.505388 4903 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T08:24:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T08:24:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T08:24:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T08:24:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056
b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951
},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2fafe47f-e5df-46e6-9c53-b5b631ab61f4\\\",\\\"systemUUID\\\":\\\"39716343-11aa-4130-bd5e-584ebc4907c0\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" 
Mar 20 08:24:22 crc kubenswrapper[4903]: E0320 08:24:22.505641 4903 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 08:24:22 crc kubenswrapper[4903]: E0320 08:24:22.505682 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:22 crc kubenswrapper[4903]: E0320 08:24:22.606342 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:22 crc kubenswrapper[4903]: E0320 08:24:22.707180 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:22 crc kubenswrapper[4903]: E0320 08:24:22.807649 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:22 crc kubenswrapper[4903]: E0320 08:24:22.908545 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:23 crc kubenswrapper[4903]: E0320 08:24:23.009334 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:23 crc kubenswrapper[4903]: E0320 08:24:23.110384 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:23 crc kubenswrapper[4903]: E0320 08:24:23.210503 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:23 crc kubenswrapper[4903]: E0320 08:24:23.310873 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:23 crc kubenswrapper[4903]: E0320 08:24:23.412268 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:23 crc kubenswrapper[4903]: E0320 08:24:23.513591 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:23 crc kubenswrapper[4903]: E0320 08:24:23.613796 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:23 crc kubenswrapper[4903]: E0320 08:24:23.715680 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:23 crc kubenswrapper[4903]: E0320 08:24:23.816830 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:23 crc kubenswrapper[4903]: E0320 08:24:23.917770 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:24 crc kubenswrapper[4903]: E0320 08:24:24.018223 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:24 crc kubenswrapper[4903]: E0320 08:24:24.118674 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:24 crc kubenswrapper[4903]: E0320 08:24:24.219916 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:24 crc kubenswrapper[4903]: E0320 08:24:24.320461 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:24 crc kubenswrapper[4903]: E0320 08:24:24.421310 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node 
\"crc\" not found" Mar 20 08:24:24 crc kubenswrapper[4903]: E0320 08:24:24.521491 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:24 crc kubenswrapper[4903]: E0320 08:24:24.622174 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:24 crc kubenswrapper[4903]: E0320 08:24:24.723636 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:24 crc kubenswrapper[4903]: E0320 08:24:24.824624 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:24 crc kubenswrapper[4903]: E0320 08:24:24.925814 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:25 crc kubenswrapper[4903]: E0320 08:24:25.026779 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:25 crc kubenswrapper[4903]: E0320 08:24:25.127856 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:25 crc kubenswrapper[4903]: E0320 08:24:25.228194 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:25 crc kubenswrapper[4903]: E0320 08:24:25.328815 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:25 crc kubenswrapper[4903]: E0320 08:24:25.429890 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:25 crc kubenswrapper[4903]: E0320 08:24:25.530633 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:25 crc kubenswrapper[4903]: E0320 08:24:25.587889 4903 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 20 08:24:25 crc kubenswrapper[4903]: E0320 08:24:25.631617 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:25 crc kubenswrapper[4903]: E0320 08:24:25.732777 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:25 crc kubenswrapper[4903]: E0320 08:24:25.833611 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:25 crc kubenswrapper[4903]: E0320 08:24:25.934319 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:26 crc kubenswrapper[4903]: E0320 08:24:26.035607 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:26 crc kubenswrapper[4903]: E0320 08:24:26.135752 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:26 crc kubenswrapper[4903]: E0320 08:24:26.236376 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:26 crc kubenswrapper[4903]: E0320 08:24:26.337519 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:26 crc kubenswrapper[4903]: E0320 08:24:26.438317 4903 kubelet_node_status.go:503] "Error getting 
the current node from lister" err="node \"crc\" not found" Mar 20 08:24:26 crc kubenswrapper[4903]: E0320 08:24:26.538716 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:26 crc kubenswrapper[4903]: E0320 08:24:26.639518 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:26 crc kubenswrapper[4903]: E0320 08:24:26.740759 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:26 crc kubenswrapper[4903]: E0320 08:24:26.841414 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:26 crc kubenswrapper[4903]: I0320 08:24:26.855189 4903 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 20 08:24:26 crc kubenswrapper[4903]: E0320 08:24:26.941700 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:27 crc kubenswrapper[4903]: E0320 08:24:27.042639 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:27 crc kubenswrapper[4903]: E0320 08:24:27.143704 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:27 crc kubenswrapper[4903]: E0320 08:24:27.244495 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:27 crc kubenswrapper[4903]: E0320 08:24:27.345538 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:27 crc kubenswrapper[4903]: E0320 08:24:27.446959 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:27 crc kubenswrapper[4903]: E0320 08:24:27.548504 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:27 crc kubenswrapper[4903]: E0320 08:24:27.648911 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:27 crc kubenswrapper[4903]: E0320 08:24:27.749856 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:27 crc kubenswrapper[4903]: E0320 08:24:27.850167 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:27 crc kubenswrapper[4903]: E0320 08:24:27.950916 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:28 crc kubenswrapper[4903]: E0320 08:24:28.051340 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:28 crc kubenswrapper[4903]: E0320 08:24:28.152187 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:28 crc kubenswrapper[4903]: E0320 08:24:28.253169 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:28 crc kubenswrapper[4903]: E0320 08:24:28.353744 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:28 crc kubenswrapper[4903]: E0320 08:24:28.454653 4903 kubelet_node_status.go:503] 
"Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:28 crc kubenswrapper[4903]: E0320 08:24:28.555843 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:28 crc kubenswrapper[4903]: E0320 08:24:28.657632 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:28 crc kubenswrapper[4903]: E0320 08:24:28.758408 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:28 crc kubenswrapper[4903]: E0320 08:24:28.859065 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:28 crc kubenswrapper[4903]: E0320 08:24:28.959845 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:29 crc kubenswrapper[4903]: E0320 08:24:29.060948 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:29 crc kubenswrapper[4903]: E0320 08:24:29.161465 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:29 crc kubenswrapper[4903]: E0320 08:24:29.262430 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:29 crc kubenswrapper[4903]: E0320 08:24:29.362894 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:29 crc kubenswrapper[4903]: E0320 08:24:29.463693 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:29 crc kubenswrapper[4903]: E0320 08:24:29.564248 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:29 crc kubenswrapper[4903]: E0320 08:24:29.665199 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:29 crc kubenswrapper[4903]: E0320 08:24:29.766300 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:29 crc kubenswrapper[4903]: E0320 08:24:29.866738 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:29 crc kubenswrapper[4903]: E0320 08:24:29.967153 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:30 crc kubenswrapper[4903]: E0320 08:24:30.067846 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:30 crc kubenswrapper[4903]: E0320 08:24:30.168674 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:30 crc kubenswrapper[4903]: E0320 08:24:30.269319 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:30 crc kubenswrapper[4903]: E0320 08:24:30.370407 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:30 crc kubenswrapper[4903]: E0320 08:24:30.471058 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:30 crc kubenswrapper[4903]: E0320 08:24:30.572217 4903 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:30 crc kubenswrapper[4903]: E0320 08:24:30.673333 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:30 crc kubenswrapper[4903]: E0320 08:24:30.774480 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:30 crc kubenswrapper[4903]: E0320 08:24:30.874733 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:30 crc kubenswrapper[4903]: E0320 08:24:30.975631 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:31 crc kubenswrapper[4903]: E0320 08:24:31.076210 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:31 crc kubenswrapper[4903]: E0320 08:24:31.176576 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:31 crc kubenswrapper[4903]: E0320 08:24:31.277331 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:31 crc kubenswrapper[4903]: E0320 08:24:31.377472 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:31 crc kubenswrapper[4903]: E0320 08:24:31.478601 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:31 crc kubenswrapper[4903]: E0320 08:24:31.578780 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:31 crc kubenswrapper[4903]: E0320 08:24:31.679279 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:31 crc kubenswrapper[4903]: E0320 08:24:31.780220 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:31 crc kubenswrapper[4903]: E0320 08:24:31.880376 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:31 crc kubenswrapper[4903]: E0320 08:24:31.980877 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:32 crc kubenswrapper[4903]: E0320 08:24:32.081973 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:32 crc kubenswrapper[4903]: E0320 08:24:32.182858 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:32 crc kubenswrapper[4903]: E0320 08:24:32.283104 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:32 crc kubenswrapper[4903]: E0320 08:24:32.383987 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:32 crc kubenswrapper[4903]: E0320 08:24:32.484590 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:32 crc kubenswrapper[4903]: E0320 08:24:32.585363 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:32 crc kubenswrapper[4903]: E0320 
08:24:32.686252 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:32 crc kubenswrapper[4903]: E0320 08:24:32.731093 4903 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Mar 20 08:24:32 crc kubenswrapper[4903]: I0320 08:24:32.736885 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 08:24:32 crc kubenswrapper[4903]: I0320 08:24:32.737135 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 08:24:32 crc kubenswrapper[4903]: I0320 08:24:32.737309 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 08:24:32 crc kubenswrapper[4903]: I0320 08:24:32.737491 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 08:24:32 crc kubenswrapper[4903]: I0320 08:24:32.737653 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T08:24:32Z","lastTransitionTime":"2026-03-20T08:24:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 08:24:32 crc kubenswrapper[4903]: E0320 08:24:32.754387 4903 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T08:24:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T08:24:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T08:24:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T08:24:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2fafe47f-e5df-46e6-9c53-b5b631ab61f4\\\",\\\"systemUUID\\\":\\\"39716343-11aa-4130-bd5e-584ebc4907c0\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 08:24:32 crc kubenswrapper[4903]: I0320 08:24:32.759390 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 08:24:32 crc kubenswrapper[4903]: I0320 08:24:32.759453 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 08:24:32 crc kubenswrapper[4903]: I0320 08:24:32.759480 4903 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 08:24:32 crc kubenswrapper[4903]: I0320 08:24:32.759511 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 08:24:32 crc kubenswrapper[4903]: I0320 08:24:32.759534 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T08:24:32Z","lastTransitionTime":"2026-03-20T08:24:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 08:24:32 crc kubenswrapper[4903]: E0320 08:24:32.776300 4903 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T08:24:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T08:24:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T08:24:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T08:24:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2fafe47f-e5df-46e6-9c53-b5b631ab61f4\\\",\\\"systemUUID\\\":\\\"39716343-11aa-4130-bd5e-584ebc4907c0\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 08:24:32 crc kubenswrapper[4903]: I0320 08:24:32.781760 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 08:24:32 crc kubenswrapper[4903]: I0320 08:24:32.781811 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 08:24:32 crc kubenswrapper[4903]: I0320 08:24:32.781833 4903 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 08:24:32 crc kubenswrapper[4903]: I0320 08:24:32.781861 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 08:24:32 crc kubenswrapper[4903]: I0320 08:24:32.781884 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T08:24:32Z","lastTransitionTime":"2026-03-20T08:24:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 08:24:32 crc kubenswrapper[4903]: E0320 08:24:32.795852 4903 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T08:24:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T08:24:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T08:24:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T08:24:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2fafe47f-e5df-46e6-9c53-b5b631ab61f4\\\",\\\"systemUUID\\\":\\\"39716343-11aa-4130-bd5e-584ebc4907c0\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 08:24:32 crc kubenswrapper[4903]: I0320 08:24:32.800597 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 08:24:32 crc kubenswrapper[4903]: I0320 08:24:32.800679 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 08:24:32 crc kubenswrapper[4903]: I0320 08:24:32.800704 4903 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 08:24:32 crc kubenswrapper[4903]: I0320 08:24:32.800734 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 08:24:32 crc kubenswrapper[4903]: I0320 08:24:32.800755 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T08:24:32Z","lastTransitionTime":"2026-03-20T08:24:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 08:24:32 crc kubenswrapper[4903]: E0320 08:24:32.816256 4903 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T08:24:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T08:24:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T08:24:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T08:24:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2fafe47f-e5df-46e6-9c53-b5b631ab61f4\\\",\\\"systemUUID\\\":\\\"39716343-11aa-4130-bd5e-584ebc4907c0\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 08:24:32 crc kubenswrapper[4903]: E0320 08:24:32.816515 4903 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 08:24:32 crc kubenswrapper[4903]: E0320 08:24:32.816565 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:32 crc kubenswrapper[4903]: E0320 08:24:32.917020 4903 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:33 crc kubenswrapper[4903]: E0320 08:24:33.018023 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:33 crc kubenswrapper[4903]: E0320 08:24:33.118204 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:33 crc kubenswrapper[4903]: E0320 08:24:33.219382 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:33 crc kubenswrapper[4903]: E0320 08:24:33.319673 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:33 crc kubenswrapper[4903]: E0320 08:24:33.420320 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:33 crc kubenswrapper[4903]: E0320 08:24:33.520870 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:33 crc kubenswrapper[4903]: E0320 08:24:33.621275 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:33 crc kubenswrapper[4903]: E0320 08:24:33.722442 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:33 crc kubenswrapper[4903]: E0320 08:24:33.822936 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:33 crc kubenswrapper[4903]: E0320 08:24:33.925152 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:34 crc kubenswrapper[4903]: E0320 08:24:34.026174 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:34 crc kubenswrapper[4903]: E0320 08:24:34.126896 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:34 crc kubenswrapper[4903]: E0320 08:24:34.228218 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:34 crc kubenswrapper[4903]: E0320 08:24:34.328792 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:34 crc kubenswrapper[4903]: E0320 08:24:34.430020 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:34 crc kubenswrapper[4903]: I0320 08:24:34.490504 4903 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 08:24:34 crc kubenswrapper[4903]: I0320 08:24:34.491857 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 08:24:34 crc kubenswrapper[4903]: I0320 08:24:34.491890 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 08:24:34 crc kubenswrapper[4903]: I0320 08:24:34.491904 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 08:24:34 crc kubenswrapper[4903]: I0320 08:24:34.492575 4903 scope.go:117] "RemoveContainer" containerID="18e2da8edfca6135b75a943a7fe0a7954decfbea090fffa730262edb294ae4fa" Mar 20 08:24:34 crc kubenswrapper[4903]: 
E0320 08:24:34.492772 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 08:24:34 crc kubenswrapper[4903]: E0320 08:24:34.530948 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:34 crc kubenswrapper[4903]: E0320 08:24:34.632010 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:34 crc kubenswrapper[4903]: E0320 08:24:34.733086 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:34 crc kubenswrapper[4903]: E0320 08:24:34.833537 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:34 crc kubenswrapper[4903]: E0320 08:24:34.933708 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:35 crc kubenswrapper[4903]: E0320 08:24:35.034576 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:35 crc kubenswrapper[4903]: E0320 08:24:35.135146 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:35 crc kubenswrapper[4903]: E0320 08:24:35.235416 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:35 crc kubenswrapper[4903]: E0320 08:24:35.336189 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:35 crc kubenswrapper[4903]: E0320 08:24:35.437278 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:35 crc kubenswrapper[4903]: E0320 08:24:35.538426 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:35 crc kubenswrapper[4903]: E0320 08:24:35.588051 4903 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 20 08:24:35 crc kubenswrapper[4903]: E0320 08:24:35.639221 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:35 crc kubenswrapper[4903]: E0320 08:24:35.740132 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:35 crc kubenswrapper[4903]: E0320 08:24:35.840377 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:35 crc kubenswrapper[4903]: E0320 08:24:35.941141 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:36 crc kubenswrapper[4903]: E0320 08:24:36.042130 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:36 crc kubenswrapper[4903]: E0320 08:24:36.142714 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:36 crc kubenswrapper[4903]: 
E0320 08:24:36.243857 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:36 crc kubenswrapper[4903]: I0320 08:24:36.326491 4903 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 20 08:24:36 crc kubenswrapper[4903]: E0320 08:24:36.345101 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:36 crc kubenswrapper[4903]: E0320 08:24:36.445381 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:36 crc kubenswrapper[4903]: E0320 08:24:36.546465 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:36 crc kubenswrapper[4903]: E0320 08:24:36.647764 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:36 crc kubenswrapper[4903]: E0320 08:24:36.748426 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:36 crc kubenswrapper[4903]: E0320 08:24:36.848675 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:36 crc kubenswrapper[4903]: E0320 08:24:36.949057 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:37 crc kubenswrapper[4903]: E0320 08:24:37.050203 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:37 crc kubenswrapper[4903]: E0320 08:24:37.151400 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:37 crc kubenswrapper[4903]: E0320 08:24:37.252527 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:37 crc kubenswrapper[4903]: E0320 08:24:37.353221 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:37 crc kubenswrapper[4903]: E0320 08:24:37.454001 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:37 crc kubenswrapper[4903]: E0320 08:24:37.554644 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:37 crc kubenswrapper[4903]: E0320 08:24:37.655413 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:37 crc kubenswrapper[4903]: E0320 08:24:37.756027 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:37 crc kubenswrapper[4903]: E0320 08:24:37.856829 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:37 crc kubenswrapper[4903]: E0320 08:24:37.957191 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:38 crc kubenswrapper[4903]: E0320 08:24:38.058208 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:38 crc kubenswrapper[4903]: E0320 08:24:38.159417 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:38 crc 
kubenswrapper[4903]: E0320 08:24:38.260583 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:38 crc kubenswrapper[4903]: E0320 08:24:38.361130 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:38 crc kubenswrapper[4903]: E0320 08:24:38.462160 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:38 crc kubenswrapper[4903]: E0320 08:24:38.562830 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:38 crc kubenswrapper[4903]: E0320 08:24:38.663896 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:38 crc kubenswrapper[4903]: E0320 08:24:38.764947 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:38 crc kubenswrapper[4903]: E0320 08:24:38.866102 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:38 crc kubenswrapper[4903]: E0320 08:24:38.966863 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:39 crc kubenswrapper[4903]: E0320 08:24:39.067531 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:39 crc kubenswrapper[4903]: E0320 08:24:39.168343 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:39 crc kubenswrapper[4903]: E0320 08:24:39.268808 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:39 crc kubenswrapper[4903]: E0320 08:24:39.369431 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:39 crc kubenswrapper[4903]: E0320 08:24:39.470348 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:39 crc kubenswrapper[4903]: I0320 08:24:39.490854 4903 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 08:24:39 crc kubenswrapper[4903]: I0320 08:24:39.492306 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 08:24:39 crc kubenswrapper[4903]: I0320 08:24:39.492365 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 08:24:39 crc kubenswrapper[4903]: I0320 08:24:39.492382 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 08:24:39 crc kubenswrapper[4903]: E0320 08:24:39.571228 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:39 crc kubenswrapper[4903]: E0320 08:24:39.671789 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:39 crc kubenswrapper[4903]: E0320 08:24:39.772583 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:39 crc kubenswrapper[4903]: E0320 08:24:39.873592 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 
20 08:24:39 crc kubenswrapper[4903]: E0320 08:24:39.974982 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:40 crc kubenswrapper[4903]: E0320 08:24:40.075542 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:40 crc kubenswrapper[4903]: E0320 08:24:40.176728 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:40 crc kubenswrapper[4903]: E0320 08:24:40.276949 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:40 crc kubenswrapper[4903]: E0320 08:24:40.377912 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:40 crc kubenswrapper[4903]: E0320 08:24:40.479192 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:40 crc kubenswrapper[4903]: E0320 08:24:40.579428 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:40 crc kubenswrapper[4903]: E0320 08:24:40.680419 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:40 crc kubenswrapper[4903]: E0320 08:24:40.781348 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:40 crc kubenswrapper[4903]: E0320 08:24:40.882590 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:40 crc kubenswrapper[4903]: E0320 08:24:40.983683 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:41 crc kubenswrapper[4903]: E0320 08:24:41.084546 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:41 crc kubenswrapper[4903]: E0320 08:24:41.185549 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:41 crc kubenswrapper[4903]: E0320 08:24:41.285837 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:41 crc kubenswrapper[4903]: E0320 08:24:41.386332 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:41 crc kubenswrapper[4903]: E0320 08:24:41.486695 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:41 crc kubenswrapper[4903]: E0320 08:24:41.587504 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:41 crc kubenswrapper[4903]: E0320 08:24:41.688420 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:41 crc kubenswrapper[4903]: E0320 08:24:41.789176 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:41 crc kubenswrapper[4903]: E0320 08:24:41.890169 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:41 crc kubenswrapper[4903]: E0320 08:24:41.991200 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" 
not found" Mar 20 08:24:42 crc kubenswrapper[4903]: E0320 08:24:42.092352 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:42 crc kubenswrapper[4903]: E0320 08:24:42.193123 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:42 crc kubenswrapper[4903]: E0320 08:24:42.294376 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:42 crc kubenswrapper[4903]: E0320 08:24:42.395352 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:42 crc kubenswrapper[4903]: E0320 08:24:42.495801 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:42 crc kubenswrapper[4903]: E0320 08:24:42.596536 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:42 crc kubenswrapper[4903]: E0320 08:24:42.697631 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:42 crc kubenswrapper[4903]: E0320 08:24:42.797867 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:42 crc kubenswrapper[4903]: E0320 08:24:42.898911 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:42 crc kubenswrapper[4903]: E0320 08:24:42.999838 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:43 crc kubenswrapper[4903]: E0320 08:24:43.098830 4903 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Mar 20 08:24:43 crc kubenswrapper[4903]: I0320 08:24:43.104163 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 08:24:43 crc kubenswrapper[4903]: I0320 08:24:43.104202 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 08:24:43 crc kubenswrapper[4903]: I0320 08:24:43.104212 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 08:24:43 crc kubenswrapper[4903]: I0320 08:24:43.104228 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 08:24:43 crc kubenswrapper[4903]: I0320 08:24:43.104239 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T08:24:43Z","lastTransitionTime":"2026-03-20T08:24:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 08:24:43 crc kubenswrapper[4903]: E0320 08:24:43.112422 4903 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T08:24:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T08:24:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T08:24:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T08:24:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2fafe47f-e5df-46e6-9c53-b5b631ab61f4\\\",\\\"systemUUID\\\":\\\"39716343-11aa-4130-bd5e-584ebc4907c0\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 08:24:43 crc kubenswrapper[4903]: I0320 08:24:43.116822 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 08:24:43 crc kubenswrapper[4903]: I0320 08:24:43.116879 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 08:24:43 crc kubenswrapper[4903]: I0320 08:24:43.116893 4903 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 08:24:43 crc kubenswrapper[4903]: I0320 08:24:43.116922 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 08:24:43 crc kubenswrapper[4903]: I0320 08:24:43.116937 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T08:24:43Z","lastTransitionTime":"2026-03-20T08:24:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 08:24:43 crc kubenswrapper[4903]: E0320 08:24:43.128847 4903 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T08:24:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T08:24:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T08:24:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T08:24:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2fafe47f-e5df-46e6-9c53-b5b631ab61f4\\\",\\\"systemUUID\\\":\\\"39716343-11aa-4130-bd5e-584ebc4907c0\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 08:24:43 crc kubenswrapper[4903]: I0320 08:24:43.133280 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 08:24:43 crc kubenswrapper[4903]: I0320 08:24:43.133343 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 08:24:43 crc kubenswrapper[4903]: I0320 08:24:43.133369 4903 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 08:24:43 crc kubenswrapper[4903]: I0320 08:24:43.133404 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 08:24:43 crc kubenswrapper[4903]: I0320 08:24:43.133455 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T08:24:43Z","lastTransitionTime":"2026-03-20T08:24:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 08:24:43 crc kubenswrapper[4903]: E0320 08:24:43.149889 4903 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T08:24:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T08:24:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T08:24:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T08:24:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2fafe47f-e5df-46e6-9c53-b5b631ab61f4\\\",\\\"systemUUID\\\":\\\"39716343-11aa-4130-bd5e-584ebc4907c0\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 08:24:43 crc kubenswrapper[4903]: I0320 08:24:43.155612 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 08:24:43 crc kubenswrapper[4903]: I0320 08:24:43.155693 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 08:24:43 crc kubenswrapper[4903]: I0320 08:24:43.155719 4903 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 08:24:43 crc kubenswrapper[4903]: I0320 08:24:43.155754 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 08:24:43 crc kubenswrapper[4903]: I0320 08:24:43.155783 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T08:24:43Z","lastTransitionTime":"2026-03-20T08:24:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 08:24:43 crc kubenswrapper[4903]: E0320 08:24:43.173583 4903 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T08:24:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T08:24:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T08:24:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T08:24:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2fafe47f-e5df-46e6-9c53-b5b631ab61f4\\\",\\\"systemUUID\\\":\\\"39716343-11aa-4130-bd5e-584ebc4907c0\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 08:24:43 crc kubenswrapper[4903]: E0320 08:24:43.173870 4903 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 08:24:43 crc kubenswrapper[4903]: E0320 08:24:43.173924 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:43 crc kubenswrapper[4903]: E0320 08:24:43.274767 4903 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:43 crc kubenswrapper[4903]: E0320 08:24:43.375771 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:43 crc kubenswrapper[4903]: E0320 08:24:43.476359 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:43 crc kubenswrapper[4903]: E0320 08:24:43.576777 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:43 crc kubenswrapper[4903]: E0320 08:24:43.677522 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:43 crc kubenswrapper[4903]: E0320 08:24:43.778705 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:43 crc kubenswrapper[4903]: E0320 08:24:43.879487 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:43 crc kubenswrapper[4903]: E0320 08:24:43.979846 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:44 crc kubenswrapper[4903]: E0320 08:24:44.080211 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:44 crc kubenswrapper[4903]: E0320 08:24:44.181334 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:44 crc kubenswrapper[4903]: E0320 08:24:44.282153 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:44 crc kubenswrapper[4903]: E0320 08:24:44.383254 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:44 crc kubenswrapper[4903]: E0320 08:24:44.484131 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:44 crc kubenswrapper[4903]: E0320 08:24:44.585416 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:44 crc kubenswrapper[4903]: E0320 08:24:44.686611 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:44 crc kubenswrapper[4903]: E0320 08:24:44.787915 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:44 crc kubenswrapper[4903]: E0320 08:24:44.888921 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:44 crc kubenswrapper[4903]: E0320 08:24:44.989777 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:45 crc kubenswrapper[4903]: E0320 08:24:45.090798 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:45 crc kubenswrapper[4903]: E0320 08:24:45.191632 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:45 crc kubenswrapper[4903]: E0320 08:24:45.291802 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:45 crc kubenswrapper[4903]: E0320 
08:24:45.392513 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:45 crc kubenswrapper[4903]: E0320 08:24:45.493028 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:45 crc kubenswrapper[4903]: E0320 08:24:45.588774 4903 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 20 08:24:45 crc kubenswrapper[4903]: E0320 08:24:45.593125 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:45 crc kubenswrapper[4903]: E0320 08:24:45.693591 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:45 crc kubenswrapper[4903]: E0320 08:24:45.794271 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:45 crc kubenswrapper[4903]: E0320 08:24:45.895681 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:45 crc kubenswrapper[4903]: E0320 08:24:45.997128 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:46 crc kubenswrapper[4903]: E0320 08:24:46.097982 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:46 crc kubenswrapper[4903]: E0320 08:24:46.198842 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:46 crc kubenswrapper[4903]: E0320 08:24:46.300010 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:46 crc kubenswrapper[4903]: E0320 08:24:46.401160 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:46 crc kubenswrapper[4903]: I0320 08:24:46.490776 4903 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 08:24:46 crc kubenswrapper[4903]: I0320 08:24:46.492545 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 08:24:46 crc kubenswrapper[4903]: I0320 08:24:46.492589 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 08:24:46 crc kubenswrapper[4903]: I0320 08:24:46.492598 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 08:24:46 crc kubenswrapper[4903]: I0320 08:24:46.493330 4903 scope.go:117] "RemoveContainer" containerID="18e2da8edfca6135b75a943a7fe0a7954decfbea090fffa730262edb294ae4fa" Mar 20 08:24:46 crc kubenswrapper[4903]: E0320 08:24:46.502116 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:46 crc kubenswrapper[4903]: E0320 08:24:46.603007 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:46 crc kubenswrapper[4903]: E0320 08:24:46.703977 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:46 crc kubenswrapper[4903]: E0320 08:24:46.804975 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not 
found" Mar 20 08:24:46 crc kubenswrapper[4903]: E0320 08:24:46.906093 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:46 crc kubenswrapper[4903]: I0320 08:24:46.990267 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 20 08:24:46 crc kubenswrapper[4903]: I0320 08:24:46.992654 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"f756aa5d647538ea78b646d483b3c6e7943baac8d492132d04743f5f99f4ddf4"} Mar 20 08:24:46 crc kubenswrapper[4903]: I0320 08:24:46.992785 4903 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 08:24:46 crc kubenswrapper[4903]: I0320 08:24:46.993780 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 08:24:46 crc kubenswrapper[4903]: I0320 08:24:46.993833 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 08:24:46 crc kubenswrapper[4903]: I0320 08:24:46.993859 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 08:24:47 crc kubenswrapper[4903]: E0320 08:24:47.006626 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:47 crc kubenswrapper[4903]: E0320 08:24:47.107284 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:47 crc kubenswrapper[4903]: E0320 08:24:47.207905 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:47 crc kubenswrapper[4903]: E0320 08:24:47.308652 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:47 crc kubenswrapper[4903]: E0320 08:24:47.409603 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:47 crc kubenswrapper[4903]: E0320 08:24:47.510770 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:47 crc kubenswrapper[4903]: E0320 08:24:47.611372 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:47 crc kubenswrapper[4903]: E0320 08:24:47.712153 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:47 crc kubenswrapper[4903]: E0320 08:24:47.813091 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:47 crc kubenswrapper[4903]: E0320 08:24:47.913943 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:47 crc kubenswrapper[4903]: I0320 08:24:47.997232 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/4.log" Mar 20 08:24:47 crc kubenswrapper[4903]: I0320 08:24:47.997574 4903 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 20 08:24:47 crc kubenswrapper[4903]: I0320 08:24:47.998947 4903 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="f756aa5d647538ea78b646d483b3c6e7943baac8d492132d04743f5f99f4ddf4" exitCode=255 Mar 20 08:24:47 crc kubenswrapper[4903]: I0320 08:24:47.998979 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"f756aa5d647538ea78b646d483b3c6e7943baac8d492132d04743f5f99f4ddf4"} Mar 20 08:24:47 crc kubenswrapper[4903]: I0320 08:24:47.999013 4903 scope.go:117] "RemoveContainer" containerID="18e2da8edfca6135b75a943a7fe0a7954decfbea090fffa730262edb294ae4fa" Mar 20 08:24:47 crc kubenswrapper[4903]: I0320 08:24:47.999229 4903 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 08:24:48 crc kubenswrapper[4903]: I0320 08:24:48.000543 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 08:24:48 crc kubenswrapper[4903]: I0320 08:24:48.000574 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 08:24:48 crc kubenswrapper[4903]: I0320 08:24:48.000584 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 08:24:48 crc kubenswrapper[4903]: I0320 08:24:48.001187 4903 scope.go:117] "RemoveContainer" containerID="f756aa5d647538ea78b646d483b3c6e7943baac8d492132d04743f5f99f4ddf4" Mar 20 08:24:48 crc kubenswrapper[4903]: E0320 08:24:48.001343 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 08:24:48 crc kubenswrapper[4903]: I0320 08:24:48.010252 4903 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 08:24:48 crc kubenswrapper[4903]: E0320 08:24:48.014398 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:48 crc kubenswrapper[4903]: E0320 08:24:48.114909 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:48 crc kubenswrapper[4903]: E0320 08:24:48.215583 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:48 crc kubenswrapper[4903]: E0320 08:24:48.316464 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:48 crc kubenswrapper[4903]: E0320 08:24:48.417517 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:48 crc kubenswrapper[4903]: E0320 08:24:48.518424 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:48 crc kubenswrapper[4903]: E0320 08:24:48.619117 4903 kubelet_node_status.go:503] "Error getting the current node from 
lister" err="node \"crc\" not found" Mar 20 08:24:48 crc kubenswrapper[4903]: E0320 08:24:48.719409 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:48 crc kubenswrapper[4903]: E0320 08:24:48.820127 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:48 crc kubenswrapper[4903]: E0320 08:24:48.920542 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:49 crc kubenswrapper[4903]: I0320 08:24:49.003590 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/4.log" Mar 20 08:24:49 crc kubenswrapper[4903]: I0320 08:24:49.005526 4903 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 08:24:49 crc kubenswrapper[4903]: I0320 08:24:49.006281 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 08:24:49 crc kubenswrapper[4903]: I0320 08:24:49.006308 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 08:24:49 crc kubenswrapper[4903]: I0320 08:24:49.006317 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 08:24:49 crc kubenswrapper[4903]: I0320 08:24:49.006723 4903 scope.go:117] "RemoveContainer" containerID="f756aa5d647538ea78b646d483b3c6e7943baac8d492132d04743f5f99f4ddf4" Mar 20 08:24:49 crc kubenswrapper[4903]: E0320 08:24:49.006853 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 08:24:49 crc kubenswrapper[4903]: E0320 08:24:49.021737 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:49 crc kubenswrapper[4903]: E0320 08:24:49.122512 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:49 crc kubenswrapper[4903]: E0320 08:24:49.222891 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:49 crc kubenswrapper[4903]: I0320 08:24:49.267411 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 08:24:49 crc kubenswrapper[4903]: E0320 08:24:49.323792 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:49 crc kubenswrapper[4903]: E0320 08:24:49.424303 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:49 crc kubenswrapper[4903]: E0320 08:24:49.524426 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:49 crc kubenswrapper[4903]: E0320 08:24:49.625212 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:49 crc kubenswrapper[4903]: E0320 08:24:49.726112 
4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:49 crc kubenswrapper[4903]: E0320 08:24:49.826269 4903 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:24:49 crc kubenswrapper[4903]: I0320 08:24:49.858927 4903 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 20 08:24:49 crc kubenswrapper[4903]: I0320 08:24:49.929499 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 08:24:49 crc kubenswrapper[4903]: I0320 08:24:49.929545 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 08:24:49 crc kubenswrapper[4903]: I0320 08:24:49.929554 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 08:24:49 crc kubenswrapper[4903]: I0320 08:24:49.929571 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 08:24:49 crc kubenswrapper[4903]: I0320 08:24:49.929582 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T08:24:49Z","lastTransitionTime":"2026-03-20T08:24:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.029716 4903 scope.go:117] "RemoveContainer" containerID="f756aa5d647538ea78b646d483b3c6e7943baac8d492132d04743f5f99f4ddf4" Mar 20 08:24:50 crc kubenswrapper[4903]: E0320 08:24:50.030439 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.033882 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.033916 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.033934 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.033955 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.033972 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T08:24:50Z","lastTransitionTime":"2026-03-20T08:24:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.136557 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.136603 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.136616 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.136634 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.136648 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T08:24:50Z","lastTransitionTime":"2026-03-20T08:24:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.239760 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.239822 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.239841 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.239865 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.239882 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T08:24:50Z","lastTransitionTime":"2026-03-20T08:24:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.343836 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.343909 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.343934 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.343965 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.343991 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T08:24:50Z","lastTransitionTime":"2026-03-20T08:24:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.447272 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.447674 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.447900 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.448108 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.448287 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T08:24:50Z","lastTransitionTime":"2026-03-20T08:24:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.448712 4903 apiserver.go:52] "Watching apiserver" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.452973 4903 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.453476 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-55qrw","openshift-machine-config-operator/machine-config-daemon-2ndsj","openshift-multus/multus-nzq6s","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-dns/node-resolver-mxv95","openshift-kube-apiserver/kube-apiserver-crc","openshift-multus/multus-additional-cni-plugins-qp7cv","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-operator/iptables-alerter-4ln5h","openshift-ovn-kubernetes/ovnkube-node-m6k77"] Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.453898 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.454148 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.454007 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 08:24:50 crc kubenswrapper[4903]: E0320 08:24:50.454643 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.454422 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.453926 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.454630 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.454765 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-55qrw" Mar 20 08:24:50 crc kubenswrapper[4903]: E0320 08:24:50.454831 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 08:24:50 crc kubenswrapper[4903]: E0320 08:24:50.454886 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.455401 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-mxv95" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.455497 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-qp7cv" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.455656 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-nzq6s" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.455776 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-2ndsj" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.456654 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-m6k77" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.459497 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.460291 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.460800 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.461017 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.461176 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.461359 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.461422 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.462409 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.462889 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.464352 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.464605 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.464427 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.464472 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.464496 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.465138 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.465637 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.465640 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.465750 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.465768 4903 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.467578 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.468115 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.468500 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.468922 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.469126 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.469228 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.469383 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.468935 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.469532 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.469618 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.469620 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.469720 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.469820 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.470024 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.470201 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.470346 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.490012 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.506126 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-55qrw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a0b2ecd-79af-4d88-ac59-3a08385882a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:50Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wx8vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T08:24:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-55qrw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.506417 4903 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.527127 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qp7cv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"867ac3a2-4567-43d3-80af-23021ced20b6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:50Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2jpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2jpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2jpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-x2jpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2jpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2jpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2jpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T08:24:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qp7cv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.547811 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m6k77" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"157214e8-fbfe-4e9d-98f4-02680437b8b2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:50Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:50Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:50Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9z897\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9z897\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\
\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9z897\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9z897\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9z897\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9z897\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9z897\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9z897\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9z897\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T08:24:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-m6k77\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.554347 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.554386 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.554396 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.554412 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.554422 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T08:24:50Z","lastTransitionTime":"2026-03-20T08:24:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.563229 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e53791c9-7f9f-4ce5-8c13-29786721b9e7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:22:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:22:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:22:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:22:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:22:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ef0faf48a64d1f9ab296076561f444dac491f6a937100dc745062799ac14533\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:22:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://84dc5fbce1c40b3a5ff4df4082324127ad8c9fb05387581a62eb218551dfdcda\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:22:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95ed5ac9613b849264d6577a5d37580c9b674adfe07c5d93b5a34251dab97a97\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:22:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f756aa5d647538ea78b646d483b3c6e7943baac8d492132d04743f5f99f4ddf4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f756aa5d647538ea78b646d483b3c6e7943baac8d492132d04743f5f99f4ddf4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T08:24:47Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 08:24:47.109709 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 08:24:47.109817 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 08:24:47.110391 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2254287008/tls.crt::/tmp/serving-cert-2254287008/tls.key\\\\\\\"\\\\nI0320 08:24:47.892679 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 08:24:47.894329 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 08:24:47.894348 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 08:24:47.894369 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 08:24:47.894375 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 08:24:47.900498 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 08:24:47.900532 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 08:24:47.900537 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 08:24:47.900544 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 08:24:47.900554 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 08:24:47.900558 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 08:24:47.900562 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0320 08:24:47.900763 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0320 08:24:47.902854 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T08:24:46Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff47d5e9db1398c42b35dce1fbcca05073c8e28b5c7187174de7f355065ec374\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:22:58Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17769cf4064c962bbfd92f2b8e377ba2acb97a93410e58e3e9c07f6aabd1ac41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17769cf4064c962bbfd92f2b8e377ba2acb97a93410e58e3e9c07f6aabd1ac41\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T08:22:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T08:22:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T08:22:55Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.568930 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.568975 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.569002 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 
08:24:50.569026 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.569080 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.569106 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.569128 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.569150 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.569175 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.569198 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.569218 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.569240 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.569260 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 20 08:24:50 crc kubenswrapper[4903]: 
I0320 08:24:50.569279 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.569297 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.569317 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.569339 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.569359 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.569378 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.569402 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.569430 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.569457 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.569477 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 20 08:24:50 crc kubenswrapper[4903]: 
I0320 08:24:50.569480 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.569498 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.569520 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.569510 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.569597 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.569545 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.569721 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.569757 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.569789 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.569801 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.569819 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.569881 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.569914 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.569942 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.569958 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.569973 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.569988 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.570003 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.570072 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.570108 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.570149 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.570185 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.570195 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.570214 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.570246 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.570278 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.570306 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.570336 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.570368 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.570396 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.570423 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.570426 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.570467 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.570552 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.570590 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.570565 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.570614 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.570639 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.570668 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.570695 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.570715 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.570739 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.570762 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.570787 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.570793 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.570812 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.570841 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.570876 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.570900 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.570924 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.570951 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.570976 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.571018 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.571073 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.571101 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" 
(UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.571122 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.571161 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.571187 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.571223 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.571244 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.571268 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.571294 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.571322 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.571351 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.571375 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod 
\"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.571399 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.571421 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.571445 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.571468 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.571492 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.571515 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.571541 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.571564 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.571586 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.571607 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod 
\"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.571627 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.571654 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.571674 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.572284 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.572309 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.572333 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.572359 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.572384 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.572409 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.572433 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: 
\"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.572457 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.572507 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.572531 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.572559 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.572584 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.572602 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.572619 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.572677 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.572695 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.572717 4903 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.572783 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.572804 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.572819 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.572849 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.572866 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.572882 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.572898 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.572914 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.572932 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.572948 4903 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.572972 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.572988 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.573007 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.573787 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.573833 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.573909 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.573943 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.573984 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.574018 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 20 08:24:50 crc kubenswrapper[4903]: 
I0320 08:24:50.574083 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.574116 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.574154 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.574197 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.574240 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.574275 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.574306 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.574335 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.574368 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.574401 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: 
\"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.574434 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.574467 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.574502 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.574548 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.574648 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.574727 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.574772 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.574806 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.575021 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.575100 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.575139 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.575173 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.575263 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.575302 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.570919 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.575382 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.571243 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.571319 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.571418 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.571441 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.571559 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.571952 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.571963 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.572000 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.573229 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.573300 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.573424 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.573453 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.573473 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.573733 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.574099 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.574231 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.574674 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.574723 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.575185 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.575258 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.575270 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.575330 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.575355 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.575396 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.575693 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.575699 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.575726 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.575753 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.575778 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.575801 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.575819 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.575824 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.575843 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.575870 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.575898 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.576211 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.576216 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.576204 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.576321 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.576350 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.576377 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.576381 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.576448 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.576477 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.576500 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.576529 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.576553 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.576579 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.576603 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.576626 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.576648 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.576673 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: 
\"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.576683 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.576697 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.576837 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.576874 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.576876 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.576897 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.576941 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.576953 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.577132 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.578416 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.577510 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.577781 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.578540 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.578602 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.577816 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.577951 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.578747 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.578762 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.578018 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.578117 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.578258 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.578499 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.578924 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.578879 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:50Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.579167 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.578287 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.579512 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.579574 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.579923 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.580128 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.580310 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.580660 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.580811 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.581071 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.581052 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.581171 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.581173 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.581346 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.581410 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.581431 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.582355 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.582543 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.582556 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.582577 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.582603 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.582747 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.582774 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.582785 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.582555 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). 
InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.582776 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.583504 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.583805 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.583921 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:24:50 crc kubenswrapper[4903]: E0320 08:24:50.584061 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 08:24:51.084008886 +0000 UTC m=+116.300909211 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.584059 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.584809 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.585125 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.585169 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.585236 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.585426 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.585529 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.585675 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.585935 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.585952 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.586408 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.586452 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.586712 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.586751 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.576981 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.586808 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.586869 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.586893 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.586917 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.586941 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.586965 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.586990 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.587016 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.587059 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: 
\"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.587083 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.587107 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.587130 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.586814 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.587154 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.586992 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.587180 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.587204 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.587225 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.587248 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.587272 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.587294 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.587315 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.587418 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bgflr\" (UniqueName: \"kubernetes.io/projected/4f2f8d10-bf3a-48a4-9e71-2d3b5dc2743e-kube-api-access-bgflr\") pod \"multus-nzq6s\" (UID: \"4f2f8d10-bf3a-48a4-9e71-2d3b5dc2743e\") " pod="openshift-multus/multus-nzq6s" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.587466 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.587502 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" 
(UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.587536 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.587543 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.587572 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/867ac3a2-4567-43d3-80af-23021ced20b6-os-release\") pod \"multus-additional-cni-plugins-qp7cv\" (UID: \"867ac3a2-4567-43d3-80af-23021ced20b6\") " pod="openshift-multus/multus-additional-cni-plugins-qp7cv" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.587606 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.587639 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/157214e8-fbfe-4e9d-98f4-02680437b8b2-run-systemd\") pod \"ovnkube-node-m6k77\" (UID: \"157214e8-fbfe-4e9d-98f4-02680437b8b2\") " pod="openshift-ovn-kubernetes/ovnkube-node-m6k77" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.587669 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4f2f8d10-bf3a-48a4-9e71-2d3b5dc2743e-multus-cni-dir\") pod \"multus-nzq6s\" (UID: \"4f2f8d10-bf3a-48a4-9e71-2d3b5dc2743e\") " pod="openshift-multus/multus-nzq6s" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.587697 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4f2f8d10-bf3a-48a4-9e71-2d3b5dc2743e-os-release\") pod \"multus-nzq6s\" (UID: \"4f2f8d10-bf3a-48a4-9e71-2d3b5dc2743e\") " pod="openshift-multus/multus-nzq6s" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.587729 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/4f2f8d10-bf3a-48a4-9e71-2d3b5dc2743e-hostroot\") pod \"multus-nzq6s\" (UID: \"4f2f8d10-bf3a-48a4-9e71-2d3b5dc2743e\") " pod="openshift-multus/multus-nzq6s" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.587766 4903 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4f2f8d10-bf3a-48a4-9e71-2d3b5dc2743e-system-cni-dir\") pod \"multus-nzq6s\" (UID: \"4f2f8d10-bf3a-48a4-9e71-2d3b5dc2743e\") " pod="openshift-multus/multus-nzq6s" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.587797 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4f2f8d10-bf3a-48a4-9e71-2d3b5dc2743e-cnibin\") pod \"multus-nzq6s\" (UID: \"4f2f8d10-bf3a-48a4-9e71-2d3b5dc2743e\") " pod="openshift-multus/multus-nzq6s" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.587827 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/4f2f8d10-bf3a-48a4-9e71-2d3b5dc2743e-host-run-k8s-cni-cncf-io\") pod \"multus-nzq6s\" (UID: \"4f2f8d10-bf3a-48a4-9e71-2d3b5dc2743e\") " pod="openshift-multus/multus-nzq6s" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.587862 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.587890 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/157214e8-fbfe-4e9d-98f4-02680437b8b2-host-slash\") pod \"ovnkube-node-m6k77\" (UID: \"157214e8-fbfe-4e9d-98f4-02680437b8b2\") " pod="openshift-ovn-kubernetes/ovnkube-node-m6k77" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.587921 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/4f2f8d10-bf3a-48a4-9e71-2d3b5dc2743e-host-run-multus-certs\") pod \"multus-nzq6s\" (UID: \"4f2f8d10-bf3a-48a4-9e71-2d3b5dc2743e\") " pod="openshift-multus/multus-nzq6s" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.587949 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/157214e8-fbfe-4e9d-98f4-02680437b8b2-ovn-node-metrics-cert\") pod \"ovnkube-node-m6k77\" (UID: \"157214e8-fbfe-4e9d-98f4-02680437b8b2\") " pod="openshift-ovn-kubernetes/ovnkube-node-m6k77" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.587977 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4f2f8d10-bf3a-48a4-9e71-2d3b5dc2743e-host-run-netns\") pod \"multus-nzq6s\" (UID: \"4f2f8d10-bf3a-48a4-9e71-2d3b5dc2743e\") " pod="openshift-multus/multus-nzq6s" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.588010 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5rlvr\" (UniqueName: \"kubernetes.io/projected/0e67af70-4211-4077-8f2b-0a00b8069e5a-kube-api-access-5rlvr\") pod \"machine-config-daemon-2ndsj\" (UID: \"0e67af70-4211-4077-8f2b-0a00b8069e5a\") " pod="openshift-machine-config-operator/machine-config-daemon-2ndsj" Mar 20 08:24:50 
crc kubenswrapper[4903]: I0320 08:24:50.588104 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/157214e8-fbfe-4e9d-98f4-02680437b8b2-run-openvswitch\") pod \"ovnkube-node-m6k77\" (UID: \"157214e8-fbfe-4e9d-98f4-02680437b8b2\") " pod="openshift-ovn-kubernetes/ovnkube-node-m6k77" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.588147 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/157214e8-fbfe-4e9d-98f4-02680437b8b2-host-cni-bin\") pod \"ovnkube-node-m6k77\" (UID: \"157214e8-fbfe-4e9d-98f4-02680437b8b2\") " pod="openshift-ovn-kubernetes/ovnkube-node-m6k77" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.588179 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9z897\" (UniqueName: \"kubernetes.io/projected/157214e8-fbfe-4e9d-98f4-02680437b8b2-kube-api-access-9z897\") pod \"ovnkube-node-m6k77\" (UID: \"157214e8-fbfe-4e9d-98f4-02680437b8b2\") " pod="openshift-ovn-kubernetes/ovnkube-node-m6k77" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.588211 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1a0b2ecd-79af-4d88-ac59-3a08385882a1-host\") pod \"node-ca-55qrw\" (UID: \"1a0b2ecd-79af-4d88-ac59-3a08385882a1\") " pod="openshift-image-registry/node-ca-55qrw" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.588238 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/1a0b2ecd-79af-4d88-ac59-3a08385882a1-serviceca\") pod \"node-ca-55qrw\" (UID: \"1a0b2ecd-79af-4d88-ac59-3a08385882a1\") " pod="openshift-image-registry/node-ca-55qrw" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.588289 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4f2f8d10-bf3a-48a4-9e71-2d3b5dc2743e-multus-daemon-config\") pod \"multus-nzq6s\" (UID: \"4f2f8d10-bf3a-48a4-9e71-2d3b5dc2743e\") " pod="openshift-multus/multus-nzq6s" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.588322 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4f2f8d10-bf3a-48a4-9e71-2d3b5dc2743e-etc-kubernetes\") pod \"multus-nzq6s\" (UID: \"4f2f8d10-bf3a-48a4-9e71-2d3b5dc2743e\") " pod="openshift-multus/multus-nzq6s" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.588354 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/0e67af70-4211-4077-8f2b-0a00b8069e5a-rootfs\") pod \"machine-config-daemon-2ndsj\" (UID: \"0e67af70-4211-4077-8f2b-0a00b8069e5a\") " pod="openshift-machine-config-operator/machine-config-daemon-2ndsj" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.588383 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/157214e8-fbfe-4e9d-98f4-02680437b8b2-host-cni-netd\") pod \"ovnkube-node-m6k77\" (UID: \"157214e8-fbfe-4e9d-98f4-02680437b8b2\") " pod="openshift-ovn-kubernetes/ovnkube-node-m6k77" 
Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.588412 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/4f2f8d10-bf3a-48a4-9e71-2d3b5dc2743e-multus-socket-dir-parent\") pod \"multus-nzq6s\" (UID: \"4f2f8d10-bf3a-48a4-9e71-2d3b5dc2743e\") " pod="openshift-multus/multus-nzq6s" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.588870 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4f2f8d10-bf3a-48a4-9e71-2d3b5dc2743e-host-var-lib-cni-bin\") pod \"multus-nzq6s\" (UID: \"4f2f8d10-bf3a-48a4-9e71-2d3b5dc2743e\") " pod="openshift-multus/multus-nzq6s" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.588967 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/4f2f8d10-bf3a-48a4-9e71-2d3b5dc2743e-host-var-lib-cni-multus\") pod \"multus-nzq6s\" (UID: \"4f2f8d10-bf3a-48a4-9e71-2d3b5dc2743e\") " pod="openshift-multus/multus-nzq6s" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.587602 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.587953 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.588590 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.588613 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.588664 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.588815 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.588840 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.588847 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.588970 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.589601 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.589761 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.589931 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.589961 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.590279 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.590321 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:24:50 crc kubenswrapper[4903]: E0320 08:24:50.590452 4903 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 08:24:50 crc kubenswrapper[4903]: E0320 08:24:50.590845 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 08:24:51.090819154 +0000 UTC m=+116.307719479 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.591129 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.591006 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.591148 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4f2f8d10-bf3a-48a4-9e71-2d3b5dc2743e-multus-conf-dir\") pod \"multus-nzq6s\" (UID: \"4f2f8d10-bf3a-48a4-9e71-2d3b5dc2743e\") " pod="openshift-multus/multus-nzq6s" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.591235 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 08:24:50 crc 
kubenswrapper[4903]: I0320 08:24:50.591255 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.591283 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/157214e8-fbfe-4e9d-98f4-02680437b8b2-systemd-units\") pod \"ovnkube-node-m6k77\" (UID: \"157214e8-fbfe-4e9d-98f4-02680437b8b2\") " pod="openshift-ovn-kubernetes/ovnkube-node-m6k77" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.591334 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/157214e8-fbfe-4e9d-98f4-02680437b8b2-node-log\") pod \"ovnkube-node-m6k77\" (UID: \"157214e8-fbfe-4e9d-98f4-02680437b8b2\") " pod="openshift-ovn-kubernetes/ovnkube-node-m6k77" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.591487 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/867ac3a2-4567-43d3-80af-23021ced20b6-tuning-conf-dir\") pod \"multus-additional-cni-plugins-qp7cv\" (UID: \"867ac3a2-4567-43d3-80af-23021ced20b6\") " pod="openshift-multus/multus-additional-cni-plugins-qp7cv" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.591531 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.591543 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.591663 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.591744 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0e67af70-4211-4077-8f2b-0a00b8069e5a-proxy-tls\") pod \"machine-config-daemon-2ndsj\" (UID: \"0e67af70-4211-4077-8f2b-0a00b8069e5a\") " pod="openshift-machine-config-operator/machine-config-daemon-2ndsj" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.591743 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.591777 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.591809 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.591856 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.592111 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.592177 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/157214e8-fbfe-4e9d-98f4-02680437b8b2-ovnkube-config\") pod \"ovnkube-node-m6k77\" (UID: \"157214e8-fbfe-4e9d-98f4-02680437b8b2\") " pod="openshift-ovn-kubernetes/ovnkube-node-m6k77" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.592184 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:24:50 crc kubenswrapper[4903]: E0320 08:24:50.592249 4903 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.592294 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wx8vl\" (UniqueName: \"kubernetes.io/projected/1a0b2ecd-79af-4d88-ac59-3a08385882a1-kube-api-access-wx8vl\") pod \"node-ca-55qrw\" (UID: \"1a0b2ecd-79af-4d88-ac59-3a08385882a1\") " pod="openshift-image-registry/node-ca-55qrw" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.592379 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x2jpf\" (UniqueName: \"kubernetes.io/projected/867ac3a2-4567-43d3-80af-23021ced20b6-kube-api-access-x2jpf\") pod \"multus-additional-cni-plugins-qp7cv\" (UID: \"867ac3a2-4567-43d3-80af-23021ced20b6\") " pod="openshift-multus/multus-additional-cni-plugins-qp7cv" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.592432 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.592444 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0e67af70-4211-4077-8f2b-0a00b8069e5a-mcd-auth-proxy-config\") pod \"machine-config-daemon-2ndsj\" (UID: \"0e67af70-4211-4077-8f2b-0a00b8069e5a\") " pod="openshift-machine-config-operator/machine-config-daemon-2ndsj" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.592495 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/157214e8-fbfe-4e9d-98f4-02680437b8b2-log-socket\") pod \"ovnkube-node-m6k77\" (UID: \"157214e8-fbfe-4e9d-98f4-02680437b8b2\") " pod="openshift-ovn-kubernetes/ovnkube-node-m6k77" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.592541 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/157214e8-fbfe-4e9d-98f4-02680437b8b2-ovnkube-script-lib\") pod \"ovnkube-node-m6k77\" (UID: \"157214e8-fbfe-4e9d-98f4-02680437b8b2\") " pod="openshift-ovn-kubernetes/ovnkube-node-m6k77" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.592604 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hnpkw\" (UniqueName: \"kubernetes.io/projected/bc3a30d7-d1f1-491a-8e04-70eea2d35867-kube-api-access-hnpkw\") pod \"node-resolver-mxv95\" (UID: \"bc3a30d7-d1f1-491a-8e04-70eea2d35867\") " pod="openshift-dns/node-resolver-mxv95" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.592635 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: 
"43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.592651 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/867ac3a2-4567-43d3-80af-23021ced20b6-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-qp7cv\" (UID: \"867ac3a2-4567-43d3-80af-23021ced20b6\") " pod="openshift-multus/multus-additional-cni-plugins-qp7cv" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.592712 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.592776 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/157214e8-fbfe-4e9d-98f4-02680437b8b2-var-lib-openvswitch\") pod \"ovnkube-node-m6k77\" (UID: \"157214e8-fbfe-4e9d-98f4-02680437b8b2\") " pod="openshift-ovn-kubernetes/ovnkube-node-m6k77" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.592887 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/157214e8-fbfe-4e9d-98f4-02680437b8b2-etc-openvswitch\") pod \"ovnkube-node-m6k77\" (UID: \"157214e8-fbfe-4e9d-98f4-02680437b8b2\") " pod="openshift-ovn-kubernetes/ovnkube-node-m6k77" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.592898 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.592913 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/157214e8-fbfe-4e9d-98f4-02680437b8b2-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-m6k77\" (UID: \"157214e8-fbfe-4e9d-98f4-02680437b8b2\") " pod="openshift-ovn-kubernetes/ovnkube-node-m6k77" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.592952 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/867ac3a2-4567-43d3-80af-23021ced20b6-system-cni-dir\") pod \"multus-additional-cni-plugins-qp7cv\" (UID: \"867ac3a2-4567-43d3-80af-23021ced20b6\") " pod="openshift-multus/multus-additional-cni-plugins-qp7cv" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.592977 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4f2f8d10-bf3a-48a4-9e71-2d3b5dc2743e-host-var-lib-kubelet\") pod \"multus-nzq6s\" (UID: \"4f2f8d10-bf3a-48a4-9e71-2d3b5dc2743e\") " pod="openshift-multus/multus-nzq6s" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.593061 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.593299 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.593304 4903 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.593336 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2ndsj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0e67af70-4211-4077-8f2b-0a00b8069e5a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:50Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:50Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5rlvr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5rlvr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T08:24:50Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2ndsj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 
08:24:50.593459 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/157214e8-fbfe-4e9d-98f4-02680437b8b2-run-ovn\") pod \"ovnkube-node-m6k77\" (UID: \"157214e8-fbfe-4e9d-98f4-02680437b8b2\") " pod="openshift-ovn-kubernetes/ovnkube-node-m6k77" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.593545 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.593604 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/bc3a30d7-d1f1-491a-8e04-70eea2d35867-hosts-file\") pod \"node-resolver-mxv95\" (UID: \"bc3a30d7-d1f1-491a-8e04-70eea2d35867\") " pod="openshift-dns/node-resolver-mxv95" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.593903 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.593993 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.594457 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/867ac3a2-4567-43d3-80af-23021ced20b6-cni-binary-copy\") pod \"multus-additional-cni-plugins-qp7cv\" (UID: \"867ac3a2-4567-43d3-80af-23021ced20b6\") " pod="openshift-multus/multus-additional-cni-plugins-qp7cv" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.594498 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4f2f8d10-bf3a-48a4-9e71-2d3b5dc2743e-cni-binary-copy\") pod \"multus-nzq6s\" (UID: \"4f2f8d10-bf3a-48a4-9e71-2d3b5dc2743e\") " pod="openshift-multus/multus-nzq6s" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.594530 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.594648 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/157214e8-fbfe-4e9d-98f4-02680437b8b2-host-kubelet\") pod \"ovnkube-node-m6k77\" (UID: \"157214e8-fbfe-4e9d-98f4-02680437b8b2\") " pod="openshift-ovn-kubernetes/ovnkube-node-m6k77" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.594702 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/157214e8-fbfe-4e9d-98f4-02680437b8b2-host-run-netns\") pod \"ovnkube-node-m6k77\" (UID: \"157214e8-fbfe-4e9d-98f4-02680437b8b2\") " pod="openshift-ovn-kubernetes/ovnkube-node-m6k77" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.594728 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/867ac3a2-4567-43d3-80af-23021ced20b6-cnibin\") pod \"multus-additional-cni-plugins-qp7cv\" (UID: \"867ac3a2-4567-43d3-80af-23021ced20b6\") " pod="openshift-multus/multus-additional-cni-plugins-qp7cv" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.594755 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/157214e8-fbfe-4e9d-98f4-02680437b8b2-host-run-ovn-kubernetes\") pod \"ovnkube-node-m6k77\" (UID: \"157214e8-fbfe-4e9d-98f4-02680437b8b2\") " pod="openshift-ovn-kubernetes/ovnkube-node-m6k77" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.594778 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/157214e8-fbfe-4e9d-98f4-02680437b8b2-env-overrides\") pod \"ovnkube-node-m6k77\" (UID: \"157214e8-fbfe-4e9d-98f4-02680437b8b2\") " pod="openshift-ovn-kubernetes/ovnkube-node-m6k77" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.594929 4903 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" 
DevicePath \"\"" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.594946 4903 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.594964 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.594979 4903 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.594993 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.595009 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.595025 4903 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.595059 4903 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.595072 4903 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.595086 4903 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.595283 4903 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.595298 4903 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.595313 4903 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.595327 4903 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 08:24:50 
crc kubenswrapper[4903]: I0320 08:24:50.595343 4903 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.595353 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.595363 4903 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.595406 4903 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.595429 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.595450 4903 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.595466 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.595570 4903 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.595590 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.595614 4903 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.595632 4903 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.595648 4903 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.595663 4903 reconciler_common.go:293] "Volume detached for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.595681 4903 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.595697 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Mar 20 08:24:50 crc kubenswrapper[4903]: E0320 08:24:50.595853 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 08:24:51.095809076 +0000 UTC m=+116.312709401 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.595920 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.595949 4903 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.596012 4903 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.596047 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.596064 4903 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.596066 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.596081 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.596138 4903 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.596155 4903 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.596169 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.596190 4903 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.596205 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.596220 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.596234 4903 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.596247 4903 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.596262 4903 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.596276 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.596291 4903 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.596305 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: 
\"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.596321 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.596335 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.596348 4903 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.596360 4903 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.596381 4903 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.596394 4903 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.596406 4903 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.596418 4903 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.596432 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.596444 4903 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.596459 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.596472 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.596484 4903 reconciler_common.go:293] "Volume detached for volume 
\"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.596495 4903 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.596507 4903 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.596519 4903 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.596534 4903 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.596546 4903 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.596558 4903 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.596571 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.596585 4903 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.596597 4903 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.596608 4903 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.596620 4903 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.596632 4903 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.596643 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: 
\"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.596655 4903 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.596667 4903 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.596678 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.596690 4903 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.596712 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.596725 4903 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.596742 4903 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.596754 4903 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.596766 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.596778 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.596790 4903 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.596802 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.596813 4903 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: 
\"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.596825 4903 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.596844 4903 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.596855 4903 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.596867 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.596880 4903 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.596892 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.596904 4903 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.596915 4903 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.596927 4903 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.596938 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.596950 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.596975 4903 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.596986 4903 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.597002 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.597014 4903 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.597025 4903 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.597059 4903 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.597076 4903 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.597092 4903 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.597107 4903 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.597123 4903 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.597135 4903 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.597147 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.597159 4903 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.597171 4903 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.597182 4903 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.597194 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.597206 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.597217 4903 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.597237 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.597252 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.597263 4903 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.597277 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.597740 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.598181 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.599206 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.599400 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.597289 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.599481 4903 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.599759 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.599788 4903 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.599911 4903 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.599927 4903 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.599928 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.599956 4903 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.599973 4903 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.599987 4903 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.600001 4903 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.600015 4903 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.600041 4903 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.600056 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.600069 4903 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.600081 4903 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.600075 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.603431 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.603861 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.604242 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.604317 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.605359 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.605390 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.605864 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.606086 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.606396 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.609835 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.609986 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:24:50 crc kubenswrapper[4903]: E0320 08:24:50.611354 4903 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 08:24:50 crc kubenswrapper[4903]: E0320 08:24:50.611376 4903 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 08:24:50 crc kubenswrapper[4903]: E0320 08:24:50.611390 4903 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 08:24:50 crc kubenswrapper[4903]: E0320 08:24:50.611458 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 08:24:51.111436088 +0000 UTC m=+116.328336503 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.611730 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.615454 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.615448 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:24:50 crc kubenswrapper[4903]: E0320 08:24:50.616095 4903 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 08:24:50 crc kubenswrapper[4903]: E0320 08:24:50.616133 4903 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 08:24:50 crc kubenswrapper[4903]: E0320 08:24:50.616146 4903 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 08:24:50 crc kubenswrapper[4903]: E0320 08:24:50.616217 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 08:24:51.116198234 +0000 UTC m=+116.333098549 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.616331 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.616371 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.616658 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.616669 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.616766 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.616788 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.617301 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.617420 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.617831 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.617854 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.618257 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.618523 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.618715 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.618727 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). 
InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.619101 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.619922 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.617132 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.621363 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.622230 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.622350 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.622522 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.622720 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.622741 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.622842 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.623382 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.627448 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.627594 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.628014 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.632147 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". 
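The "Failed to update status for pod" entry here (and the further ones below) all share one root cause: the Pod admission webhook pod.network-node-identity.openshift.io is still registered, but nothing is listening on https://127.0.0.1:9743, so the API server's Post to the webhook is refused and the kubelet's status patch is rejected. A minimal probe of that endpoint from the node (an illustrative check, not an OpenShift tool):

    import socket

    # Check whether anything is accepting TCP connections on the
    # network-node-identity webhook port seen in the log (127.0.0.1:9743).
    HOST, PORT = "127.0.0.1", 9743

    try:
        with socket.create_connection((HOST, PORT), timeout=3):
            print(f"webhook endpoint {HOST}:{PORT} is accepting connections")
    except OSError as exc:
        print(f"webhook endpoint {HOST}:{PORT} unreachable: {exc}")
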
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.633505 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.634006 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.634404 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.634423 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.633722 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.635105 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.635261 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.635251 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.635669 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.636414 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:50Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.648664 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.649353 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nzq6s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f2f8d10-bf3a-48a4-9e71-2d3b5dc2743e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T08:24:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nzq6s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.659153 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.659214 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.659234 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.659265 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.659487 4903 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T08:24:50Z","lastTransitionTime":"2026-03-20T08:24:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.659718 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.660890 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:50Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.666555 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.669697 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-mxv95" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc3a30d7-d1f1-491a-8e04-70eea2d35867\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:50Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hnpkw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T08:24:50Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-mxv95\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.677851 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.685260 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:50Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.700669 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0e67af70-4211-4077-8f2b-0a00b8069e5a-proxy-tls\") pod \"machine-config-daemon-2ndsj\" (UID: \"0e67af70-4211-4077-8f2b-0a00b8069e5a\") " pod="openshift-machine-config-operator/machine-config-daemon-2ndsj" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.700732 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.700760 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wx8vl\" (UniqueName: \"kubernetes.io/projected/1a0b2ecd-79af-4d88-ac59-3a08385882a1-kube-api-access-wx8vl\") pod \"node-ca-55qrw\" (UID: \"1a0b2ecd-79af-4d88-ac59-3a08385882a1\") " pod="openshift-image-registry/node-ca-55qrw" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.700787 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x2jpf\" (UniqueName: \"kubernetes.io/projected/867ac3a2-4567-43d3-80af-23021ced20b6-kube-api-access-x2jpf\") pod \"multus-additional-cni-plugins-qp7cv\" (UID: \"867ac3a2-4567-43d3-80af-23021ced20b6\") " pod="openshift-multus/multus-additional-cni-plugins-qp7cv" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.700824 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/157214e8-fbfe-4e9d-98f4-02680437b8b2-ovnkube-config\") pod \"ovnkube-node-m6k77\" (UID: \"157214e8-fbfe-4e9d-98f4-02680437b8b2\") " pod="openshift-ovn-kubernetes/ovnkube-node-m6k77" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.700849 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/157214e8-fbfe-4e9d-98f4-02680437b8b2-log-socket\") pod \"ovnkube-node-m6k77\" (UID: \"157214e8-fbfe-4e9d-98f4-02680437b8b2\") " pod="openshift-ovn-kubernetes/ovnkube-node-m6k77" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.700872 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/157214e8-fbfe-4e9d-98f4-02680437b8b2-ovnkube-script-lib\") pod \"ovnkube-node-m6k77\" (UID: \"157214e8-fbfe-4e9d-98f4-02680437b8b2\") " pod="openshift-ovn-kubernetes/ovnkube-node-m6k77" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.700907 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0e67af70-4211-4077-8f2b-0a00b8069e5a-mcd-auth-proxy-config\") pod \"machine-config-daemon-2ndsj\" (UID: \"0e67af70-4211-4077-8f2b-0a00b8069e5a\") " pod="openshift-machine-config-operator/machine-config-daemon-2ndsj" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.700931 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/157214e8-fbfe-4e9d-98f4-02680437b8b2-var-lib-openvswitch\") pod \"ovnkube-node-m6k77\" (UID: \"157214e8-fbfe-4e9d-98f4-02680437b8b2\") " pod="openshift-ovn-kubernetes/ovnkube-node-m6k77" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.700957 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/157214e8-fbfe-4e9d-98f4-02680437b8b2-etc-openvswitch\") pod \"ovnkube-node-m6k77\" (UID: \"157214e8-fbfe-4e9d-98f4-02680437b8b2\") " pod="openshift-ovn-kubernetes/ovnkube-node-m6k77" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.700943 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.700983 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hnpkw\" (UniqueName: \"kubernetes.io/projected/bc3a30d7-d1f1-491a-8e04-70eea2d35867-kube-api-access-hnpkw\") pod \"node-resolver-mxv95\" (UID: \"bc3a30d7-d1f1-491a-8e04-70eea2d35867\") " pod="openshift-dns/node-resolver-mxv95" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.701007 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/867ac3a2-4567-43d3-80af-23021ced20b6-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-qp7cv\" (UID: \"867ac3a2-4567-43d3-80af-23021ced20b6\") " pod="openshift-multus/multus-additional-cni-plugins-qp7cv" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.701064 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.701076 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/157214e8-fbfe-4e9d-98f4-02680437b8b2-var-lib-openvswitch\") pod \"ovnkube-node-m6k77\" (UID: \"157214e8-fbfe-4e9d-98f4-02680437b8b2\") " pod="openshift-ovn-kubernetes/ovnkube-node-m6k77" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.701086 4903 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/157214e8-fbfe-4e9d-98f4-02680437b8b2-run-ovn\") pod \"ovnkube-node-m6k77\" (UID: \"157214e8-fbfe-4e9d-98f4-02680437b8b2\") " pod="openshift-ovn-kubernetes/ovnkube-node-m6k77" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.701324 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/157214e8-fbfe-4e9d-98f4-02680437b8b2-run-ovn\") pod \"ovnkube-node-m6k77\" (UID: \"157214e8-fbfe-4e9d-98f4-02680437b8b2\") " pod="openshift-ovn-kubernetes/ovnkube-node-m6k77" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.701352 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/157214e8-fbfe-4e9d-98f4-02680437b8b2-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-m6k77\" (UID: \"157214e8-fbfe-4e9d-98f4-02680437b8b2\") " pod="openshift-ovn-kubernetes/ovnkube-node-m6k77" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.701383 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/157214e8-fbfe-4e9d-98f4-02680437b8b2-etc-openvswitch\") pod \"ovnkube-node-m6k77\" (UID: \"157214e8-fbfe-4e9d-98f4-02680437b8b2\") " pod="openshift-ovn-kubernetes/ovnkube-node-m6k77" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.701385 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/867ac3a2-4567-43d3-80af-23021ced20b6-system-cni-dir\") pod \"multus-additional-cni-plugins-qp7cv\" (UID: \"867ac3a2-4567-43d3-80af-23021ced20b6\") " pod="openshift-multus/multus-additional-cni-plugins-qp7cv" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.701414 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/867ac3a2-4567-43d3-80af-23021ced20b6-system-cni-dir\") pod \"multus-additional-cni-plugins-qp7cv\" (UID: \"867ac3a2-4567-43d3-80af-23021ced20b6\") " pod="openshift-multus/multus-additional-cni-plugins-qp7cv" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.701428 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4f2f8d10-bf3a-48a4-9e71-2d3b5dc2743e-host-var-lib-kubelet\") pod \"multus-nzq6s\" (UID: \"4f2f8d10-bf3a-48a4-9e71-2d3b5dc2743e\") " pod="openshift-multus/multus-nzq6s" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.701464 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/157214e8-fbfe-4e9d-98f4-02680437b8b2-host-kubelet\") pod \"ovnkube-node-m6k77\" (UID: \"157214e8-fbfe-4e9d-98f4-02680437b8b2\") " pod="openshift-ovn-kubernetes/ovnkube-node-m6k77" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.701490 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/157214e8-fbfe-4e9d-98f4-02680437b8b2-host-run-netns\") pod \"ovnkube-node-m6k77\" (UID: \"157214e8-fbfe-4e9d-98f4-02680437b8b2\") " pod="openshift-ovn-kubernetes/ovnkube-node-m6k77" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.701523 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"hosts-file\" (UniqueName: \"kubernetes.io/host-path/bc3a30d7-d1f1-491a-8e04-70eea2d35867-hosts-file\") pod \"node-resolver-mxv95\" (UID: \"bc3a30d7-d1f1-491a-8e04-70eea2d35867\") " pod="openshift-dns/node-resolver-mxv95" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.701547 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/867ac3a2-4567-43d3-80af-23021ced20b6-cni-binary-copy\") pod \"multus-additional-cni-plugins-qp7cv\" (UID: \"867ac3a2-4567-43d3-80af-23021ced20b6\") " pod="openshift-multus/multus-additional-cni-plugins-qp7cv" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.701577 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4f2f8d10-bf3a-48a4-9e71-2d3b5dc2743e-cni-binary-copy\") pod \"multus-nzq6s\" (UID: \"4f2f8d10-bf3a-48a4-9e71-2d3b5dc2743e\") " pod="openshift-multus/multus-nzq6s" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.701609 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/157214e8-fbfe-4e9d-98f4-02680437b8b2-host-run-ovn-kubernetes\") pod \"ovnkube-node-m6k77\" (UID: \"157214e8-fbfe-4e9d-98f4-02680437b8b2\") " pod="openshift-ovn-kubernetes/ovnkube-node-m6k77" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.701649 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/157214e8-fbfe-4e9d-98f4-02680437b8b2-env-overrides\") pod \"ovnkube-node-m6k77\" (UID: \"157214e8-fbfe-4e9d-98f4-02680437b8b2\") " pod="openshift-ovn-kubernetes/ovnkube-node-m6k77" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.701678 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/867ac3a2-4567-43d3-80af-23021ced20b6-cnibin\") pod \"multus-additional-cni-plugins-qp7cv\" (UID: \"867ac3a2-4567-43d3-80af-23021ced20b6\") " pod="openshift-multus/multus-additional-cni-plugins-qp7cv" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.701698 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/157214e8-fbfe-4e9d-98f4-02680437b8b2-host-kubelet\") pod \"ovnkube-node-m6k77\" (UID: \"157214e8-fbfe-4e9d-98f4-02680437b8b2\") " pod="openshift-ovn-kubernetes/ovnkube-node-m6k77" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.701701 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/867ac3a2-4567-43d3-80af-23021ced20b6-os-release\") pod \"multus-additional-cni-plugins-qp7cv\" (UID: \"867ac3a2-4567-43d3-80af-23021ced20b6\") " pod="openshift-multus/multus-additional-cni-plugins-qp7cv" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.701788 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bgflr\" (UniqueName: \"kubernetes.io/projected/4f2f8d10-bf3a-48a4-9e71-2d3b5dc2743e-kube-api-access-bgflr\") pod \"multus-nzq6s\" (UID: \"4f2f8d10-bf3a-48a4-9e71-2d3b5dc2743e\") " pod="openshift-multus/multus-nzq6s" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.701832 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/4f2f8d10-bf3a-48a4-9e71-2d3b5dc2743e-multus-cni-dir\") pod \"multus-nzq6s\" (UID: \"4f2f8d10-bf3a-48a4-9e71-2d3b5dc2743e\") " pod="openshift-multus/multus-nzq6s" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.701855 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4f2f8d10-bf3a-48a4-9e71-2d3b5dc2743e-os-release\") pod \"multus-nzq6s\" (UID: \"4f2f8d10-bf3a-48a4-9e71-2d3b5dc2743e\") " pod="openshift-multus/multus-nzq6s" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.701877 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/157214e8-fbfe-4e9d-98f4-02680437b8b2-run-systemd\") pod \"ovnkube-node-m6k77\" (UID: \"157214e8-fbfe-4e9d-98f4-02680437b8b2\") " pod="openshift-ovn-kubernetes/ovnkube-node-m6k77" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.701902 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4f2f8d10-bf3a-48a4-9e71-2d3b5dc2743e-cnibin\") pod \"multus-nzq6s\" (UID: \"4f2f8d10-bf3a-48a4-9e71-2d3b5dc2743e\") " pod="openshift-multus/multus-nzq6s" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.701925 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/4f2f8d10-bf3a-48a4-9e71-2d3b5dc2743e-host-run-k8s-cni-cncf-io\") pod \"multus-nzq6s\" (UID: \"4f2f8d10-bf3a-48a4-9e71-2d3b5dc2743e\") " pod="openshift-multus/multus-nzq6s" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.701959 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/4f2f8d10-bf3a-48a4-9e71-2d3b5dc2743e-hostroot\") pod \"multus-nzq6s\" (UID: \"4f2f8d10-bf3a-48a4-9e71-2d3b5dc2743e\") " pod="openshift-multus/multus-nzq6s" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.701981 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4f2f8d10-bf3a-48a4-9e71-2d3b5dc2743e-system-cni-dir\") pod \"multus-nzq6s\" (UID: \"4f2f8d10-bf3a-48a4-9e71-2d3b5dc2743e\") " pod="openshift-multus/multus-nzq6s" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.702005 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/157214e8-fbfe-4e9d-98f4-02680437b8b2-host-slash\") pod \"ovnkube-node-m6k77\" (UID: \"157214e8-fbfe-4e9d-98f4-02680437b8b2\") " pod="openshift-ovn-kubernetes/ovnkube-node-m6k77" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.702050 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/4f2f8d10-bf3a-48a4-9e71-2d3b5dc2743e-host-run-multus-certs\") pod \"multus-nzq6s\" (UID: \"4f2f8d10-bf3a-48a4-9e71-2d3b5dc2743e\") " pod="openshift-multus/multus-nzq6s" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.702102 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 
08:24:50.702462 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/157214e8-fbfe-4e9d-98f4-02680437b8b2-ovnkube-script-lib\") pod \"ovnkube-node-m6k77\" (UID: \"157214e8-fbfe-4e9d-98f4-02680437b8b2\") " pod="openshift-ovn-kubernetes/ovnkube-node-m6k77" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.702504 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0e67af70-4211-4077-8f2b-0a00b8069e5a-mcd-auth-proxy-config\") pod \"machine-config-daemon-2ndsj\" (UID: \"0e67af70-4211-4077-8f2b-0a00b8069e5a\") " pod="openshift-machine-config-operator/machine-config-daemon-2ndsj" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.702550 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/157214e8-fbfe-4e9d-98f4-02680437b8b2-log-socket\") pod \"ovnkube-node-m6k77\" (UID: \"157214e8-fbfe-4e9d-98f4-02680437b8b2\") " pod="openshift-ovn-kubernetes/ovnkube-node-m6k77" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.702559 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/157214e8-fbfe-4e9d-98f4-02680437b8b2-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-m6k77\" (UID: \"157214e8-fbfe-4e9d-98f4-02680437b8b2\") " pod="openshift-ovn-kubernetes/ovnkube-node-m6k77" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.702607 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/867ac3a2-4567-43d3-80af-23021ced20b6-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-qp7cv\" (UID: \"867ac3a2-4567-43d3-80af-23021ced20b6\") " pod="openshift-multus/multus-additional-cni-plugins-qp7cv" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.702623 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4f2f8d10-bf3a-48a4-9e71-2d3b5dc2743e-cnibin\") pod \"multus-nzq6s\" (UID: \"4f2f8d10-bf3a-48a4-9e71-2d3b5dc2743e\") " pod="openshift-multus/multus-nzq6s" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.702685 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4f2f8d10-bf3a-48a4-9e71-2d3b5dc2743e-host-var-lib-kubelet\") pod \"multus-nzq6s\" (UID: \"4f2f8d10-bf3a-48a4-9e71-2d3b5dc2743e\") " pod="openshift-multus/multus-nzq6s" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.702830 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/bc3a30d7-d1f1-491a-8e04-70eea2d35867-hosts-file\") pod \"node-resolver-mxv95\" (UID: \"bc3a30d7-d1f1-491a-8e04-70eea2d35867\") " pod="openshift-dns/node-resolver-mxv95" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.702858 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/157214e8-fbfe-4e9d-98f4-02680437b8b2-host-run-netns\") pod \"ovnkube-node-m6k77\" (UID: \"157214e8-fbfe-4e9d-98f4-02680437b8b2\") " pod="openshift-ovn-kubernetes/ovnkube-node-m6k77" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.702918 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4f2f8d10-bf3a-48a4-9e71-2d3b5dc2743e-multus-cni-dir\") pod \"multus-nzq6s\" (UID: \"4f2f8d10-bf3a-48a4-9e71-2d3b5dc2743e\") " pod="openshift-multus/multus-nzq6s" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.702977 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4f2f8d10-bf3a-48a4-9e71-2d3b5dc2743e-os-release\") pod \"multus-nzq6s\" (UID: \"4f2f8d10-bf3a-48a4-9e71-2d3b5dc2743e\") " pod="openshift-multus/multus-nzq6s" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.703012 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/157214e8-fbfe-4e9d-98f4-02680437b8b2-run-systemd\") pod \"ovnkube-node-m6k77\" (UID: \"157214e8-fbfe-4e9d-98f4-02680437b8b2\") " pod="openshift-ovn-kubernetes/ovnkube-node-m6k77" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.703099 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/4f2f8d10-bf3a-48a4-9e71-2d3b5dc2743e-hostroot\") pod \"multus-nzq6s\" (UID: \"4f2f8d10-bf3a-48a4-9e71-2d3b5dc2743e\") " pod="openshift-multus/multus-nzq6s" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.703135 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/157214e8-fbfe-4e9d-98f4-02680437b8b2-host-run-ovn-kubernetes\") pod \"ovnkube-node-m6k77\" (UID: \"157214e8-fbfe-4e9d-98f4-02680437b8b2\") " pod="openshift-ovn-kubernetes/ovnkube-node-m6k77" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.703145 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/157214e8-fbfe-4e9d-98f4-02680437b8b2-ovnkube-config\") pod \"ovnkube-node-m6k77\" (UID: \"157214e8-fbfe-4e9d-98f4-02680437b8b2\") " pod="openshift-ovn-kubernetes/ovnkube-node-m6k77" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.703173 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/4f2f8d10-bf3a-48a4-9e71-2d3b5dc2743e-host-run-k8s-cni-cncf-io\") pod \"multus-nzq6s\" (UID: \"4f2f8d10-bf3a-48a4-9e71-2d3b5dc2743e\") " pod="openshift-multus/multus-nzq6s" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.703235 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4f2f8d10-bf3a-48a4-9e71-2d3b5dc2743e-system-cni-dir\") pod \"multus-nzq6s\" (UID: \"4f2f8d10-bf3a-48a4-9e71-2d3b5dc2743e\") " pod="openshift-multus/multus-nzq6s" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.703241 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/867ac3a2-4567-43d3-80af-23021ced20b6-cnibin\") pod \"multus-additional-cni-plugins-qp7cv\" (UID: \"867ac3a2-4567-43d3-80af-23021ced20b6\") " pod="openshift-multus/multus-additional-cni-plugins-qp7cv" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.703288 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/157214e8-fbfe-4e9d-98f4-02680437b8b2-host-slash\") pod \"ovnkube-node-m6k77\" (UID: \"157214e8-fbfe-4e9d-98f4-02680437b8b2\") " pod="openshift-ovn-kubernetes/ovnkube-node-m6k77" Mar 20 08:24:50 
crc kubenswrapper[4903]: I0320 08:24:50.703325 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/867ac3a2-4567-43d3-80af-23021ced20b6-os-release\") pod \"multus-additional-cni-plugins-qp7cv\" (UID: \"867ac3a2-4567-43d3-80af-23021ced20b6\") " pod="openshift-multus/multus-additional-cni-plugins-qp7cv" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.703414 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/4f2f8d10-bf3a-48a4-9e71-2d3b5dc2743e-host-run-multus-certs\") pod \"multus-nzq6s\" (UID: \"4f2f8d10-bf3a-48a4-9e71-2d3b5dc2743e\") " pod="openshift-multus/multus-nzq6s" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.703417 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/867ac3a2-4567-43d3-80af-23021ced20b6-cni-binary-copy\") pod \"multus-additional-cni-plugins-qp7cv\" (UID: \"867ac3a2-4567-43d3-80af-23021ced20b6\") " pod="openshift-multus/multus-additional-cni-plugins-qp7cv" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.703583 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/157214e8-fbfe-4e9d-98f4-02680437b8b2-env-overrides\") pod \"ovnkube-node-m6k77\" (UID: \"157214e8-fbfe-4e9d-98f4-02680437b8b2\") " pod="openshift-ovn-kubernetes/ovnkube-node-m6k77" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.703853 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4f2f8d10-bf3a-48a4-9e71-2d3b5dc2743e-cni-binary-copy\") pod \"multus-nzq6s\" (UID: \"4f2f8d10-bf3a-48a4-9e71-2d3b5dc2743e\") " pod="openshift-multus/multus-nzq6s" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.703963 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/157214e8-fbfe-4e9d-98f4-02680437b8b2-run-openvswitch\") pod \"ovnkube-node-m6k77\" (UID: \"157214e8-fbfe-4e9d-98f4-02680437b8b2\") " pod="openshift-ovn-kubernetes/ovnkube-node-m6k77" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.704049 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/157214e8-fbfe-4e9d-98f4-02680437b8b2-host-cni-bin\") pod \"ovnkube-node-m6k77\" (UID: \"157214e8-fbfe-4e9d-98f4-02680437b8b2\") " pod="openshift-ovn-kubernetes/ovnkube-node-m6k77" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.703994 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/157214e8-fbfe-4e9d-98f4-02680437b8b2-run-openvswitch\") pod \"ovnkube-node-m6k77\" (UID: \"157214e8-fbfe-4e9d-98f4-02680437b8b2\") " pod="openshift-ovn-kubernetes/ovnkube-node-m6k77" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.704078 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/157214e8-fbfe-4e9d-98f4-02680437b8b2-ovn-node-metrics-cert\") pod \"ovnkube-node-m6k77\" (UID: \"157214e8-fbfe-4e9d-98f4-02680437b8b2\") " pod="openshift-ovn-kubernetes/ovnkube-node-m6k77" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.704132 4903 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4f2f8d10-bf3a-48a4-9e71-2d3b5dc2743e-host-run-netns\") pod \"multus-nzq6s\" (UID: \"4f2f8d10-bf3a-48a4-9e71-2d3b5dc2743e\") " pod="openshift-multus/multus-nzq6s" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.704158 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5rlvr\" (UniqueName: \"kubernetes.io/projected/0e67af70-4211-4077-8f2b-0a00b8069e5a-kube-api-access-5rlvr\") pod \"machine-config-daemon-2ndsj\" (UID: \"0e67af70-4211-4077-8f2b-0a00b8069e5a\") " pod="openshift-machine-config-operator/machine-config-daemon-2ndsj" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.704177 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9z897\" (UniqueName: \"kubernetes.io/projected/157214e8-fbfe-4e9d-98f4-02680437b8b2-kube-api-access-9z897\") pod \"ovnkube-node-m6k77\" (UID: \"157214e8-fbfe-4e9d-98f4-02680437b8b2\") " pod="openshift-ovn-kubernetes/ovnkube-node-m6k77" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.704193 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/0e67af70-4211-4077-8f2b-0a00b8069e5a-rootfs\") pod \"machine-config-daemon-2ndsj\" (UID: \"0e67af70-4211-4077-8f2b-0a00b8069e5a\") " pod="openshift-machine-config-operator/machine-config-daemon-2ndsj" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.704208 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/157214e8-fbfe-4e9d-98f4-02680437b8b2-host-cni-netd\") pod \"ovnkube-node-m6k77\" (UID: \"157214e8-fbfe-4e9d-98f4-02680437b8b2\") " pod="openshift-ovn-kubernetes/ovnkube-node-m6k77" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.704225 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1a0b2ecd-79af-4d88-ac59-3a08385882a1-host\") pod \"node-ca-55qrw\" (UID: \"1a0b2ecd-79af-4d88-ac59-3a08385882a1\") " pod="openshift-image-registry/node-ca-55qrw" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.704240 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/1a0b2ecd-79af-4d88-ac59-3a08385882a1-serviceca\") pod \"node-ca-55qrw\" (UID: \"1a0b2ecd-79af-4d88-ac59-3a08385882a1\") " pod="openshift-image-registry/node-ca-55qrw" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.704255 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4f2f8d10-bf3a-48a4-9e71-2d3b5dc2743e-multus-daemon-config\") pod \"multus-nzq6s\" (UID: \"4f2f8d10-bf3a-48a4-9e71-2d3b5dc2743e\") " pod="openshift-multus/multus-nzq6s" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.704274 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4f2f8d10-bf3a-48a4-9e71-2d3b5dc2743e-etc-kubernetes\") pod \"multus-nzq6s\" (UID: \"4f2f8d10-bf3a-48a4-9e71-2d3b5dc2743e\") " pod="openshift-multus/multus-nzq6s" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.704292 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/157214e8-fbfe-4e9d-98f4-02680437b8b2-node-log\") pod 
\"ovnkube-node-m6k77\" (UID: \"157214e8-fbfe-4e9d-98f4-02680437b8b2\") " pod="openshift-ovn-kubernetes/ovnkube-node-m6k77" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.704308 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/867ac3a2-4567-43d3-80af-23021ced20b6-tuning-conf-dir\") pod \"multus-additional-cni-plugins-qp7cv\" (UID: \"867ac3a2-4567-43d3-80af-23021ced20b6\") " pod="openshift-multus/multus-additional-cni-plugins-qp7cv" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.704323 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/4f2f8d10-bf3a-48a4-9e71-2d3b5dc2743e-multus-socket-dir-parent\") pod \"multus-nzq6s\" (UID: \"4f2f8d10-bf3a-48a4-9e71-2d3b5dc2743e\") " pod="openshift-multus/multus-nzq6s" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.704342 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4f2f8d10-bf3a-48a4-9e71-2d3b5dc2743e-host-var-lib-cni-bin\") pod \"multus-nzq6s\" (UID: \"4f2f8d10-bf3a-48a4-9e71-2d3b5dc2743e\") " pod="openshift-multus/multus-nzq6s" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.704357 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/4f2f8d10-bf3a-48a4-9e71-2d3b5dc2743e-host-var-lib-cni-multus\") pod \"multus-nzq6s\" (UID: \"4f2f8d10-bf3a-48a4-9e71-2d3b5dc2743e\") " pod="openshift-multus/multus-nzq6s" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.704374 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4f2f8d10-bf3a-48a4-9e71-2d3b5dc2743e-multus-conf-dir\") pod \"multus-nzq6s\" (UID: \"4f2f8d10-bf3a-48a4-9e71-2d3b5dc2743e\") " pod="openshift-multus/multus-nzq6s" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.704397 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/157214e8-fbfe-4e9d-98f4-02680437b8b2-systemd-units\") pod \"ovnkube-node-m6k77\" (UID: \"157214e8-fbfe-4e9d-98f4-02680437b8b2\") " pod="openshift-ovn-kubernetes/ovnkube-node-m6k77" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.704477 4903 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.704489 4903 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.704502 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.704513 4903 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 20 08:24:50 
crc kubenswrapper[4903]: I0320 08:24:50.704524 4903 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.704534 4903 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.704543 4903 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.704553 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.704561 4903 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.704571 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.704580 4903 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.704589 4903 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.704598 4903 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.704611 4903 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.704623 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.704634 4903 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.704644 4903 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Mar 20 08:24:50 
crc kubenswrapper[4903]: I0320 08:24:50.704652 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.704660 4903 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.704671 4903 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.704682 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.704691 4903 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.704699 4903 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.704707 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.704717 4903 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.704726 4903 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.704734 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.704743 4903 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.704751 4903 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.704759 4903 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.704768 4903 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.704776 4903 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.704786 4903 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.704795 4903 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.704803 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.704811 4903 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.704821 4903 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.704831 4903 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.704908 4903 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.704917 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.704928 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.704945 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/157214e8-fbfe-4e9d-98f4-02680437b8b2-host-cni-bin\") pod \"ovnkube-node-m6k77\" (UID: \"157214e8-fbfe-4e9d-98f4-02680437b8b2\") " pod="openshift-ovn-kubernetes/ovnkube-node-m6k77" Mar 20 08:24:50 crc kubenswrapper[4903]: 
I0320 08:24:50.704995 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4f2f8d10-bf3a-48a4-9e71-2d3b5dc2743e-host-run-netns\") pod \"multus-nzq6s\" (UID: \"4f2f8d10-bf3a-48a4-9e71-2d3b5dc2743e\") " pod="openshift-multus/multus-nzq6s" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.705004 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/157214e8-fbfe-4e9d-98f4-02680437b8b2-systemd-units\") pod \"ovnkube-node-m6k77\" (UID: \"157214e8-fbfe-4e9d-98f4-02680437b8b2\") " pod="openshift-ovn-kubernetes/ovnkube-node-m6k77" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.704963 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.705083 4903 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.705107 4903 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.705124 4903 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.705137 4903 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.705155 4903 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.705175 4903 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.705195 4903 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.705215 4903 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.705232 4903 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.705249 4903 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.705265 4903 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.705282 4903 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.705298 4903 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.705314 4903 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.705821 4903 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.705850 4903 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.705869 4903 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.705888 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.705907 4903 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.705923 4903 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.705973 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1a0b2ecd-79af-4d88-ac59-3a08385882a1-host\") pod \"node-ca-55qrw\" (UID: \"1a0b2ecd-79af-4d88-ac59-3a08385882a1\") " pod="openshift-image-registry/node-ca-55qrw" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.706014 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/0e67af70-4211-4077-8f2b-0a00b8069e5a-rootfs\") pod \"machine-config-daemon-2ndsj\" (UID: \"0e67af70-4211-4077-8f2b-0a00b8069e5a\") " 
pod="openshift-machine-config-operator/machine-config-daemon-2ndsj" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.706087 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/157214e8-fbfe-4e9d-98f4-02680437b8b2-host-cni-netd\") pod \"ovnkube-node-m6k77\" (UID: \"157214e8-fbfe-4e9d-98f4-02680437b8b2\") " pod="openshift-ovn-kubernetes/ovnkube-node-m6k77" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.706221 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/157214e8-fbfe-4e9d-98f4-02680437b8b2-node-log\") pod \"ovnkube-node-m6k77\" (UID: \"157214e8-fbfe-4e9d-98f4-02680437b8b2\") " pod="openshift-ovn-kubernetes/ovnkube-node-m6k77" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.706386 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4f2f8d10-bf3a-48a4-9e71-2d3b5dc2743e-etc-kubernetes\") pod \"multus-nzq6s\" (UID: \"4f2f8d10-bf3a-48a4-9e71-2d3b5dc2743e\") " pod="openshift-multus/multus-nzq6s" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.707019 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/867ac3a2-4567-43d3-80af-23021ced20b6-tuning-conf-dir\") pod \"multus-additional-cni-plugins-qp7cv\" (UID: \"867ac3a2-4567-43d3-80af-23021ced20b6\") " pod="openshift-multus/multus-additional-cni-plugins-qp7cv" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.707082 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/4f2f8d10-bf3a-48a4-9e71-2d3b5dc2743e-multus-socket-dir-parent\") pod \"multus-nzq6s\" (UID: \"4f2f8d10-bf3a-48a4-9e71-2d3b5dc2743e\") " pod="openshift-multus/multus-nzq6s" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.707105 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4f2f8d10-bf3a-48a4-9e71-2d3b5dc2743e-host-var-lib-cni-bin\") pod \"multus-nzq6s\" (UID: \"4f2f8d10-bf3a-48a4-9e71-2d3b5dc2743e\") " pod="openshift-multus/multus-nzq6s" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.707125 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/4f2f8d10-bf3a-48a4-9e71-2d3b5dc2743e-host-var-lib-cni-multus\") pod \"multus-nzq6s\" (UID: \"4f2f8d10-bf3a-48a4-9e71-2d3b5dc2743e\") " pod="openshift-multus/multus-nzq6s" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.707144 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4f2f8d10-bf3a-48a4-9e71-2d3b5dc2743e-multus-conf-dir\") pod \"multus-nzq6s\" (UID: \"4f2f8d10-bf3a-48a4-9e71-2d3b5dc2743e\") " pod="openshift-multus/multus-nzq6s" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.708023 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/1a0b2ecd-79af-4d88-ac59-3a08385882a1-serviceca\") pod \"node-ca-55qrw\" (UID: \"1a0b2ecd-79af-4d88-ac59-3a08385882a1\") " pod="openshift-image-registry/node-ca-55qrw" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.708302 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4f2f8d10-bf3a-48a4-9e71-2d3b5dc2743e-multus-daemon-config\") pod \"multus-nzq6s\" (UID: \"4f2f8d10-bf3a-48a4-9e71-2d3b5dc2743e\") " pod="openshift-multus/multus-nzq6s" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.714591 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/157214e8-fbfe-4e9d-98f4-02680437b8b2-ovn-node-metrics-cert\") pod \"ovnkube-node-m6k77\" (UID: \"157214e8-fbfe-4e9d-98f4-02680437b8b2\") " pod="openshift-ovn-kubernetes/ovnkube-node-m6k77" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.719681 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wx8vl\" (UniqueName: \"kubernetes.io/projected/1a0b2ecd-79af-4d88-ac59-3a08385882a1-kube-api-access-wx8vl\") pod \"node-ca-55qrw\" (UID: \"1a0b2ecd-79af-4d88-ac59-3a08385882a1\") " pod="openshift-image-registry/node-ca-55qrw" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.719864 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0e67af70-4211-4077-8f2b-0a00b8069e5a-proxy-tls\") pod \"machine-config-daemon-2ndsj\" (UID: \"0e67af70-4211-4077-8f2b-0a00b8069e5a\") " pod="openshift-machine-config-operator/machine-config-daemon-2ndsj" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.720971 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x2jpf\" (UniqueName: \"kubernetes.io/projected/867ac3a2-4567-43d3-80af-23021ced20b6-kube-api-access-x2jpf\") pod \"multus-additional-cni-plugins-qp7cv\" (UID: \"867ac3a2-4567-43d3-80af-23021ced20b6\") " pod="openshift-multus/multus-additional-cni-plugins-qp7cv" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.722760 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bgflr\" (UniqueName: \"kubernetes.io/projected/4f2f8d10-bf3a-48a4-9e71-2d3b5dc2743e-kube-api-access-bgflr\") pod \"multus-nzq6s\" (UID: \"4f2f8d10-bf3a-48a4-9e71-2d3b5dc2743e\") " pod="openshift-multus/multus-nzq6s" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.723744 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hnpkw\" (UniqueName: \"kubernetes.io/projected/bc3a30d7-d1f1-491a-8e04-70eea2d35867-kube-api-access-hnpkw\") pod \"node-resolver-mxv95\" (UID: \"bc3a30d7-d1f1-491a-8e04-70eea2d35867\") " pod="openshift-dns/node-resolver-mxv95" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.724045 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9z897\" (UniqueName: \"kubernetes.io/projected/157214e8-fbfe-4e9d-98f4-02680437b8b2-kube-api-access-9z897\") pod \"ovnkube-node-m6k77\" (UID: \"157214e8-fbfe-4e9d-98f4-02680437b8b2\") " pod="openshift-ovn-kubernetes/ovnkube-node-m6k77" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.724315 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5rlvr\" (UniqueName: \"kubernetes.io/projected/0e67af70-4211-4077-8f2b-0a00b8069e5a-kube-api-access-5rlvr\") pod \"machine-config-daemon-2ndsj\" (UID: \"0e67af70-4211-4077-8f2b-0a00b8069e5a\") " pod="openshift-machine-config-operator/machine-config-daemon-2ndsj" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.763059 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 08:24:50 
crc kubenswrapper[4903]: I0320 08:24:50.763102 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.763111 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.763128 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.763139 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T08:24:50Z","lastTransitionTime":"2026-03-20T08:24:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.775238 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.786065 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 08:24:50 crc kubenswrapper[4903]: W0320 08:24:50.790845 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37a5e44f_9a88_4405_be8a_b645485e7312.slice/crio-1c15bc5da64bf1abd649a0c7b55d4923ffe47dfc8027301d49c8271b0c77d26b WatchSource:0}: Error finding container 1c15bc5da64bf1abd649a0c7b55d4923ffe47dfc8027301d49c8271b0c77d26b: Status 404 returned error can't find the container with id 1c15bc5da64bf1abd649a0c7b55d4923ffe47dfc8027301d49c8271b0c77d26b Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.794659 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 08:24:50 crc kubenswrapper[4903]: W0320 08:24:50.800939 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-15979e65a347710fabf0d74766f9affac6e390da796139447d1042e0e55c57ea WatchSource:0}: Error finding container 15979e65a347710fabf0d74766f9affac6e390da796139447d1042e0e55c57ea: Status 404 returned error can't find the container with id 15979e65a347710fabf0d74766f9affac6e390da796139447d1042e0e55c57ea Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.803873 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-55qrw" Mar 20 08:24:50 crc kubenswrapper[4903]: W0320 08:24:50.810821 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-81fcfbc938c8aa56218e828e73797f763b8100603f9623edc790db5eea704fde WatchSource:0}: Error finding container 81fcfbc938c8aa56218e828e73797f763b8100603f9623edc790db5eea704fde: Status 404 returned error can't find the container with id 81fcfbc938c8aa56218e828e73797f763b8100603f9623edc790db5eea704fde Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.815158 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-qp7cv" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.822812 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-nzq6s" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.832692 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-2ndsj" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.841932 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-mxv95" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.851158 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-m6k77" Mar 20 08:24:50 crc kubenswrapper[4903]: W0320 08:24:50.854704 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod867ac3a2_4567_43d3_80af_23021ced20b6.slice/crio-e2ffcc6d3fd95546077fb44fb1e3c343cc2d26542c4814ad9647de8f5f81eb0a WatchSource:0}: Error finding container e2ffcc6d3fd95546077fb44fb1e3c343cc2d26542c4814ad9647de8f5f81eb0a: Status 404 returned error can't find the container with id e2ffcc6d3fd95546077fb44fb1e3c343cc2d26542c4814ad9647de8f5f81eb0a Mar 20 08:24:50 crc kubenswrapper[4903]: W0320 08:24:50.856740 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4f2f8d10_bf3a_48a4_9e71_2d3b5dc2743e.slice/crio-97c1a73e17c515e5cb87749372e03f69b3917129ebd9cb13de334353535e90b6 WatchSource:0}: Error finding container 97c1a73e17c515e5cb87749372e03f69b3917129ebd9cb13de334353535e90b6: Status 404 returned error can't find the container with id 97c1a73e17c515e5cb87749372e03f69b3917129ebd9cb13de334353535e90b6 Mar 20 08:24:50 crc kubenswrapper[4903]: W0320 08:24:50.863668 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0e67af70_4211_4077_8f2b_0a00b8069e5a.slice/crio-fe12c518bc804c01de3b3d28c9eff05bee6befb21035b085b2804fb6c436e479 WatchSource:0}: Error finding container fe12c518bc804c01de3b3d28c9eff05bee6befb21035b085b2804fb6c436e479: Status 404 returned error can't find the container with id fe12c518bc804c01de3b3d28c9eff05bee6befb21035b085b2804fb6c436e479 Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.865395 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.865443 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.865461 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.865490 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.865509 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T08:24:50Z","lastTransitionTime":"2026-03-20T08:24:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI 
configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 08:24:50 crc kubenswrapper[4903]: W0320 08:24:50.882539 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbc3a30d7_d1f1_491a_8e04_70eea2d35867.slice/crio-5bd4a372667d4baf9a4202b8e6558b22e4471ecea5b34375d6c6dced6837dc04 WatchSource:0}: Error finding container 5bd4a372667d4baf9a4202b8e6558b22e4471ecea5b34375d6c6dced6837dc04: Status 404 returned error can't find the container with id 5bd4a372667d4baf9a4202b8e6558b22e4471ecea5b34375d6c6dced6837dc04 Mar 20 08:24:50 crc kubenswrapper[4903]: W0320 08:24:50.910900 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod157214e8_fbfe_4e9d_98f4_02680437b8b2.slice/crio-00d95d19399b9744b191ea3eaf5558e95d28488a5dae61846ff9ca9ee4eaf4fc WatchSource:0}: Error finding container 00d95d19399b9744b191ea3eaf5558e95d28488a5dae61846ff9ca9ee4eaf4fc: Status 404 returned error can't find the container with id 00d95d19399b9744b191ea3eaf5558e95d28488a5dae61846ff9ca9ee4eaf4fc Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.968830 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.968869 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.968880 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.968901 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 08:24:50 crc kubenswrapper[4903]: I0320 08:24:50.968913 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T08:24:50Z","lastTransitionTime":"2026-03-20T08:24:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 08:24:51 crc kubenswrapper[4903]: I0320 08:24:51.021387 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m6k77" event={"ID":"157214e8-fbfe-4e9d-98f4-02680437b8b2","Type":"ContainerStarted","Data":"9f370eaf015ef59b0dc8b779f3532cb49e20dd28f4e8c95f7276722b4ed82362"} Mar 20 08:24:51 crc kubenswrapper[4903]: I0320 08:24:51.021475 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m6k77" event={"ID":"157214e8-fbfe-4e9d-98f4-02680437b8b2","Type":"ContainerStarted","Data":"00d95d19399b9744b191ea3eaf5558e95d28488a5dae61846ff9ca9ee4eaf4fc"} Mar 20 08:24:51 crc kubenswrapper[4903]: I0320 08:24:51.023100 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2ndsj" event={"ID":"0e67af70-4211-4077-8f2b-0a00b8069e5a","Type":"ContainerStarted","Data":"fe12c518bc804c01de3b3d28c9eff05bee6befb21035b085b2804fb6c436e479"} Mar 20 08:24:51 crc kubenswrapper[4903]: I0320 08:24:51.024842 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-mxv95" event={"ID":"bc3a30d7-d1f1-491a-8e04-70eea2d35867","Type":"ContainerStarted","Data":"5bd4a372667d4baf9a4202b8e6558b22e4471ecea5b34375d6c6dced6837dc04"} Mar 20 08:24:51 crc kubenswrapper[4903]: I0320 08:24:51.028604 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qp7cv" event={"ID":"867ac3a2-4567-43d3-80af-23021ced20b6","Type":"ContainerStarted","Data":"e2ffcc6d3fd95546077fb44fb1e3c343cc2d26542c4814ad9647de8f5f81eb0a"} Mar 20 08:24:51 crc kubenswrapper[4903]: I0320 08:24:51.030000 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"15979e65a347710fabf0d74766f9affac6e390da796139447d1042e0e55c57ea"} Mar 20 08:24:51 crc kubenswrapper[4903]: I0320 08:24:51.032944 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"81fcfbc938c8aa56218e828e73797f763b8100603f9623edc790db5eea704fde"} Mar 20 08:24:51 crc kubenswrapper[4903]: I0320 08:24:51.034463 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"1c15bc5da64bf1abd649a0c7b55d4923ffe47dfc8027301d49c8271b0c77d26b"} Mar 20 08:24:51 crc kubenswrapper[4903]: I0320 08:24:51.035746 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-55qrw" event={"ID":"1a0b2ecd-79af-4d88-ac59-3a08385882a1","Type":"ContainerStarted","Data":"147591a8834ed8d4ba8b91cff015d87dceeeb7a6482d2013fd138724e8005435"} Mar 20 08:24:51 crc kubenswrapper[4903]: I0320 08:24:51.037954 4903 scope.go:117] "RemoveContainer" containerID="f756aa5d647538ea78b646d483b3c6e7943baac8d492132d04743f5f99f4ddf4" Mar 20 08:24:51 crc kubenswrapper[4903]: E0320 08:24:51.038142 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" 
pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 08:24:51 crc kubenswrapper[4903]: I0320 08:24:51.038186 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-nzq6s" event={"ID":"4f2f8d10-bf3a-48a4-9e71-2d3b5dc2743e","Type":"ContainerStarted","Data":"97c1a73e17c515e5cb87749372e03f69b3917129ebd9cb13de334353535e90b6"} Mar 20 08:24:51 crc kubenswrapper[4903]: I0320 08:24:51.051743 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 08:24:51 crc kubenswrapper[4903]: I0320 08:24:51.065801 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:50Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 08:24:51 crc kubenswrapper[4903]: I0320 08:24:51.074292 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 08:24:51 crc kubenswrapper[4903]: I0320 08:24:51.074323 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 08:24:51 crc kubenswrapper[4903]: I0320 08:24:51.074331 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 08:24:51 crc kubenswrapper[4903]: I0320 08:24:51.074348 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 08:24:51 crc kubenswrapper[4903]: I0320 08:24:51.074357 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T08:24:51Z","lastTransitionTime":"2026-03-20T08:24:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 08:24:51 crc kubenswrapper[4903]: I0320 08:24:51.076595 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nzq6s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f2f8d10-bf3a-48a4-9e71-2d3b5dc2743e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc3018f573232eb4d02452fff1c433c9afac6c9c55d49063e1db014d696efef1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:24:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T08:24:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nzq6s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 08:24:51 crc kubenswrapper[4903]: I0320 08:24:51.085191 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2ndsj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0e67af70-4211-4077-8f2b-0a00b8069e5a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:50Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:50Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5rlvr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5rlvr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T08:24:50Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2ndsj\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 08:24:51 crc kubenswrapper[4903]: I0320 08:24:51.096159 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 08:24:51 crc kubenswrapper[4903]: I0320 08:24:51.105125 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 08:24:51 crc kubenswrapper[4903]: I0320 08:24:51.110050 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 08:24:51 crc kubenswrapper[4903]: I0320 08:24:51.110170 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 08:24:51 crc kubenswrapper[4903]: E0320 08:24:51.110208 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 08:24:52.1101793 +0000 UTC m=+117.327079625 (durationBeforeRetry 1s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 08:24:51 crc kubenswrapper[4903]: I0320 08:24:51.110259 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 08:24:51 crc kubenswrapper[4903]: E0320 08:24:51.110280 4903 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 08:24:51 crc kubenswrapper[4903]: E0320 08:24:51.110308 4903 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 08:24:51 crc kubenswrapper[4903]: E0320 08:24:51.110485 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 08:24:52.110470989 +0000 UTC m=+117.327371304 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 08:24:51 crc kubenswrapper[4903]: E0320 08:24:51.110518 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 08:24:52.11050854 +0000 UTC m=+117.327408845 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 08:24:51 crc kubenswrapper[4903]: I0320 08:24:51.115182 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:50Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 08:24:51 crc kubenswrapper[4903]: I0320 08:24:51.122720 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-mxv95" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc3a30d7-d1f1-491a-8e04-70eea2d35867\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:50Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:50Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hnpkw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T08:24:50Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-mxv95\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Mar 20 08:24:51 crc kubenswrapper[4903]: I0320 08:24:51.137323 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-55qrw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a0b2ecd-79af-4d88-ac59-3a08385882a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:50Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:50Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wx8vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T08:24:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-55qrw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 08:24:51 crc kubenswrapper[4903]: I0320 08:24:51.147149 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e53791c9-7f9f-4ce5-8c13-29786721b9e7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:22:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:22:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:22:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:22:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:22:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ef0faf48a64d1f9ab296076561f444dac491f6a937100dc745062799ac14533\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:22:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://84dc5fbce1c40b3a5ff4df4082324127ad8c9fb05387581a62eb218551dfdcda\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:22:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95ed5ac9613b849264d6577a5d37580c9b674adfe07c5d93b5a34251dab97a97\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:22:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f756aa5d647538ea78b646d483b3c6e7943baac8d492132d04743f5f99f4ddf4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\
\":\\\"cri-o://f756aa5d647538ea78b646d483b3c6e7943baac8d492132d04743f5f99f4ddf4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T08:24:47Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 08:24:47.109709 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 08:24:47.109817 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 08:24:47.110391 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2254287008/tls.crt::/tmp/serving-cert-2254287008/tls.key\\\\\\\"\\\\nI0320 08:24:47.892679 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 08:24:47.894329 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 08:24:47.894348 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 08:24:47.894369 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 08:24:47.894375 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 08:24:47.900498 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 08:24:47.900532 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 08:24:47.900537 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 08:24:47.900544 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 08:24:47.900554 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 08:24:47.900558 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 08:24:47.900562 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0320 08:24:47.900763 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0320 08:24:47.902854 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T08:24:46Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff47d5e9db1398c42b35dce1fbcca05073c8e28b5c7187174de7f355065ec374\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:22:58Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17769cf4064c962bbfd92f2b8e377ba2acb97a93410e58e3e9c07f6aabd1ac41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17769cf4064c962bbfd92f2b8e377ba2acb97a93410e58e3e9c07f6aabd1ac41\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T08:22:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T08:22:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T08:22:55Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 08:24:51 crc kubenswrapper[4903]: I0320 08:24:51.160389 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 08:24:51 crc kubenswrapper[4903]: I0320 08:24:51.173655 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qp7cv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"867ac3a2-4567-43d3-80af-23021ced20b6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:50Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2jpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2jpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2jpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-x2jpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2jpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2jpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2jpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T08:24:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qp7cv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 08:24:51 crc kubenswrapper[4903]: I0320 08:24:51.179207 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 08:24:51 crc kubenswrapper[4903]: I0320 08:24:51.179245 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 08:24:51 crc kubenswrapper[4903]: I0320 08:24:51.179255 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 08:24:51 crc 
kubenswrapper[4903]: I0320 08:24:51.179272 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 08:24:51 crc kubenswrapper[4903]: I0320 08:24:51.179284 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T08:24:51Z","lastTransitionTime":"2026-03-20T08:24:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 08:24:51 crc kubenswrapper[4903]: I0320 08:24:51.189088 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m6k77" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"157214e8-fbfe-4e9d-98f4-02680437b8b2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:50Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:50Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:50Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9z897\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9z897\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9z897\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-9z897\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9z897\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9z897\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9z897\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9z897\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9z897\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T08:24:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-m6k77\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 08:24:51 crc kubenswrapper[4903]: I0320 08:24:51.211180 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 08:24:51 crc kubenswrapper[4903]: E0320 08:24:51.211218 4903 
projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 08:24:51 crc kubenswrapper[4903]: E0320 08:24:51.211365 4903 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 08:24:51 crc kubenswrapper[4903]: E0320 08:24:51.211378 4903 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 08:24:51 crc kubenswrapper[4903]: E0320 08:24:51.211426 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 08:24:52.211411488 +0000 UTC m=+117.428311793 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 08:24:51 crc kubenswrapper[4903]: I0320 08:24:51.211448 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 08:24:51 crc kubenswrapper[4903]: E0320 08:24:51.211716 4903 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 08:24:51 crc kubenswrapper[4903]: E0320 08:24:51.211762 4903 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 08:24:51 crc kubenswrapper[4903]: E0320 08:24:51.211781 4903 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 08:24:51 crc kubenswrapper[4903]: E0320 08:24:51.211881 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 08:24:52.21185146 +0000 UTC m=+117.428751965 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 08:24:51 crc kubenswrapper[4903]: I0320 08:24:51.282857 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 08:24:51 crc kubenswrapper[4903]: I0320 08:24:51.282919 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 08:24:51 crc kubenswrapper[4903]: I0320 08:24:51.282933 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 08:24:51 crc kubenswrapper[4903]: I0320 08:24:51.282954 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 08:24:51 crc kubenswrapper[4903]: I0320 08:24:51.282966 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T08:24:51Z","lastTransitionTime":"2026-03-20T08:24:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 08:24:51 crc kubenswrapper[4903]: I0320 08:24:51.386722 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 08:24:51 crc kubenswrapper[4903]: I0320 08:24:51.386784 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 08:24:51 crc kubenswrapper[4903]: I0320 08:24:51.386795 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 08:24:51 crc kubenswrapper[4903]: I0320 08:24:51.386812 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 08:24:51 crc kubenswrapper[4903]: I0320 08:24:51.386822 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T08:24:51Z","lastTransitionTime":"2026-03-20T08:24:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 08:24:51 crc kubenswrapper[4903]: I0320 08:24:51.468234 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pz8bq"] Mar 20 08:24:51 crc kubenswrapper[4903]: I0320 08:24:51.468778 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pz8bq" Mar 20 08:24:51 crc kubenswrapper[4903]: I0320 08:24:51.470686 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Mar 20 08:24:51 crc kubenswrapper[4903]: I0320 08:24:51.471479 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 20 08:24:51 crc kubenswrapper[4903]: I0320 08:24:51.479694 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:50Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 08:24:51 crc kubenswrapper[4903]: I0320 08:24:51.489534 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 08:24:51 crc kubenswrapper[4903]: I0320 08:24:51.489598 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 08:24:51 crc kubenswrapper[4903]: I0320 08:24:51.489611 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 08:24:51 crc kubenswrapper[4903]: I0320 08:24:51.489632 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 08:24:51 crc kubenswrapper[4903]: I0320 08:24:51.489649 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T08:24:51Z","lastTransitionTime":"2026-03-20T08:24:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 08:24:51 crc kubenswrapper[4903]: I0320 08:24:51.492980 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qp7cv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"867ac3a2-4567-43d3-80af-23021ced20b6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:50Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2jpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2jpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\
\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2jpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2jpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2jpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2jpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2jpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T08:24:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qp7cv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 08:24:51 crc kubenswrapper[4903]: I0320 08:24:51.495000 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Mar 20 08:24:51 crc kubenswrapper[4903]: I0320 08:24:51.495831 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Mar 20 08:24:51 crc kubenswrapper[4903]: I0320 08:24:51.497825 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Mar 20 08:24:51 crc kubenswrapper[4903]: I0320 08:24:51.498763 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Mar 20 08:24:51 crc kubenswrapper[4903]: I0320 08:24:51.500203 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Mar 20 08:24:51 crc kubenswrapper[4903]: I0320 08:24:51.500871 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Mar 20 08:24:51 crc kubenswrapper[4903]: I0320 08:24:51.501684 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Mar 20 08:24:51 crc kubenswrapper[4903]: I0320 08:24:51.503260 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Mar 20 08:24:51 crc kubenswrapper[4903]: I0320 08:24:51.504201 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Mar 20 08:24:51 crc kubenswrapper[4903]: I0320 08:24:51.506201 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Mar 20 08:24:51 crc kubenswrapper[4903]: I0320 08:24:51.507336 4903 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Mar 20 08:24:51 crc kubenswrapper[4903]: I0320 08:24:51.509970 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Mar 20 08:24:51 crc kubenswrapper[4903]: I0320 08:24:51.511075 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Mar 20 08:24:51 crc kubenswrapper[4903]: I0320 08:24:51.512255 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Mar 20 08:24:51 crc kubenswrapper[4903]: I0320 08:24:51.513775 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Mar 20 08:24:51 crc kubenswrapper[4903]: I0320 08:24:51.514328 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/076f7b64-7577-4f40-87a2-cd5dfab6b688-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-pz8bq\" (UID: \"076f7b64-7577-4f40-87a2-cd5dfab6b688\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pz8bq" Mar 20 08:24:51 crc kubenswrapper[4903]: I0320 08:24:51.514375 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Mar 20 08:24:51 crc kubenswrapper[4903]: I0320 08:24:51.514401 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/076f7b64-7577-4f40-87a2-cd5dfab6b688-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-pz8bq\" (UID: \"076f7b64-7577-4f40-87a2-cd5dfab6b688\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pz8bq" Mar 20 08:24:51 crc kubenswrapper[4903]: I0320 08:24:51.514426 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/076f7b64-7577-4f40-87a2-cd5dfab6b688-env-overrides\") pod \"ovnkube-control-plane-749d76644c-pz8bq\" (UID: \"076f7b64-7577-4f40-87a2-cd5dfab6b688\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pz8bq" Mar 20 08:24:51 crc kubenswrapper[4903]: I0320 08:24:51.514446 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l65hw\" (UniqueName: \"kubernetes.io/projected/076f7b64-7577-4f40-87a2-cd5dfab6b688-kube-api-access-l65hw\") pod \"ovnkube-control-plane-749d76644c-pz8bq\" (UID: \"076f7b64-7577-4f40-87a2-cd5dfab6b688\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pz8bq" Mar 20 08:24:51 crc kubenswrapper[4903]: I0320 08:24:51.514317 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m6k77" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"157214e8-fbfe-4e9d-98f4-02680437b8b2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:50Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:50Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:50Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9z897\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9z897\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\
\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9z897\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9z897\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9z897\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9z897\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9z897\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9z897\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9z897\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T08:24:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-m6k77\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 08:24:51 crc kubenswrapper[4903]: I0320 08:24:51.515672 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Mar 20 08:24:51 crc kubenswrapper[4903]: I0320 08:24:51.516155 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Mar 20 08:24:51 crc kubenswrapper[4903]: I0320 08:24:51.516916 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Mar 20 08:24:51 crc kubenswrapper[4903]: I0320 08:24:51.518162 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Mar 20 08:24:51 crc kubenswrapper[4903]: I0320 08:24:51.518660 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Mar 20 08:24:51 crc kubenswrapper[4903]: I0320 08:24:51.519649 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Mar 20 08:24:51 crc kubenswrapper[4903]: I0320 08:24:51.520129 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Mar 20 08:24:51 crc kubenswrapper[4903]: I0320 08:24:51.521351 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Mar 20 08:24:51 crc kubenswrapper[4903]: I0320 08:24:51.521813 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Mar 20 08:24:51 crc kubenswrapper[4903]: I0320 08:24:51.522457 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Mar 20 08:24:51 crc kubenswrapper[4903]: I0320 08:24:51.523557 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" 
path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Mar 20 08:24:51 crc kubenswrapper[4903]: I0320 08:24:51.524056 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Mar 20 08:24:51 crc kubenswrapper[4903]: I0320 08:24:51.524972 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Mar 20 08:24:51 crc kubenswrapper[4903]: I0320 08:24:51.525601 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Mar 20 08:24:51 crc kubenswrapper[4903]: I0320 08:24:51.526595 4903 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Mar 20 08:24:51 crc kubenswrapper[4903]: I0320 08:24:51.526724 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Mar 20 08:24:51 crc kubenswrapper[4903]: I0320 08:24:51.528617 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Mar 20 08:24:51 crc kubenswrapper[4903]: I0320 08:24:51.529826 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Mar 20 08:24:51 crc kubenswrapper[4903]: I0320 08:24:51.530282 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Mar 20 08:24:51 crc kubenswrapper[4903]: I0320 08:24:51.532023 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Mar 20 08:24:51 crc kubenswrapper[4903]: I0320 08:24:51.532698 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Mar 20 08:24:51 crc kubenswrapper[4903]: I0320 08:24:51.533756 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Mar 20 08:24:51 crc kubenswrapper[4903]: I0320 08:24:51.534520 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Mar 20 08:24:51 crc kubenswrapper[4903]: I0320 08:24:51.535585 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Mar 20 08:24:51 crc kubenswrapper[4903]: I0320 08:24:51.535656 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e53791c9-7f9f-4ce5-8c13-29786721b9e7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:22:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:22:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:22:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:22:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:22:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ef0faf48a64d1f9ab296076561f444dac491f6a937100dc745062799ac14533\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:22:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://84dc5fbce1c40b3a5ff4df4082324127ad8c9fb05387581a62eb218551dfdcda\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:22:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95ed5ac9613b849264d6577a5d37580c9b674adfe07c5d93b5a34251dab97a97\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:22:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":
\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f756aa5d647538ea78b646d483b3c6e7943baac8d492132d04743f5f99f4ddf4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f756aa5d647538ea78b646d483b3c6e7943baac8d492132d04743f5f99f4ddf4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T08:24:47Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 08:24:47.109709 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 08:24:47.109817 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 08:24:47.110391 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2254287008/tls.crt::/tmp/serving-cert-2254287008/tls.key\\\\\\\"\\\\nI0320 08:24:47.892679 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 08:24:47.894329 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 08:24:47.894348 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 08:24:47.894369 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 08:24:47.894375 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 08:24:47.900498 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 08:24:47.900532 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 08:24:47.900537 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 08:24:47.900544 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 08:24:47.900554 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 08:24:47.900558 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 08:24:47.900562 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0320 08:24:47.900763 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0320 08:24:47.902854 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T08:24:46Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff47d5e9db1398c42b35dce1fbcca05073c8e28b5c7187174de7f355065ec374\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:22:58Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17769cf4064c962bbfd92f2b8e377ba2acb97a93410e58e3e9c07f6aabd1ac41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17769cf4064c962bbfd92f2b8e377ba2acb97a93410e58e3e9c07f6aabd1ac41\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T08:22:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T08:22:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T08:22:55Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 08:24:51 crc kubenswrapper[4903]: I0320 08:24:51.536089 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Mar 20 08:24:51 crc kubenswrapper[4903]: I0320 08:24:51.537337 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Mar 20 08:24:51 crc kubenswrapper[4903]: I0320 08:24:51.538010 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Mar 20 08:24:51 crc kubenswrapper[4903]: I0320 08:24:51.539065 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Mar 20 08:24:51 crc kubenswrapper[4903]: I0320 08:24:51.539529 4903 kubelet_volumes.go:163] "Cleaned up orphaned 
pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Mar 20 08:24:51 crc kubenswrapper[4903]: I0320 08:24:51.540425 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Mar 20 08:24:51 crc kubenswrapper[4903]: I0320 08:24:51.540996 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Mar 20 08:24:51 crc kubenswrapper[4903]: I0320 08:24:51.542150 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Mar 20 08:24:51 crc kubenswrapper[4903]: I0320 08:24:51.542612 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Mar 20 08:24:51 crc kubenswrapper[4903]: I0320 08:24:51.543533 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Mar 20 08:24:51 crc kubenswrapper[4903]: I0320 08:24:51.543983 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Mar 20 08:24:51 crc kubenswrapper[4903]: I0320 08:24:51.544871 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Mar 20 08:24:51 crc kubenswrapper[4903]: I0320 08:24:51.545714 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Mar 20 08:24:51 crc kubenswrapper[4903]: I0320 08:24:51.546368 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Mar 20 08:24:51 crc kubenswrapper[4903]: I0320 08:24:51.552183 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nzq6s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f2f8d10-bf3a-48a4-9e71-2d3b5dc2743e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc3018f573232eb4d02452fff1c433c9afac6c9c55d49063e1db014d696efef1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:24:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T08:24:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nzq6s\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 08:24:51 crc kubenswrapper[4903]: I0320 08:24:51.562060 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2ndsj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0e67af70-4211-4077-8f2b-0a00b8069e5a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:50Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:50Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5rlvr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5rlvr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T08:24:50Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2ndsj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 
08:24:51 crc kubenswrapper[4903]: I0320 08:24:51.577439 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pz8bq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"076f7b64-7577-4f40-87a2-cd5dfab6b688\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l65hw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l65hw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T08:24:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-pz8bq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 08:24:51 crc kubenswrapper[4903]: I0320 08:24:51.590400 4903 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 08:24:51 crc kubenswrapper[4903]: I0320 08:24:51.592879 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 08:24:51 crc kubenswrapper[4903]: I0320 08:24:51.592921 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 08:24:51 crc kubenswrapper[4903]: I0320 08:24:51.592953 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 08:24:51 crc kubenswrapper[4903]: I0320 08:24:51.592970 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 08:24:51 crc kubenswrapper[4903]: I0320 08:24:51.592980 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T08:24:51Z","lastTransitionTime":"2026-03-20T08:24:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 08:24:51 crc kubenswrapper[4903]: I0320 08:24:51.601830 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:50Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 08:24:51 crc kubenswrapper[4903]: I0320 08:24:51.611893 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 08:24:51 crc kubenswrapper[4903]: I0320 08:24:51.615545 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/076f7b64-7577-4f40-87a2-cd5dfab6b688-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-pz8bq\" (UID: \"076f7b64-7577-4f40-87a2-cd5dfab6b688\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pz8bq" Mar 20 08:24:51 crc kubenswrapper[4903]: I0320 08:24:51.615657 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/076f7b64-7577-4f40-87a2-cd5dfab6b688-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-pz8bq\" (UID: \"076f7b64-7577-4f40-87a2-cd5dfab6b688\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pz8bq" Mar 20 08:24:51 crc kubenswrapper[4903]: I0320 08:24:51.615691 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/076f7b64-7577-4f40-87a2-cd5dfab6b688-env-overrides\") pod \"ovnkube-control-plane-749d76644c-pz8bq\" (UID: \"076f7b64-7577-4f40-87a2-cd5dfab6b688\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pz8bq" Mar 20 08:24:51 crc kubenswrapper[4903]: I0320 08:24:51.615711 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l65hw\" (UniqueName: \"kubernetes.io/projected/076f7b64-7577-4f40-87a2-cd5dfab6b688-kube-api-access-l65hw\") pod \"ovnkube-control-plane-749d76644c-pz8bq\" (UID: \"076f7b64-7577-4f40-87a2-cd5dfab6b688\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pz8bq" Mar 20 08:24:51 crc kubenswrapper[4903]: I0320 08:24:51.616677 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/076f7b64-7577-4f40-87a2-cd5dfab6b688-env-overrides\") pod \"ovnkube-control-plane-749d76644c-pz8bq\" (UID: 
\"076f7b64-7577-4f40-87a2-cd5dfab6b688\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pz8bq" Mar 20 08:24:51 crc kubenswrapper[4903]: I0320 08:24:51.616715 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/076f7b64-7577-4f40-87a2-cd5dfab6b688-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-pz8bq\" (UID: \"076f7b64-7577-4f40-87a2-cd5dfab6b688\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pz8bq" Mar 20 08:24:51 crc kubenswrapper[4903]: I0320 08:24:51.622780 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:50Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 08:24:51 crc kubenswrapper[4903]: I0320 08:24:51.623349 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/076f7b64-7577-4f40-87a2-cd5dfab6b688-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-pz8bq\" (UID: \"076f7b64-7577-4f40-87a2-cd5dfab6b688\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pz8bq" Mar 20 08:24:51 crc kubenswrapper[4903]: I0320 08:24:51.630373 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-mxv95" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc3a30d7-d1f1-491a-8e04-70eea2d35867\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:50Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hnpkw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T08:24:50Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-mxv95\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 08:24:51 crc kubenswrapper[4903]: I0320 08:24:51.632674 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l65hw\" (UniqueName: \"kubernetes.io/projected/076f7b64-7577-4f40-87a2-cd5dfab6b688-kube-api-access-l65hw\") pod \"ovnkube-control-plane-749d76644c-pz8bq\" (UID: \"076f7b64-7577-4f40-87a2-cd5dfab6b688\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pz8bq" Mar 20 08:24:51 crc kubenswrapper[4903]: I0320 08:24:51.640431 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 08:24:51 crc kubenswrapper[4903]: I0320 08:24:51.649571 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-55qrw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a0b2ecd-79af-4d88-ac59-3a08385882a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:50Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:50Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wx8vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T08:24:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-55qrw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 08:24:51 crc kubenswrapper[4903]: I0320 08:24:51.696532 4903 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 08:24:51 crc kubenswrapper[4903]: I0320 08:24:51.696581 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 08:24:51 crc kubenswrapper[4903]: I0320 08:24:51.696599 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 08:24:51 crc kubenswrapper[4903]: I0320 08:24:51.696626 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 08:24:51 crc kubenswrapper[4903]: I0320 08:24:51.696644 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T08:24:51Z","lastTransitionTime":"2026-03-20T08:24:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 08:24:51 crc kubenswrapper[4903]: I0320 08:24:51.803706 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 08:24:51 crc kubenswrapper[4903]: I0320 08:24:51.803752 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 08:24:51 crc kubenswrapper[4903]: I0320 08:24:51.803763 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 08:24:51 crc kubenswrapper[4903]: I0320 08:24:51.803779 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 08:24:51 crc kubenswrapper[4903]: I0320 08:24:51.803789 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T08:24:51Z","lastTransitionTime":"2026-03-20T08:24:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 08:24:51 crc kubenswrapper[4903]: I0320 08:24:51.809073 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pz8bq" Mar 20 08:24:51 crc kubenswrapper[4903]: I0320 08:24:51.906355 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 08:24:51 crc kubenswrapper[4903]: I0320 08:24:51.906395 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 08:24:51 crc kubenswrapper[4903]: I0320 08:24:51.906407 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 08:24:51 crc kubenswrapper[4903]: I0320 08:24:51.906424 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 08:24:51 crc kubenswrapper[4903]: I0320 08:24:51.906437 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T08:24:51Z","lastTransitionTime":"2026-03-20T08:24:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 08:24:52 crc kubenswrapper[4903]: I0320 08:24:52.009335 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 08:24:52 crc kubenswrapper[4903]: I0320 08:24:52.009374 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 08:24:52 crc kubenswrapper[4903]: I0320 08:24:52.009383 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 08:24:52 crc kubenswrapper[4903]: I0320 08:24:52.009398 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 08:24:52 crc kubenswrapper[4903]: I0320 08:24:52.009409 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T08:24:52Z","lastTransitionTime":"2026-03-20T08:24:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 08:24:52 crc kubenswrapper[4903]: I0320 08:24:52.049424 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-mxv95" event={"ID":"bc3a30d7-d1f1-491a-8e04-70eea2d35867","Type":"ContainerStarted","Data":"a416843a351842e8a9455e5f9c527a4a6872adedbc64f2804004bb832dcaa21b"} Mar 20 08:24:52 crc kubenswrapper[4903]: I0320 08:24:52.055529 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"d8352a1159b3b3c2acc0b0306b5d5fab2b5516e3464d05ac1e3842d615eb0b80"} Mar 20 08:24:52 crc kubenswrapper[4903]: I0320 08:24:52.055594 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"11423302a2f0ddfe4b87e1f95bce53b3d3922f9508b7aae57fe75839eb4bdecf"} Mar 20 08:24:52 crc kubenswrapper[4903]: I0320 08:24:52.058702 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"da47a881a7498cb8b452217bd1fac181623c56a1e93d48ac883127eacdd3b491"} Mar 20 08:24:52 crc kubenswrapper[4903]: I0320 08:24:52.060150 4903 generic.go:334] "Generic (PLEG): container finished" podID="157214e8-fbfe-4e9d-98f4-02680437b8b2" containerID="9f370eaf015ef59b0dc8b779f3532cb49e20dd28f4e8c95f7276722b4ed82362" exitCode=0 Mar 20 08:24:52 crc kubenswrapper[4903]: I0320 08:24:52.060220 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m6k77" event={"ID":"157214e8-fbfe-4e9d-98f4-02680437b8b2","Type":"ContainerDied","Data":"9f370eaf015ef59b0dc8b779f3532cb49e20dd28f4e8c95f7276722b4ed82362"} Mar 20 08:24:52 crc kubenswrapper[4903]: I0320 08:24:52.063975 4903 generic.go:334] "Generic (PLEG): container finished" podID="867ac3a2-4567-43d3-80af-23021ced20b6" containerID="714ca644c6c9cdee3b77644396aed5a9f7a25ba37e4f3560a926aa97806f96bd" exitCode=0 Mar 20 08:24:52 crc kubenswrapper[4903]: I0320 08:24:52.064292 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qp7cv" event={"ID":"867ac3a2-4567-43d3-80af-23021ced20b6","Type":"ContainerDied","Data":"714ca644c6c9cdee3b77644396aed5a9f7a25ba37e4f3560a926aa97806f96bd"} Mar 20 08:24:52 crc kubenswrapper[4903]: I0320 08:24:52.069112 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-nzq6s" event={"ID":"4f2f8d10-bf3a-48a4-9e71-2d3b5dc2743e","Type":"ContainerStarted","Data":"dc3018f573232eb4d02452fff1c433c9afac6c9c55d49063e1db014d696efef1"} Mar 20 08:24:52 crc kubenswrapper[4903]: I0320 08:24:52.070394 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready 
status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T08:24:52Z is after 2025-08-24T17:21:41Z" Mar 20 08:24:52 crc kubenswrapper[4903]: I0320 08:24:52.072889 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pz8bq" event={"ID":"076f7b64-7577-4f40-87a2-cd5dfab6b688","Type":"ContainerStarted","Data":"3c20092bc87bb3a4c92736828ca71de1e59ff049b400c2e1732bd3f855a1db4c"} Mar 20 08:24:52 crc kubenswrapper[4903]: I0320 08:24:52.072923 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pz8bq" event={"ID":"076f7b64-7577-4f40-87a2-cd5dfab6b688","Type":"ContainerStarted","Data":"8fd8862f2163a9eaf957abf0d3530e322740bead3824f41d1a7ec755fe96e6ff"} Mar 20 08:24:52 crc kubenswrapper[4903]: I0320 08:24:52.076680 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2ndsj" event={"ID":"0e67af70-4211-4077-8f2b-0a00b8069e5a","Type":"ContainerStarted","Data":"97230b86d8abf05de23db14ac7a3f5d775800a1072bcb8f41fc0bb22c84b0942"} Mar 20 08:24:52 crc kubenswrapper[4903]: I0320 08:24:52.076725 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2ndsj" event={"ID":"0e67af70-4211-4077-8f2b-0a00b8069e5a","Type":"ContainerStarted","Data":"4916eb9b822c64a41b67853a8700177499bc2494a9e800bc0de83c6d8bb51ba2"} Mar 20 08:24:52 crc kubenswrapper[4903]: I0320 08:24:52.079711 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-55qrw" event={"ID":"1a0b2ecd-79af-4d88-ac59-3a08385882a1","Type":"ContainerStarted","Data":"69a817710496023d3187d1a90a58586037754154eeffecef97e6a0afc4735a05"} Mar 20 08:24:52 crc kubenswrapper[4903]: I0320 08:24:52.085225 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:50Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T08:24:52Z is after 2025-08-24T17:21:41Z" Mar 20 08:24:52 crc kubenswrapper[4903]: I0320 08:24:52.106832 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:50Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T08:24:52Z is after 2025-08-24T17:21:41Z" Mar 20 08:24:52 crc kubenswrapper[4903]: I0320 08:24:52.113705 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 08:24:52 crc kubenswrapper[4903]: I0320 08:24:52.113747 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 08:24:52 crc kubenswrapper[4903]: I0320 08:24:52.113759 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 08:24:52 crc kubenswrapper[4903]: I0320 08:24:52.113779 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 08:24:52 crc kubenswrapper[4903]: I0320 08:24:52.113795 4903 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T08:24:52Z","lastTransitionTime":"2026-03-20T08:24:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 08:24:52 crc kubenswrapper[4903]: I0320 08:24:52.121248 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 08:24:52 crc kubenswrapper[4903]: I0320 08:24:52.121568 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 08:24:52 crc kubenswrapper[4903]: E0320 08:24:52.121605 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 08:24:54.121560731 +0000 UTC m=+119.338461086 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 08:24:52 crc kubenswrapper[4903]: I0320 08:24:52.121867 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 08:24:52 crc kubenswrapper[4903]: I0320 08:24:52.123590 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-mxv95" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc3a30d7-d1f1-491a-8e04-70eea2d35867\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a416843a351842e8a9455e5f9c527a4a6872adedbc64f2804004bb832dcaa21b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:24:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hnpkw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T08:24:50Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-mxv95\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T08:24:52Z is after 2025-08-24T17:21:41Z" Mar 20 08:24:52 crc kubenswrapper[4903]: E0320 08:24:52.123820 4903 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 08:24:52 crc kubenswrapper[4903]: E0320 08:24:52.123887 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 08:24:54.123865941 +0000 UTC m=+119.340766476 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 08:24:52 crc kubenswrapper[4903]: E0320 08:24:52.125066 4903 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 08:24:52 crc kubenswrapper[4903]: E0320 08:24:52.125130 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 08:24:54.125118014 +0000 UTC m=+119.342018519 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 08:24:52 crc kubenswrapper[4903]: I0320 08:24:52.145382 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-55qrw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a0b2ecd-79af-4d88-ac59-3a08385882a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:50Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wx8vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T08:24:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-55qrw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T08:24:52Z is after 2025-08-24T17:21:41Z" Mar 20 08:24:52 crc kubenswrapper[4903]: I0320 08:24:52.166382 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:50Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T08:24:52Z is after 2025-08-24T17:21:41Z" Mar 20 08:24:52 crc kubenswrapper[4903]: I0320 08:24:52.183260 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qp7cv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"867ac3a2-4567-43d3-80af-23021ced20b6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:50Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2jpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2jpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2jpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-x2jpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2jpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2jpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2jpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T08:24:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qp7cv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T08:24:52Z is after 2025-08-24T17:21:41Z" Mar 20 08:24:52 crc kubenswrapper[4903]: I0320 08:24:52.206946 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m6k77" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"157214e8-fbfe-4e9d-98f4-02680437b8b2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:50Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:50Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:50Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9z897\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9z897\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\
\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9z897\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9z897\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9z897\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9z897\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9z897\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9z897\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9z897\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T08:24:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-m6k77\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T08:24:52Z is after 2025-08-24T17:21:41Z" Mar 20 08:24:52 crc kubenswrapper[4903]: I0320 08:24:52.216796 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 08:24:52 crc kubenswrapper[4903]: I0320 08:24:52.216834 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 08:24:52 crc kubenswrapper[4903]: I0320 08:24:52.216843 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 08:24:52 crc kubenswrapper[4903]: I0320 08:24:52.216859 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 08:24:52 crc kubenswrapper[4903]: I0320 08:24:52.216870 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T08:24:52Z","lastTransitionTime":"2026-03-20T08:24:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 08:24:52 crc kubenswrapper[4903]: I0320 08:24:52.221608 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-9wqdz"] Mar 20 08:24:52 crc kubenswrapper[4903]: I0320 08:24:52.222297 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9wqdz" Mar 20 08:24:52 crc kubenswrapper[4903]: E0320 08:24:52.222365 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-9wqdz" podUID="63d81fb4-627a-4062-b46b-bc0df9489a15" Mar 20 08:24:52 crc kubenswrapper[4903]: I0320 08:24:52.224720 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 08:24:52 crc kubenswrapper[4903]: I0320 08:24:52.224806 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 08:24:52 crc kubenswrapper[4903]: E0320 08:24:52.225058 4903 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 08:24:52 crc kubenswrapper[4903]: E0320 08:24:52.225086 4903 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 08:24:52 crc kubenswrapper[4903]: E0320 08:24:52.225105 4903 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 08:24:52 crc kubenswrapper[4903]: E0320 08:24:52.225186 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 08:24:54.225158831 +0000 UTC m=+119.442059166 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 08:24:52 crc kubenswrapper[4903]: E0320 08:24:52.225677 4903 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 08:24:52 crc kubenswrapper[4903]: E0320 08:24:52.225696 4903 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 08:24:52 crc kubenswrapper[4903]: E0320 08:24:52.225706 4903 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 08:24:52 crc kubenswrapper[4903]: E0320 08:24:52.225741 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 08:24:54.225728465 +0000 UTC m=+119.442628790 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 08:24:52 crc kubenswrapper[4903]: I0320 08:24:52.226621 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e53791c9-7f9f-4ce5-8c13-29786721b9e7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:22:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:22:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:22:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:22:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:22:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ef0faf48a64d1f9ab296076561f444dac491f6a937100dc745062799ac14533\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:22:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://84dc5fbce1c40b3a5ff4df4082324127ad8c9fb05387581a62eb218551dfdcda\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:22:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95ed5ac9613b849264d6577a5d37580c9b674adfe07c5d93b5a34251dab97a97\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:22:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f756aa5d647538ea78b646d483b3c6e7943baac8d492132d04743f5f99f4ddf4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f756aa5d647538ea78b646d483b3c6e7943baac8d492132d04743f5f99f4ddf4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T08:24:47Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 08:24:47.109709 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 08:24:47.109817 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 08:24:47.110391 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2254287008/tls.crt::/tmp/serving-cert-2254287008/tls.key\\\\\\\"\\\\nI0320 08:24:47.892679 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 08:24:47.894329 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 08:24:47.894348 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 08:24:47.894369 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 08:24:47.894375 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 08:24:47.900498 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 08:24:47.900532 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 08:24:47.900537 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 08:24:47.900544 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 08:24:47.900554 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 08:24:47.900558 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 08:24:47.900562 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0320 08:24:47.900763 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0320 08:24:47.902854 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T08:24:46Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff47d5e9db1398c42b35dce1fbcca05073c8e28b5c7187174de7f355065ec374\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:22:58Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17769cf4064c962bbfd92f2b8e377ba2acb97a93410e58e3e9c07f6aabd1ac41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17769cf4064c962bbfd92f2b8e377ba2acb97a93410e58e3e9c07f6aabd1ac41\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T08:22:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T08:22:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T08:22:55Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T08:24:52Z is after 2025-08-24T17:21:41Z" Mar 20 08:24:52 crc kubenswrapper[4903]: I0320 08:24:52.242601 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T08:24:52Z is after 2025-08-24T17:21:41Z" Mar 20 08:24:52 crc kubenswrapper[4903]: I0320 08:24:52.256677 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nzq6s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f2f8d10-bf3a-48a4-9e71-2d3b5dc2743e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc3018f573232eb4d02452fff1c433c9afac6c9c55d49063e1db014d696efef1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:24:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cn
i/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T08:24:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nzq6s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T08:24:52Z is after 2025-08-24T17:21:41Z" Mar 20 08:24:52 crc kubenswrapper[4903]: I0320 08:24:52.268308 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2ndsj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0e67af70-4211-4077-8f2b-0a00b8069e5a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:50Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:50Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5rlvr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5rlvr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T08:24:50Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2ndsj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T08:24:52Z is after 2025-08-24T17:21:41Z" Mar 20 08:24:52 crc kubenswrapper[4903]: I0320 08:24:52.280479 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pz8bq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"076f7b64-7577-4f40-87a2-cd5dfab6b688\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l65hw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l65hw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T08:24:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-pz8bq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T08:24:52Z is after 2025-08-24T17:21:41Z" Mar 20 08:24:52 crc kubenswrapper[4903]: I0320 08:24:52.299129 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T08:24:52Z is after 2025-08-24T17:21:41Z" Mar 20 08:24:52 crc kubenswrapper[4903]: I0320 08:24:52.310818 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-55qrw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a0b2ecd-79af-4d88-ac59-3a08385882a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69a817710496023d3187d1a90a58586037754154eeffecef97e6a0afc4735a05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:24:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wx8vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T08:24:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-55qrw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T08:24:52Z is after 2025-08-24T17:21:41Z" Mar 20 08:24:52 crc kubenswrapper[4903]: I0320 08:24:52.320669 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 08:24:52 crc kubenswrapper[4903]: I0320 08:24:52.320710 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 08:24:52 crc kubenswrapper[4903]: I0320 08:24:52.320721 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 08:24:52 crc kubenswrapper[4903]: I0320 08:24:52.320740 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 08:24:52 crc kubenswrapper[4903]: I0320 08:24:52.320753 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T08:24:52Z","lastTransitionTime":"2026-03-20T08:24:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 08:24:52 crc kubenswrapper[4903]: I0320 08:24:52.326168 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4dlrn\" (UniqueName: \"kubernetes.io/projected/63d81fb4-627a-4062-b46b-bc0df9489a15-kube-api-access-4dlrn\") pod \"network-metrics-daemon-9wqdz\" (UID: \"63d81fb4-627a-4062-b46b-bc0df9489a15\") " pod="openshift-multus/network-metrics-daemon-9wqdz" Mar 20 08:24:52 crc kubenswrapper[4903]: I0320 08:24:52.326242 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/63d81fb4-627a-4062-b46b-bc0df9489a15-metrics-certs\") pod \"network-metrics-daemon-9wqdz\" (UID: \"63d81fb4-627a-4062-b46b-bc0df9489a15\") " pod="openshift-multus/network-metrics-daemon-9wqdz" Mar 20 08:24:52 crc kubenswrapper[4903]: I0320 08:24:52.326898 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e53791c9-7f9f-4ce5-8c13-29786721b9e7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:22:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:22:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:22:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:22:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:22:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ef0faf48a64d1f9ab296076561f444dac491f6a937100dc745062799ac14533\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:22:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://84dc5fbce1c40b3a5ff4df4082324127ad8c9fb05387581a62eb218551dfdcda\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:22:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95ed5ac9613b849264d6577a5d37580c9b674adfe07c5d93b5a34251dab97a97\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:22:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f756aa5d647538ea78b646d483b3c6e7943baac8d492132d04743f5f99f4ddf4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f756aa5d647538ea78b646d483b3c6e7943baac8d492132d04743f5f99f4ddf4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T08:24:47Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 08:24:47.109709 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 08:24:47.109817 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 08:24:47.110391 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2254287008/tls.crt::/tmp/serving-cert-2254287008/tls.key\\\\\\\"\\\\nI0320 08:24:47.892679 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 08:24:47.894329 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 08:24:47.894348 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 08:24:47.894369 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 08:24:47.894375 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 08:24:47.900498 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 08:24:47.900532 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 08:24:47.900537 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 08:24:47.900544 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 08:24:47.900554 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 08:24:47.900558 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 08:24:47.900562 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0320 08:24:47.900763 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0320 08:24:47.902854 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T08:24:46Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff47d5e9db1398c42b35dce1fbcca05073c8e28b5c7187174de7f355065ec374\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:22:58Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17769cf4064c962bbfd92f2b8e377ba2acb97a93410e58e3e9c07f6aabd1ac41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17769cf4064c962bbfd92f2b8e377ba2acb97a93410e58e3e9c07f6aabd1ac41\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T08:22:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T08:22:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T08:22:55Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T08:24:52Z is after 2025-08-24T17:21:41Z" Mar 20 08:24:52 crc kubenswrapper[4903]: I0320 08:24:52.343013 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T08:24:52Z is after 2025-08-24T17:21:41Z" Mar 20 08:24:52 crc kubenswrapper[4903]: I0320 08:24:52.360331 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qp7cv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"867ac3a2-4567-43d3-80af-23021ced20b6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:50Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2jpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://714ca644c6c9cdee3b77644396aed5a9f7a25ba37e4f3560a926aa97806f96bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://714ca644c6c9cdee3b77644396aed5a9f7a25ba37e4f3560a926aa97806f96bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T08:24:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T08:24:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2jpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2jpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reaso
n\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2jpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2jpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2jpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2jpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T08:24:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qp7cv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T08:24:52Z is after 2025-08-24T17:21:41Z" Mar 20 08:24:52 crc 
kubenswrapper[4903]: I0320 08:24:52.385135 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m6k77" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"157214e8-fbfe-4e9d-98f4-02680437b8b2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:50Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:50Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9z897\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9z897\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"Po
dInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9z897\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9z897\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9z897\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9z897\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\
\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9z897\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9z897\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f370eaf015ef59b0dc8b779f3532cb49e20dd28f4e8c95f7276722b4ed82362\\\",\\\"image\\\":\\\"quay.io/openshift-re
lease-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f370eaf015ef59b0dc8b779f3532cb49e20dd28f4e8c95f7276722b4ed82362\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T08:24:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T08:24:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9z897\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T08:24:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-m6k77\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T08:24:52Z is after 2025-08-24T17:21:41Z" Mar 20 08:24:52 crc kubenswrapper[4903]: I0320 08:24:52.402229 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da47a881a7498cb8b452217bd1fac181623c56a1e93d48ac883127eacdd3b491\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:24:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T08:24:52Z is after 2025-08-24T17:21:41Z" Mar 20 08:24:52 crc kubenswrapper[4903]: I0320 08:24:52.418905 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:50Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T08:24:52Z is after 2025-08-24T17:21:41Z" Mar 20 08:24:52 crc kubenswrapper[4903]: I0320 08:24:52.424289 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 08:24:52 crc kubenswrapper[4903]: I0320 08:24:52.424326 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 08:24:52 crc kubenswrapper[4903]: I0320 08:24:52.424336 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 08:24:52 crc kubenswrapper[4903]: I0320 08:24:52.424351 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 08:24:52 crc kubenswrapper[4903]: I0320 08:24:52.424361 4903 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T08:24:52Z","lastTransitionTime":"2026-03-20T08:24:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 08:24:52 crc kubenswrapper[4903]: I0320 08:24:52.426902 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4dlrn\" (UniqueName: \"kubernetes.io/projected/63d81fb4-627a-4062-b46b-bc0df9489a15-kube-api-access-4dlrn\") pod \"network-metrics-daemon-9wqdz\" (UID: \"63d81fb4-627a-4062-b46b-bc0df9489a15\") " pod="openshift-multus/network-metrics-daemon-9wqdz" Mar 20 08:24:52 crc kubenswrapper[4903]: I0320 08:24:52.426962 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/63d81fb4-627a-4062-b46b-bc0df9489a15-metrics-certs\") pod \"network-metrics-daemon-9wqdz\" (UID: \"63d81fb4-627a-4062-b46b-bc0df9489a15\") " pod="openshift-multus/network-metrics-daemon-9wqdz" Mar 20 08:24:52 crc kubenswrapper[4903]: E0320 08:24:52.427113 4903 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 08:24:52 crc kubenswrapper[4903]: E0320 08:24:52.427183 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/63d81fb4-627a-4062-b46b-bc0df9489a15-metrics-certs podName:63d81fb4-627a-4062-b46b-bc0df9489a15 nodeName:}" failed. No retries permitted until 2026-03-20 08:24:52.927167004 +0000 UTC m=+118.144067319 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/63d81fb4-627a-4062-b46b-bc0df9489a15-metrics-certs") pod "network-metrics-daemon-9wqdz" (UID: "63d81fb4-627a-4062-b46b-bc0df9489a15") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 08:24:52 crc kubenswrapper[4903]: I0320 08:24:52.439092 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nzq6s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f2f8d10-bf3a-48a4-9e71-2d3b5dc2743e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc3018f573232eb4d02452fff1c433c9afac6c9c55d49063e1db014d696efef1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:24:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T08:24:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nzq6s\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T08:24:52Z is after 2025-08-24T17:21:41Z" Mar 20 08:24:52 crc kubenswrapper[4903]: I0320 08:24:52.448117 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4dlrn\" (UniqueName: \"kubernetes.io/projected/63d81fb4-627a-4062-b46b-bc0df9489a15-kube-api-access-4dlrn\") pod \"network-metrics-daemon-9wqdz\" (UID: \"63d81fb4-627a-4062-b46b-bc0df9489a15\") " pod="openshift-multus/network-metrics-daemon-9wqdz" Mar 20 08:24:52 crc kubenswrapper[4903]: I0320 08:24:52.453252 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2ndsj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0e67af70-4211-4077-8f2b-0a00b8069e5a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4916eb9b822c64a41b67853a8700177499bc2494a9e800bc0de83c6d8bb51ba2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:24:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5rlvr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://97230b86d8abf05de23db14ac7a3f5d775800a1072bcb8f41fc0bb22c84b0942\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:24:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/ser
viceaccount\\\",\\\"name\\\":\\\"kube-api-access-5rlvr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T08:24:50Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2ndsj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T08:24:52Z is after 2025-08-24T17:21:41Z" Mar 20 08:24:52 crc kubenswrapper[4903]: I0320 08:24:52.466118 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pz8bq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"076f7b64-7577-4f40-87a2-cd5dfab6b688\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l65hw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l65hw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T08:24:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-pz8bq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T08:24:52Z is after 2025-08-24T17:21:41Z" Mar 20 08:24:52 crc kubenswrapper[4903]: I0320 08:24:52.490126 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T08:24:52Z is after 2025-08-24T17:21:41Z" Mar 20 08:24:52 crc kubenswrapper[4903]: I0320 08:24:52.490217 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 08:24:52 crc kubenswrapper[4903]: I0320 08:24:52.490217 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 08:24:52 crc kubenswrapper[4903]: E0320 08:24:52.490368 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 08:24:52 crc kubenswrapper[4903]: E0320 08:24:52.490409 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 08:24:52 crc kubenswrapper[4903]: I0320 08:24:52.490240 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 08:24:52 crc kubenswrapper[4903]: E0320 08:24:52.490491 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 08:24:52 crc kubenswrapper[4903]: I0320 08:24:52.517569 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:50Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T08:24:52Z is after 2025-08-24T17:21:41Z" Mar 20 08:24:52 crc kubenswrapper[4903]: I0320 08:24:52.527508 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 08:24:52 crc kubenswrapper[4903]: I0320 08:24:52.527553 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 08:24:52 crc kubenswrapper[4903]: I0320 08:24:52.527567 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 08:24:52 crc kubenswrapper[4903]: I0320 08:24:52.527585 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 08:24:52 crc kubenswrapper[4903]: I0320 08:24:52.527599 4903 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T08:24:52Z","lastTransitionTime":"2026-03-20T08:24:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 08:24:52 crc kubenswrapper[4903]: I0320 08:24:52.533551 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11423302a2f0ddfe4b87e1f95bce53b3d3922f9508b7aae57fe75839eb4bdecf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:24:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8352a1159b3b3c2acc0b0306b5d5fab2b5516e3464d05ac1e3842d615eb0b80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:24:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T08:24:52Z is after 2025-08-24T17:21:41Z" Mar 20 08:24:52 crc kubenswrapper[4903]: I0320 08:24:52.548401 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-mxv95" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc3a30d7-d1f1-491a-8e04-70eea2d35867\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a416843a351842e8a9455e5f9c527a4a6872adedbc64f2804004bb832dcaa21b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:24:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hnpkw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T08:24:50Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-mxv95\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T08:24:52Z is after 2025-08-24T17:21:41Z" Mar 20 08:24:52 crc kubenswrapper[4903]: I0320 08:24:52.565286 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-9wqdz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"63d81fb4-627a-4062-b46b-bc0df9489a15\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dlrn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4dlrn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T08:24:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-9wqdz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T08:24:52Z is after 2025-08-24T17:21:41Z" Mar 20 08:24:52 crc kubenswrapper[4903]: I0320 08:24:52.630109 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 08:24:52 crc kubenswrapper[4903]: I0320 08:24:52.630159 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 08:24:52 crc kubenswrapper[4903]: I0320 08:24:52.630172 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Mar 20 08:24:52 crc kubenswrapper[4903]: I0320 08:24:52.630195 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 08:24:52 crc kubenswrapper[4903]: I0320 08:24:52.630210 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T08:24:52Z","lastTransitionTime":"2026-03-20T08:24:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 08:24:52 crc kubenswrapper[4903]: I0320 08:24:52.734163 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 08:24:52 crc kubenswrapper[4903]: I0320 08:24:52.734210 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 08:24:52 crc kubenswrapper[4903]: I0320 08:24:52.734221 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 08:24:52 crc kubenswrapper[4903]: I0320 08:24:52.734432 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 08:24:52 crc kubenswrapper[4903]: I0320 08:24:52.734445 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T08:24:52Z","lastTransitionTime":"2026-03-20T08:24:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 08:24:52 crc kubenswrapper[4903]: I0320 08:24:52.840965 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 08:24:52 crc kubenswrapper[4903]: I0320 08:24:52.841006 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 08:24:52 crc kubenswrapper[4903]: I0320 08:24:52.841063 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 08:24:52 crc kubenswrapper[4903]: I0320 08:24:52.841088 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 08:24:52 crc kubenswrapper[4903]: I0320 08:24:52.841129 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T08:24:52Z","lastTransitionTime":"2026-03-20T08:24:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 08:24:52 crc kubenswrapper[4903]: I0320 08:24:52.932734 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/63d81fb4-627a-4062-b46b-bc0df9489a15-metrics-certs\") pod \"network-metrics-daemon-9wqdz\" (UID: \"63d81fb4-627a-4062-b46b-bc0df9489a15\") " pod="openshift-multus/network-metrics-daemon-9wqdz" Mar 20 08:24:52 crc kubenswrapper[4903]: E0320 08:24:52.932896 4903 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 08:24:52 crc kubenswrapper[4903]: E0320 08:24:52.932942 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/63d81fb4-627a-4062-b46b-bc0df9489a15-metrics-certs podName:63d81fb4-627a-4062-b46b-bc0df9489a15 nodeName:}" failed. No retries permitted until 2026-03-20 08:24:53.932929021 +0000 UTC m=+119.149829336 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/63d81fb4-627a-4062-b46b-bc0df9489a15-metrics-certs") pod "network-metrics-daemon-9wqdz" (UID: "63d81fb4-627a-4062-b46b-bc0df9489a15") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 08:24:52 crc kubenswrapper[4903]: I0320 08:24:52.949354 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 08:24:52 crc kubenswrapper[4903]: I0320 08:24:52.949405 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 08:24:52 crc kubenswrapper[4903]: I0320 08:24:52.949415 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 08:24:52 crc kubenswrapper[4903]: I0320 08:24:52.949436 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 08:24:52 crc kubenswrapper[4903]: I0320 08:24:52.949448 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T08:24:52Z","lastTransitionTime":"2026-03-20T08:24:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 08:24:53 crc kubenswrapper[4903]: I0320 08:24:53.054807 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 08:24:53 crc kubenswrapper[4903]: I0320 08:24:53.055323 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 08:24:53 crc kubenswrapper[4903]: I0320 08:24:53.055338 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 08:24:53 crc kubenswrapper[4903]: I0320 08:24:53.055355 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 08:24:53 crc kubenswrapper[4903]: I0320 08:24:53.055366 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T08:24:53Z","lastTransitionTime":"2026-03-20T08:24:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 08:24:53 crc kubenswrapper[4903]: I0320 08:24:53.094909 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pz8bq" event={"ID":"076f7b64-7577-4f40-87a2-cd5dfab6b688","Type":"ContainerStarted","Data":"d648d65c7e2b3e2704faf71fefc7eeb689f39c48deab99d28c0d641b14d479a3"} Mar 20 08:24:53 crc kubenswrapper[4903]: I0320 08:24:53.099324 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m6k77" event={"ID":"157214e8-fbfe-4e9d-98f4-02680437b8b2","Type":"ContainerStarted","Data":"272edfbc487b1d01b2ca4f33197ab2c05e32fdb2f60e683ce8ff908f05fff378"} Mar 20 08:24:53 crc kubenswrapper[4903]: I0320 08:24:53.099380 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m6k77" event={"ID":"157214e8-fbfe-4e9d-98f4-02680437b8b2","Type":"ContainerStarted","Data":"b2afe450471abfa96b8c77a0e261c47f541f330d529edd480e109298763ff7bb"} Mar 20 08:24:53 crc kubenswrapper[4903]: I0320 08:24:53.099392 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m6k77" event={"ID":"157214e8-fbfe-4e9d-98f4-02680437b8b2","Type":"ContainerStarted","Data":"691b8b593f7e45dee796765204e19ab9fe42adc904e33c692fb316dc994aea10"} Mar 20 08:24:53 crc kubenswrapper[4903]: I0320 08:24:53.099403 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m6k77" event={"ID":"157214e8-fbfe-4e9d-98f4-02680437b8b2","Type":"ContainerStarted","Data":"fe3d869acf63a0250c1b26e36a04855dc08643394a8d64c9d4c82054f3f400a0"} Mar 20 08:24:53 crc kubenswrapper[4903]: I0320 08:24:53.101435 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qp7cv" event={"ID":"867ac3a2-4567-43d3-80af-23021ced20b6","Type":"ContainerStarted","Data":"349dd22a6942eaf5db18567fafa21385742707c09ba7379ad875874effcca16a"} Mar 20 08:24:53 crc kubenswrapper[4903]: I0320 08:24:53.118265 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-55qrw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a0b2ecd-79af-4d88-ac59-3a08385882a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69a817710496023d3187d1a90a58586037754154eeffecef97e6a0afc4735a05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:24:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wx8vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T08:24:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-55qrw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T08:24:53Z is after 2025-08-24T17:21:41Z" Mar 20 08:24:53 crc kubenswrapper[4903]: I0320 08:24:53.139213 4903 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T08:24:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T08:24:53Z is after 2025-08-24T17:21:41Z" Mar 20 08:24:53 crc kubenswrapper[4903]: I0320 08:24:53.158098 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 08:24:53 crc kubenswrapper[4903]: I0320 08:24:53.158133 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 08:24:53 crc kubenswrapper[4903]: I0320 08:24:53.158142 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 08:24:53 crc kubenswrapper[4903]: I0320 08:24:53.158156 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 08:24:53 crc kubenswrapper[4903]: I0320 08:24:53.158167 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T08:24:53Z","lastTransitionTime":"2026-03-20T08:24:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 08:24:53 crc kubenswrapper[4903]: I0320 08:24:53.232442 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 08:24:53 crc kubenswrapper[4903]: I0320 08:24:53.232489 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 08:24:53 crc kubenswrapper[4903]: I0320 08:24:53.232498 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 08:24:53 crc kubenswrapper[4903]: I0320 08:24:53.232517 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 08:24:53 crc kubenswrapper[4903]: I0320 08:24:53.232526 4903 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T08:24:53Z","lastTransitionTime":"2026-03-20T08:24:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 08:24:53 crc kubenswrapper[4903]: I0320 08:24:53.286743 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-2ndsj" podStartSLOduration=57.286716524 podStartE2EDuration="57.286716524s" podCreationTimestamp="2026-03-20 08:23:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:24:53.285885231 +0000 UTC m=+118.502785566" watchObservedRunningTime="2026-03-20 08:24:53.286716524 +0000 UTC m=+118.503616839" Mar 20 08:24:53 crc kubenswrapper[4903]: I0320 08:24:53.287260 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-nzq6s" podStartSLOduration=57.287254497 podStartE2EDuration="57.287254497s" podCreationTimestamp="2026-03-20 08:23:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:24:53.273849055 +0000 UTC m=+118.490749370" watchObservedRunningTime="2026-03-20 08:24:53.287254497 +0000 UTC m=+118.504154812" Mar 20 08:24:53 crc kubenswrapper[4903]: I0320 08:24:53.289173 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-szjxx"] Mar 20 08:24:53 crc kubenswrapper[4903]: I0320 08:24:53.289643 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-szjxx" Mar 20 08:24:53 crc kubenswrapper[4903]: I0320 08:24:53.291566 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 20 08:24:53 crc kubenswrapper[4903]: I0320 08:24:53.294628 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 20 08:24:53 crc kubenswrapper[4903]: I0320 08:24:53.294807 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Mar 20 08:24:53 crc kubenswrapper[4903]: I0320 08:24:53.294962 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 20 08:24:53 crc kubenswrapper[4903]: I0320 08:24:53.322141 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pz8bq" podStartSLOduration=56.322115886 podStartE2EDuration="56.322115886s" podCreationTimestamp="2026-03-20 08:23:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:24:53.307718067 +0000 UTC m=+118.524618382" watchObservedRunningTime="2026-03-20 08:24:53.322115886 +0000 UTC m=+118.539016201" Mar 20 08:24:53 crc kubenswrapper[4903]: I0320 08:24:53.336095 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/91acc5a4-09d5-4f88-86f4-158b1bfe46b8-service-ca\") pod \"cluster-version-operator-5c965bbfc6-szjxx\" (UID: \"91acc5a4-09d5-4f88-86f4-158b1bfe46b8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-szjxx" Mar 20 08:24:53 crc kubenswrapper[4903]: I0320 08:24:53.336163 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/91acc5a4-09d5-4f88-86f4-158b1bfe46b8-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-szjxx\" (UID: \"91acc5a4-09d5-4f88-86f4-158b1bfe46b8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-szjxx" Mar 20 08:24:53 crc kubenswrapper[4903]: I0320 08:24:53.336187 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/91acc5a4-09d5-4f88-86f4-158b1bfe46b8-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-szjxx\" (UID: \"91acc5a4-09d5-4f88-86f4-158b1bfe46b8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-szjxx" Mar 20 08:24:53 crc kubenswrapper[4903]: I0320 08:24:53.336205 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/91acc5a4-09d5-4f88-86f4-158b1bfe46b8-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-szjxx\" (UID: \"91acc5a4-09d5-4f88-86f4-158b1bfe46b8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-szjxx" Mar 20 08:24:53 crc kubenswrapper[4903]: I0320 08:24:53.336238 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/91acc5a4-09d5-4f88-86f4-158b1bfe46b8-kube-api-access\") pod 
\"cluster-version-operator-5c965bbfc6-szjxx\" (UID: \"91acc5a4-09d5-4f88-86f4-158b1bfe46b8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-szjxx" Mar 20 08:24:53 crc kubenswrapper[4903]: I0320 08:24:53.376269 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-mxv95" podStartSLOduration=57.376241433 podStartE2EDuration="57.376241433s" podCreationTimestamp="2026-03-20 08:23:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:24:53.375722379 +0000 UTC m=+118.592622694" watchObservedRunningTime="2026-03-20 08:24:53.376241433 +0000 UTC m=+118.593141748" Mar 20 08:24:53 crc kubenswrapper[4903]: I0320 08:24:53.412692 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-55qrw" podStartSLOduration=57.412667393 podStartE2EDuration="57.412667393s" podCreationTimestamp="2026-03-20 08:23:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:24:53.41255204 +0000 UTC m=+118.629452365" watchObservedRunningTime="2026-03-20 08:24:53.412667393 +0000 UTC m=+118.629567708" Mar 20 08:24:53 crc kubenswrapper[4903]: I0320 08:24:53.437758 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/91acc5a4-09d5-4f88-86f4-158b1bfe46b8-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-szjxx\" (UID: \"91acc5a4-09d5-4f88-86f4-158b1bfe46b8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-szjxx" Mar 20 08:24:53 crc kubenswrapper[4903]: I0320 08:24:53.437913 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/91acc5a4-09d5-4f88-86f4-158b1bfe46b8-service-ca\") pod \"cluster-version-operator-5c965bbfc6-szjxx\" (UID: \"91acc5a4-09d5-4f88-86f4-158b1bfe46b8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-szjxx" Mar 20 08:24:53 crc kubenswrapper[4903]: I0320 08:24:53.438001 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/91acc5a4-09d5-4f88-86f4-158b1bfe46b8-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-szjxx\" (UID: \"91acc5a4-09d5-4f88-86f4-158b1bfe46b8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-szjxx" Mar 20 08:24:53 crc kubenswrapper[4903]: I0320 08:24:53.438075 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/91acc5a4-09d5-4f88-86f4-158b1bfe46b8-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-szjxx\" (UID: \"91acc5a4-09d5-4f88-86f4-158b1bfe46b8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-szjxx" Mar 20 08:24:53 crc kubenswrapper[4903]: I0320 08:24:53.438114 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/91acc5a4-09d5-4f88-86f4-158b1bfe46b8-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-szjxx\" (UID: \"91acc5a4-09d5-4f88-86f4-158b1bfe46b8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-szjxx" Mar 20 08:24:53 crc kubenswrapper[4903]: I0320 08:24:53.438244 4903 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/91acc5a4-09d5-4f88-86f4-158b1bfe46b8-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-szjxx\" (UID: \"91acc5a4-09d5-4f88-86f4-158b1bfe46b8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-szjxx" Mar 20 08:24:53 crc kubenswrapper[4903]: I0320 08:24:53.438357 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/91acc5a4-09d5-4f88-86f4-158b1bfe46b8-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-szjxx\" (UID: \"91acc5a4-09d5-4f88-86f4-158b1bfe46b8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-szjxx" Mar 20 08:24:53 crc kubenswrapper[4903]: I0320 08:24:53.438957 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/91acc5a4-09d5-4f88-86f4-158b1bfe46b8-service-ca\") pod \"cluster-version-operator-5c965bbfc6-szjxx\" (UID: \"91acc5a4-09d5-4f88-86f4-158b1bfe46b8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-szjxx" Mar 20 08:24:53 crc kubenswrapper[4903]: I0320 08:24:53.447475 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/91acc5a4-09d5-4f88-86f4-158b1bfe46b8-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-szjxx\" (UID: \"91acc5a4-09d5-4f88-86f4-158b1bfe46b8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-szjxx" Mar 20 08:24:53 crc kubenswrapper[4903]: I0320 08:24:53.455805 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/91acc5a4-09d5-4f88-86f4-158b1bfe46b8-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-szjxx\" (UID: \"91acc5a4-09d5-4f88-86f4-158b1bfe46b8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-szjxx" Mar 20 08:24:53 crc kubenswrapper[4903]: I0320 08:24:53.478999 4903 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Mar 20 08:24:53 crc kubenswrapper[4903]: I0320 08:24:53.485351 4903 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 20 08:24:53 crc kubenswrapper[4903]: I0320 08:24:53.602114 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-szjxx" Mar 20 08:24:53 crc kubenswrapper[4903]: W0320 08:24:53.632389 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod91acc5a4_09d5_4f88_86f4_158b1bfe46b8.slice/crio-a15c9613116916b65c7c5a96cdcb67aeb06c5980bc95c695c6026d94cf98e77d WatchSource:0}: Error finding container a15c9613116916b65c7c5a96cdcb67aeb06c5980bc95c695c6026d94cf98e77d: Status 404 returned error can't find the container with id a15c9613116916b65c7c5a96cdcb67aeb06c5980bc95c695c6026d94cf98e77d Mar 20 08:24:53 crc kubenswrapper[4903]: I0320 08:24:53.943256 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/63d81fb4-627a-4062-b46b-bc0df9489a15-metrics-certs\") pod \"network-metrics-daemon-9wqdz\" (UID: \"63d81fb4-627a-4062-b46b-bc0df9489a15\") " pod="openshift-multus/network-metrics-daemon-9wqdz" Mar 20 08:24:53 crc kubenswrapper[4903]: E0320 08:24:53.943528 4903 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 08:24:53 crc kubenswrapper[4903]: E0320 08:24:53.943717 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/63d81fb4-627a-4062-b46b-bc0df9489a15-metrics-certs podName:63d81fb4-627a-4062-b46b-bc0df9489a15 nodeName:}" failed. No retries permitted until 2026-03-20 08:24:55.943678075 +0000 UTC m=+121.160578580 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/63d81fb4-627a-4062-b46b-bc0df9489a15-metrics-certs") pod "network-metrics-daemon-9wqdz" (UID: "63d81fb4-627a-4062-b46b-bc0df9489a15") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 08:24:54 crc kubenswrapper[4903]: I0320 08:24:54.107980 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"194b04360d76c1ee938a1dd8ab22b0ff12b4f31e74cc0cc1796b3c09a84de844"} Mar 20 08:24:54 crc kubenswrapper[4903]: I0320 08:24:54.111186 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-szjxx" event={"ID":"91acc5a4-09d5-4f88-86f4-158b1bfe46b8","Type":"ContainerStarted","Data":"d78a70ae1b5b70392213eaa10a6281dcdd1d92b3a75db54e4ed3d1b64eb6356e"} Mar 20 08:24:54 crc kubenswrapper[4903]: I0320 08:24:54.111238 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-szjxx" event={"ID":"91acc5a4-09d5-4f88-86f4-158b1bfe46b8","Type":"ContainerStarted","Data":"a15c9613116916b65c7c5a96cdcb67aeb06c5980bc95c695c6026d94cf98e77d"} Mar 20 08:24:54 crc kubenswrapper[4903]: I0320 08:24:54.123712 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m6k77" event={"ID":"157214e8-fbfe-4e9d-98f4-02680437b8b2","Type":"ContainerStarted","Data":"7dc59852ff7f645c4eb5a214e361ac06060b83944e544934e183ea37408f3fc3"} Mar 20 08:24:54 crc kubenswrapper[4903]: I0320 08:24:54.123810 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m6k77" 
event={"ID":"157214e8-fbfe-4e9d-98f4-02680437b8b2","Type":"ContainerStarted","Data":"10b64cc58e0fdc4a422a2afa460d435875a10e372dc8eb7834aec769450b10ea"} Mar 20 08:24:54 crc kubenswrapper[4903]: I0320 08:24:54.126890 4903 generic.go:334] "Generic (PLEG): container finished" podID="867ac3a2-4567-43d3-80af-23021ced20b6" containerID="349dd22a6942eaf5db18567fafa21385742707c09ba7379ad875874effcca16a" exitCode=0 Mar 20 08:24:54 crc kubenswrapper[4903]: I0320 08:24:54.127310 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qp7cv" event={"ID":"867ac3a2-4567-43d3-80af-23021ced20b6","Type":"ContainerDied","Data":"349dd22a6942eaf5db18567fafa21385742707c09ba7379ad875874effcca16a"} Mar 20 08:24:54 crc kubenswrapper[4903]: I0320 08:24:54.148150 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 08:24:54 crc kubenswrapper[4903]: E0320 08:24:54.148398 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 08:24:58.148349788 +0000 UTC m=+123.365250183 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 08:24:54 crc kubenswrapper[4903]: I0320 08:24:54.148536 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 08:24:54 crc kubenswrapper[4903]: I0320 08:24:54.148747 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 08:24:54 crc kubenswrapper[4903]: E0320 08:24:54.149947 4903 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 08:24:54 crc kubenswrapper[4903]: E0320 08:24:54.150064 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 08:24:58.150012562 +0000 UTC m=+123.366913117 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 08:24:54 crc kubenswrapper[4903]: E0320 08:24:54.149974 4903 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 08:24:54 crc kubenswrapper[4903]: E0320 08:24:54.150728 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 08:24:58.150714901 +0000 UTC m=+123.367615226 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 08:24:54 crc kubenswrapper[4903]: I0320 08:24:54.187974 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-szjxx" podStartSLOduration=58.187938691 podStartE2EDuration="58.187938691s" podCreationTimestamp="2026-03-20 08:23:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:24:54.147741553 +0000 UTC m=+119.364641878" watchObservedRunningTime="2026-03-20 08:24:54.187938691 +0000 UTC m=+119.404839046" Mar 20 08:24:54 crc kubenswrapper[4903]: I0320 08:24:54.249680 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 08:24:54 crc kubenswrapper[4903]: I0320 08:24:54.249756 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 08:24:54 crc kubenswrapper[4903]: E0320 08:24:54.249948 4903 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 08:24:54 crc kubenswrapper[4903]: E0320 08:24:54.249973 4903 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 08:24:54 crc kubenswrapper[4903]: E0320 08:24:54.249989 4903 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 08:24:54 crc kubenswrapper[4903]: E0320 08:24:54.250080 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 08:24:58.250058649 +0000 UTC m=+123.466958974 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 08:24:54 crc kubenswrapper[4903]: E0320 08:24:54.250435 4903 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 08:24:54 crc kubenswrapper[4903]: E0320 08:24:54.250489 4903 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 08:24:54 crc kubenswrapper[4903]: E0320 08:24:54.250841 4903 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 08:24:54 crc kubenswrapper[4903]: E0320 08:24:54.250955 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 08:24:58.250925021 +0000 UTC m=+123.467825346 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 08:24:54 crc kubenswrapper[4903]: I0320 08:24:54.489871 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 08:24:54 crc kubenswrapper[4903]: I0320 08:24:54.489911 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9wqdz" Mar 20 08:24:54 crc kubenswrapper[4903]: I0320 08:24:54.489937 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 08:24:54 crc kubenswrapper[4903]: I0320 08:24:54.489986 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 08:24:54 crc kubenswrapper[4903]: E0320 08:24:54.490424 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 08:24:54 crc kubenswrapper[4903]: E0320 08:24:54.490733 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9wqdz" podUID="63d81fb4-627a-4062-b46b-bc0df9489a15" Mar 20 08:24:54 crc kubenswrapper[4903]: E0320 08:24:54.490884 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 08:24:54 crc kubenswrapper[4903]: E0320 08:24:54.490969 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 08:24:54 crc kubenswrapper[4903]: I0320 08:24:54.498334 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Mar 20 08:24:55 crc kubenswrapper[4903]: I0320 08:24:55.132475 4903 generic.go:334] "Generic (PLEG): container finished" podID="867ac3a2-4567-43d3-80af-23021ced20b6" containerID="2237c8f9fb3ac56a7aede2f6a7793bfe990aa2945ac38d0d8601f60cce6b33fc" exitCode=0 Mar 20 08:24:55 crc kubenswrapper[4903]: I0320 08:24:55.132704 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qp7cv" event={"ID":"867ac3a2-4567-43d3-80af-23021ced20b6","Type":"ContainerDied","Data":"2237c8f9fb3ac56a7aede2f6a7793bfe990aa2945ac38d0d8601f60cce6b33fc"} Mar 20 08:24:55 crc kubenswrapper[4903]: I0320 08:24:55.178831 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=1.178802522 podStartE2EDuration="1.178802522s" podCreationTimestamp="2026-03-20 08:24:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:24:55.154153503 +0000 UTC m=+120.371053818" watchObservedRunningTime="2026-03-20 08:24:55.178802522 +0000 UTC m=+120.395702837" Mar 20 08:24:55 crc kubenswrapper[4903]: E0320 08:24:55.434475 4903 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Mar 20 08:24:55 crc kubenswrapper[4903]: E0320 08:24:55.600063 4903 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 08:24:55 crc kubenswrapper[4903]: I0320 08:24:55.970962 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/63d81fb4-627a-4062-b46b-bc0df9489a15-metrics-certs\") pod \"network-metrics-daemon-9wqdz\" (UID: \"63d81fb4-627a-4062-b46b-bc0df9489a15\") " pod="openshift-multus/network-metrics-daemon-9wqdz" Mar 20 08:24:55 crc kubenswrapper[4903]: E0320 08:24:55.971215 4903 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 08:24:55 crc kubenswrapper[4903]: E0320 08:24:55.971329 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/63d81fb4-627a-4062-b46b-bc0df9489a15-metrics-certs podName:63d81fb4-627a-4062-b46b-bc0df9489a15 nodeName:}" failed. No retries permitted until 2026-03-20 08:24:59.971304585 +0000 UTC m=+125.188204900 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/63d81fb4-627a-4062-b46b-bc0df9489a15-metrics-certs") pod "network-metrics-daemon-9wqdz" (UID: "63d81fb4-627a-4062-b46b-bc0df9489a15") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 08:24:56 crc kubenswrapper[4903]: I0320 08:24:56.139628 4903 generic.go:334] "Generic (PLEG): container finished" podID="867ac3a2-4567-43d3-80af-23021ced20b6" containerID="f273923479894cdbb3863a2420246cd06062699c3e2b478665306743a154b42b" exitCode=0 Mar 20 08:24:56 crc kubenswrapper[4903]: I0320 08:24:56.139738 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qp7cv" event={"ID":"867ac3a2-4567-43d3-80af-23021ced20b6","Type":"ContainerDied","Data":"f273923479894cdbb3863a2420246cd06062699c3e2b478665306743a154b42b"} Mar 20 08:24:56 crc kubenswrapper[4903]: I0320 08:24:56.146429 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m6k77" event={"ID":"157214e8-fbfe-4e9d-98f4-02680437b8b2","Type":"ContainerStarted","Data":"1003de0029b8b9db6cb94237a4b5ced815916ebff5f7727dd069073c68d64d9a"} Mar 20 08:24:56 crc kubenswrapper[4903]: I0320 08:24:56.490586 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 08:24:56 crc kubenswrapper[4903]: I0320 08:24:56.490672 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 08:24:56 crc kubenswrapper[4903]: I0320 08:24:56.490619 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 08:24:56 crc kubenswrapper[4903]: I0320 08:24:56.490594 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9wqdz" Mar 20 08:24:56 crc kubenswrapper[4903]: E0320 08:24:56.490790 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 08:24:56 crc kubenswrapper[4903]: E0320 08:24:56.490903 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 08:24:56 crc kubenswrapper[4903]: E0320 08:24:56.490985 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 08:24:56 crc kubenswrapper[4903]: E0320 08:24:56.491064 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9wqdz" podUID="63d81fb4-627a-4062-b46b-bc0df9489a15" Mar 20 08:24:57 crc kubenswrapper[4903]: I0320 08:24:57.153537 4903 generic.go:334] "Generic (PLEG): container finished" podID="867ac3a2-4567-43d3-80af-23021ced20b6" containerID="53d7dc6a5472c511445ecbcbee6b2d646767952bf17d968ee972cf92e547e4c4" exitCode=0 Mar 20 08:24:57 crc kubenswrapper[4903]: I0320 08:24:57.153585 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qp7cv" event={"ID":"867ac3a2-4567-43d3-80af-23021ced20b6","Type":"ContainerDied","Data":"53d7dc6a5472c511445ecbcbee6b2d646767952bf17d968ee972cf92e547e4c4"} Mar 20 08:24:58 crc kubenswrapper[4903]: I0320 08:24:58.164899 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m6k77" event={"ID":"157214e8-fbfe-4e9d-98f4-02680437b8b2","Type":"ContainerStarted","Data":"6327470934a7b6549e52613fa520fb09d780a1db12171a46876a07a1db70a996"} Mar 20 08:24:58 crc kubenswrapper[4903]: I0320 08:24:58.165396 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-m6k77" Mar 20 08:24:58 crc kubenswrapper[4903]: I0320 08:24:58.171062 4903 generic.go:334] "Generic (PLEG): container finished" podID="867ac3a2-4567-43d3-80af-23021ced20b6" containerID="1c4eaa1294159e8146285afeba1ad53f8e850bd83aabc6c0fb4b20b298b148e5" exitCode=0 Mar 20 08:24:58 crc kubenswrapper[4903]: I0320 08:24:58.171126 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qp7cv" event={"ID":"867ac3a2-4567-43d3-80af-23021ced20b6","Type":"ContainerDied","Data":"1c4eaa1294159e8146285afeba1ad53f8e850bd83aabc6c0fb4b20b298b148e5"} Mar 20 08:24:58 crc kubenswrapper[4903]: I0320 08:24:58.201222 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 08:24:58 crc kubenswrapper[4903]: I0320 08:24:58.201439 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 08:24:58 crc kubenswrapper[4903]: E0320 08:24:58.201473 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 08:25:06.201433571 +0000 UTC m=+131.418333916 (durationBeforeRetry 8s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 08:24:58 crc kubenswrapper[4903]: E0320 08:24:58.201628 4903 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 08:24:58 crc kubenswrapper[4903]: E0320 08:24:58.201765 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 08:25:06.201726799 +0000 UTC m=+131.418627354 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 08:24:58 crc kubenswrapper[4903]: I0320 08:24:58.201868 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 08:24:58 crc kubenswrapper[4903]: E0320 08:24:58.202150 4903 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 08:24:58 crc kubenswrapper[4903]: E0320 08:24:58.202242 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 08:25:06.202216742 +0000 UTC m=+131.419117097 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 08:24:58 crc kubenswrapper[4903]: I0320 08:24:58.233078 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-m6k77" Mar 20 08:24:58 crc kubenswrapper[4903]: I0320 08:24:58.258763 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-m6k77" podStartSLOduration=61.258740931 podStartE2EDuration="1m1.258740931s" podCreationTimestamp="2026-03-20 08:23:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:24:58.212944554 +0000 UTC m=+123.429844889" watchObservedRunningTime="2026-03-20 08:24:58.258740931 +0000 UTC m=+123.475641246" Mar 20 08:24:58 crc kubenswrapper[4903]: I0320 08:24:58.303170 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 08:24:58 crc kubenswrapper[4903]: I0320 08:24:58.303304 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 08:24:58 crc kubenswrapper[4903]: E0320 08:24:58.303439 4903 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 08:24:58 crc kubenswrapper[4903]: E0320 08:24:58.303496 4903 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 08:24:58 crc kubenswrapper[4903]: E0320 08:24:58.303508 4903 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 08:24:58 crc kubenswrapper[4903]: E0320 08:24:58.303595 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 08:25:06.303575173 +0000 UTC m=+131.520475488 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 08:24:58 crc kubenswrapper[4903]: E0320 08:24:58.303670 4903 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 08:24:58 crc kubenswrapper[4903]: E0320 08:24:58.303720 4903 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 08:24:58 crc kubenswrapper[4903]: E0320 08:24:58.303747 4903 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 08:24:58 crc kubenswrapper[4903]: E0320 08:24:58.303857 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 08:25:06.303817609 +0000 UTC m=+131.520717974 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 08:24:58 crc kubenswrapper[4903]: I0320 08:24:58.490400 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 08:24:58 crc kubenswrapper[4903]: I0320 08:24:58.490516 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 08:24:58 crc kubenswrapper[4903]: I0320 08:24:58.491803 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9wqdz" Mar 20 08:24:58 crc kubenswrapper[4903]: I0320 08:24:58.491683 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 08:24:58 crc kubenswrapper[4903]: E0320 08:24:58.492012 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 08:24:58 crc kubenswrapper[4903]: E0320 08:24:58.492210 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9wqdz" podUID="63d81fb4-627a-4062-b46b-bc0df9489a15" Mar 20 08:24:58 crc kubenswrapper[4903]: E0320 08:24:58.492381 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 08:24:58 crc kubenswrapper[4903]: E0320 08:24:58.498453 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 08:24:59 crc kubenswrapper[4903]: I0320 08:24:59.182623 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qp7cv" event={"ID":"867ac3a2-4567-43d3-80af-23021ced20b6","Type":"ContainerStarted","Data":"419d33a9e16d7e528fa8cce87671dff84269d3c0722af91c4cf86e82ee84d905"} Mar 20 08:24:59 crc kubenswrapper[4903]: I0320 08:24:59.183582 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-m6k77" Mar 20 08:24:59 crc kubenswrapper[4903]: I0320 08:24:59.183663 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-m6k77" Mar 20 08:24:59 crc kubenswrapper[4903]: I0320 08:24:59.213328 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-qp7cv" podStartSLOduration=63.213298725 podStartE2EDuration="1m3.213298725s" podCreationTimestamp="2026-03-20 08:23:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:24:59.212314359 +0000 UTC m=+124.429214714" watchObservedRunningTime="2026-03-20 08:24:59.213298725 +0000 UTC m=+124.430199060" Mar 20 08:24:59 crc kubenswrapper[4903]: I0320 08:24:59.256761 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-m6k77" Mar 20 08:25:00 crc kubenswrapper[4903]: I0320 08:25:00.021420 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/63d81fb4-627a-4062-b46b-bc0df9489a15-metrics-certs\") pod \"network-metrics-daemon-9wqdz\" (UID: \"63d81fb4-627a-4062-b46b-bc0df9489a15\") " pod="openshift-multus/network-metrics-daemon-9wqdz" Mar 20 08:25:00 crc kubenswrapper[4903]: E0320 08:25:00.021647 4903 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object 
"openshift-multus"/"metrics-daemon-secret" not registered Mar 20 08:25:00 crc kubenswrapper[4903]: E0320 08:25:00.021767 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/63d81fb4-627a-4062-b46b-bc0df9489a15-metrics-certs podName:63d81fb4-627a-4062-b46b-bc0df9489a15 nodeName:}" failed. No retries permitted until 2026-03-20 08:25:08.021737938 +0000 UTC m=+133.238638263 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/63d81fb4-627a-4062-b46b-bc0df9489a15-metrics-certs") pod "network-metrics-daemon-9wqdz" (UID: "63d81fb4-627a-4062-b46b-bc0df9489a15") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 08:25:00 crc kubenswrapper[4903]: I0320 08:25:00.080145 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-9wqdz"] Mar 20 08:25:00 crc kubenswrapper[4903]: I0320 08:25:00.080307 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9wqdz" Mar 20 08:25:00 crc kubenswrapper[4903]: E0320 08:25:00.080423 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9wqdz" podUID="63d81fb4-627a-4062-b46b-bc0df9489a15" Mar 20 08:25:00 crc kubenswrapper[4903]: I0320 08:25:00.490011 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 08:25:00 crc kubenswrapper[4903]: I0320 08:25:00.490066 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 08:25:00 crc kubenswrapper[4903]: I0320 08:25:00.490119 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 08:25:00 crc kubenswrapper[4903]: E0320 08:25:00.490185 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 08:25:00 crc kubenswrapper[4903]: E0320 08:25:00.490373 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 08:25:00 crc kubenswrapper[4903]: E0320 08:25:00.490620 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 08:25:00 crc kubenswrapper[4903]: E0320 08:25:00.603098 4903 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 08:25:01 crc kubenswrapper[4903]: I0320 08:25:01.490487 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9wqdz" Mar 20 08:25:01 crc kubenswrapper[4903]: E0320 08:25:01.490721 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9wqdz" podUID="63d81fb4-627a-4062-b46b-bc0df9489a15" Mar 20 08:25:01 crc kubenswrapper[4903]: I0320 08:25:01.491814 4903 scope.go:117] "RemoveContainer" containerID="f756aa5d647538ea78b646d483b3c6e7943baac8d492132d04743f5f99f4ddf4" Mar 20 08:25:01 crc kubenswrapper[4903]: E0320 08:25:01.492244 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 08:25:02 crc kubenswrapper[4903]: I0320 08:25:02.490344 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 08:25:02 crc kubenswrapper[4903]: I0320 08:25:02.490408 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 08:25:02 crc kubenswrapper[4903]: I0320 08:25:02.490430 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 08:25:02 crc kubenswrapper[4903]: E0320 08:25:02.490532 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 08:25:02 crc kubenswrapper[4903]: E0320 08:25:02.490638 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 08:25:02 crc kubenswrapper[4903]: E0320 08:25:02.490797 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 08:25:03 crc kubenswrapper[4903]: I0320 08:25:03.490674 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9wqdz" Mar 20 08:25:03 crc kubenswrapper[4903]: E0320 08:25:03.491377 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9wqdz" podUID="63d81fb4-627a-4062-b46b-bc0df9489a15" Mar 20 08:25:04 crc kubenswrapper[4903]: I0320 08:25:04.490181 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 08:25:04 crc kubenswrapper[4903]: I0320 08:25:04.490233 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 08:25:04 crc kubenswrapper[4903]: I0320 08:25:04.490202 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 08:25:04 crc kubenswrapper[4903]: E0320 08:25:04.490369 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 08:25:04 crc kubenswrapper[4903]: E0320 08:25:04.490445 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 08:25:04 crc kubenswrapper[4903]: E0320 08:25:04.490645 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 08:25:05 crc kubenswrapper[4903]: I0320 08:25:05.490224 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-9wqdz" Mar 20 08:25:05 crc kubenswrapper[4903]: E0320 08:25:05.491410 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9wqdz" podUID="63d81fb4-627a-4062-b46b-bc0df9489a15" Mar 20 08:25:06 crc kubenswrapper[4903]: I0320 08:25:06.204130 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 08:25:06 crc kubenswrapper[4903]: E0320 08:25:06.204413 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 08:25:22.204368456 +0000 UTC m=+147.421268781 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 08:25:06 crc kubenswrapper[4903]: I0320 08:25:06.204958 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 08:25:06 crc kubenswrapper[4903]: I0320 08:25:06.205188 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 08:25:06 crc kubenswrapper[4903]: E0320 08:25:06.205200 4903 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 08:25:06 crc kubenswrapper[4903]: E0320 08:25:06.205493 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 08:25:22.205469296 +0000 UTC m=+147.422369621 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 08:25:06 crc kubenswrapper[4903]: E0320 08:25:06.205359 4903 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 08:25:06 crc kubenswrapper[4903]: E0320 08:25:06.205731 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 08:25:22.205714912 +0000 UTC m=+147.422615277 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 08:25:06 crc kubenswrapper[4903]: I0320 08:25:06.306678 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 08:25:06 crc kubenswrapper[4903]: I0320 08:25:06.307153 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 08:25:06 crc kubenswrapper[4903]: E0320 08:25:06.306980 4903 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 08:25:06 crc kubenswrapper[4903]: E0320 08:25:06.307639 4903 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 08:25:06 crc kubenswrapper[4903]: E0320 08:25:06.307787 4903 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 08:25:06 crc kubenswrapper[4903]: E0320 08:25:06.307259 4903 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 08:25:06 crc kubenswrapper[4903]: E0320 08:25:06.308071 4903 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 08:25:06 crc kubenswrapper[4903]: E0320 08:25:06.308091 
4903 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 08:25:06 crc kubenswrapper[4903]: E0320 08:25:06.308435 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 08:25:22.308009187 +0000 UTC m=+147.524909542 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 08:25:06 crc kubenswrapper[4903]: E0320 08:25:06.308687 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 08:25:22.308662784 +0000 UTC m=+147.525563139 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 08:25:06 crc kubenswrapper[4903]: I0320 08:25:06.490878 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 08:25:06 crc kubenswrapper[4903]: I0320 08:25:06.490907 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 08:25:06 crc kubenswrapper[4903]: I0320 08:25:06.491117 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 08:25:06 crc kubenswrapper[4903]: I0320 08:25:06.496506 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 20 08:25:06 crc kubenswrapper[4903]: I0320 08:25:06.496713 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 20 08:25:06 crc kubenswrapper[4903]: I0320 08:25:06.497186 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 20 08:25:06 crc kubenswrapper[4903]: I0320 08:25:06.497833 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 20 08:25:07 crc kubenswrapper[4903]: I0320 08:25:07.490848 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-9wqdz" Mar 20 08:25:07 crc kubenswrapper[4903]: I0320 08:25:07.499542 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 20 08:25:07 crc kubenswrapper[4903]: I0320 08:25:07.499858 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 20 08:25:08 crc kubenswrapper[4903]: I0320 08:25:08.026988 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/63d81fb4-627a-4062-b46b-bc0df9489a15-metrics-certs\") pod \"network-metrics-daemon-9wqdz\" (UID: \"63d81fb4-627a-4062-b46b-bc0df9489a15\") " pod="openshift-multus/network-metrics-daemon-9wqdz" Mar 20 08:25:08 crc kubenswrapper[4903]: I0320 08:25:08.040265 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/63d81fb4-627a-4062-b46b-bc0df9489a15-metrics-certs\") pod \"network-metrics-daemon-9wqdz\" (UID: \"63d81fb4-627a-4062-b46b-bc0df9489a15\") " pod="openshift-multus/network-metrics-daemon-9wqdz" Mar 20 08:25:08 crc kubenswrapper[4903]: I0320 08:25:08.116198 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9wqdz" Mar 20 08:25:08 crc kubenswrapper[4903]: I0320 08:25:08.365844 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-9wqdz"] Mar 20 08:25:08 crc kubenswrapper[4903]: W0320 08:25:08.375134 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod63d81fb4_627a_4062_b46b_bc0df9489a15.slice/crio-4d574163ef877bfa9ff9a8baabd872035e8aa84088f4fa871bea907717d94bd0 WatchSource:0}: Error finding container 4d574163ef877bfa9ff9a8baabd872035e8aa84088f4fa871bea907717d94bd0: Status 404 returned error can't find the container with id 4d574163ef877bfa9ff9a8baabd872035e8aa84088f4fa871bea907717d94bd0 Mar 20 08:25:09 crc kubenswrapper[4903]: I0320 08:25:09.247597 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-9wqdz" event={"ID":"63d81fb4-627a-4062-b46b-bc0df9489a15","Type":"ContainerStarted","Data":"676afc57ee342ddea1afffba65906a67bfae533be4da5e8a98c34b6f106696b6"} Mar 20 08:25:09 crc kubenswrapper[4903]: I0320 08:25:09.248149 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-9wqdz" event={"ID":"63d81fb4-627a-4062-b46b-bc0df9489a15","Type":"ContainerStarted","Data":"963acb28460f4dcf2d31c4a7504174cc2554e384d55b7b0f994548b1c9679a4e"} Mar 20 08:25:09 crc kubenswrapper[4903]: I0320 08:25:09.248175 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-9wqdz" event={"ID":"63d81fb4-627a-4062-b46b-bc0df9489a15","Type":"ContainerStarted","Data":"4d574163ef877bfa9ff9a8baabd872035e8aa84088f4fa871bea907717d94bd0"} Mar 20 08:25:09 crc kubenswrapper[4903]: I0320 08:25:09.267931 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-9wqdz" podStartSLOduration=72.267901369 podStartE2EDuration="1m12.267901369s" podCreationTimestamp="2026-03-20 08:23:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:25:09.266200221 +0000 UTC 
m=+134.483100566" watchObservedRunningTime="2026-03-20 08:25:09.267901369 +0000 UTC m=+134.484801714" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.574161 4903 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.656524 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-mx9nb"] Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.657341 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-qgf92"] Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.658008 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qgf92" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.657382 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-mx9nb" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.659131 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-f47pf"] Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.659585 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-f47pf" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.671973 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.675400 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-qq7hw"] Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.675792 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-s5fmp"] Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.676054 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-fcqtt"] Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.676601 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-fcqtt" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.677868 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-qq7hw" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.678380 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-s5fmp" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.681754 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.682159 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.682663 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-xzjq9"] Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.683482 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.683590 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.683815 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xzjq9" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.684099 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.684522 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.685208 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.686010 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.686146 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.686148 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.686313 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.686339 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.686509 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.687576 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.687722 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.687787 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.687863 4903 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-oauth-apiserver"/"serving-cert" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.687969 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.688065 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.688221 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.688252 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.688288 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.688339 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.688365 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.688366 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.688394 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.688408 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.688507 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.688561 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.688576 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.688594 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.688515 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.688695 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.688704 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.688710 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.688811 4903 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.688853 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.688906 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.688861 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.688963 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.689090 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.689131 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.689327 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.689432 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.693645 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.693701 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.693731 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.696871 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/24298acb-4912-45c7-b28d-f8f7389bb7e6-encryption-config\") pod \"apiserver-7bbb656c7d-qgf92\" (UID: \"24298acb-4912-45c7-b28d-f8f7389bb7e6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qgf92" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.697055 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/24298acb-4912-45c7-b28d-f8f7389bb7e6-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-qgf92\" (UID: \"24298acb-4912-45c7-b28d-f8f7389bb7e6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qgf92" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.697215 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vsbhv\" (UniqueName: \"kubernetes.io/projected/24298acb-4912-45c7-b28d-f8f7389bb7e6-kube-api-access-vsbhv\") pod \"apiserver-7bbb656c7d-qgf92\" (UID: \"24298acb-4912-45c7-b28d-f8f7389bb7e6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qgf92" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.697329 4903 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2pm7f\" (UniqueName: \"kubernetes.io/projected/a01da421-9bf1-459f-a419-c7cc271bf472-kube-api-access-2pm7f\") pod \"controller-manager-879f6c89f-f47pf\" (UID: \"a01da421-9bf1-459f-a419-c7cc271bf472\") " pod="openshift-controller-manager/controller-manager-879f6c89f-f47pf" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.697450 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/24298acb-4912-45c7-b28d-f8f7389bb7e6-audit-policies\") pod \"apiserver-7bbb656c7d-qgf92\" (UID: \"24298acb-4912-45c7-b28d-f8f7389bb7e6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qgf92" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.697557 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/572cb149-e6e4-4d1b-ab27-145239b82d1c-images\") pod \"machine-api-operator-5694c8668f-mx9nb\" (UID: \"572cb149-e6e4-4d1b-ab27-145239b82d1c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-mx9nb" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.697665 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/24298acb-4912-45c7-b28d-f8f7389bb7e6-serving-cert\") pod \"apiserver-7bbb656c7d-qgf92\" (UID: \"24298acb-4912-45c7-b28d-f8f7389bb7e6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qgf92" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.697916 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/572cb149-e6e4-4d1b-ab27-145239b82d1c-config\") pod \"machine-api-operator-5694c8668f-mx9nb\" (UID: \"572cb149-e6e4-4d1b-ab27-145239b82d1c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-mx9nb" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.698272 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x26p9\" (UniqueName: \"kubernetes.io/projected/572cb149-e6e4-4d1b-ab27-145239b82d1c-kube-api-access-x26p9\") pod \"machine-api-operator-5694c8668f-mx9nb\" (UID: \"572cb149-e6e4-4d1b-ab27-145239b82d1c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-mx9nb" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.698362 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/24298acb-4912-45c7-b28d-f8f7389bb7e6-etcd-client\") pod \"apiserver-7bbb656c7d-qgf92\" (UID: \"24298acb-4912-45c7-b28d-f8f7389bb7e6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qgf92" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.698444 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/24298acb-4912-45c7-b28d-f8f7389bb7e6-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-qgf92\" (UID: \"24298acb-4912-45c7-b28d-f8f7389bb7e6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qgf92" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.698480 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/a01da421-9bf1-459f-a419-c7cc271bf472-config\") pod \"controller-manager-879f6c89f-f47pf\" (UID: \"a01da421-9bf1-459f-a419-c7cc271bf472\") " pod="openshift-controller-manager/controller-manager-879f6c89f-f47pf" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.698510 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a01da421-9bf1-459f-a419-c7cc271bf472-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-f47pf\" (UID: \"a01da421-9bf1-459f-a419-c7cc271bf472\") " pod="openshift-controller-manager/controller-manager-879f6c89f-f47pf" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.698590 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/24298acb-4912-45c7-b28d-f8f7389bb7e6-audit-dir\") pod \"apiserver-7bbb656c7d-qgf92\" (UID: \"24298acb-4912-45c7-b28d-f8f7389bb7e6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qgf92" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.698625 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a01da421-9bf1-459f-a419-c7cc271bf472-client-ca\") pod \"controller-manager-879f6c89f-f47pf\" (UID: \"a01da421-9bf1-459f-a419-c7cc271bf472\") " pod="openshift-controller-manager/controller-manager-879f6c89f-f47pf" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.698650 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/572cb149-e6e4-4d1b-ab27-145239b82d1c-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-mx9nb\" (UID: \"572cb149-e6e4-4d1b-ab27-145239b82d1c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-mx9nb" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.698679 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a01da421-9bf1-459f-a419-c7cc271bf472-serving-cert\") pod \"controller-manager-879f6c89f-f47pf\" (UID: \"a01da421-9bf1-459f-a419-c7cc271bf472\") " pod="openshift-controller-manager/controller-manager-879f6c89f-f47pf" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.710845 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.710917 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.732006 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.733963 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-6bs4d"] Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.740322 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6bs4d" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.744281 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.745075 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.745258 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.745462 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.745700 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.751215 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.752360 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-d5tc5"] Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.753197 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-d5tc5" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.756668 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.757146 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.757373 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.757575 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.766399 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-5dpx6"] Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.770624 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-mmtf2"] Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.771055 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-5dpx6" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.771273 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-h4m4s"] Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.771554 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2hc2d"] Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.771916 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-mmtf2" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.771993 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2hc2d" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.772539 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-h4m4s" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.773204 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-9wjwh"] Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.773753 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-n4g5v"] Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.774176 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-n4g5v" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.775162 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-9wjwh" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.779052 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-pnnxx"] Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.779788 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-pnnxx" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.787969 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.788222 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.790072 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-2q9m5"] Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.790821 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-h2jwr"] Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.792555 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-2q9m5" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.793116 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.793307 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.793655 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.793719 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.793843 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.793908 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.794095 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.794301 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.794378 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.794520 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.794599 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.794677 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.795239 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.797540 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-687pt"] Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.797912 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7gchs"] Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.797965 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-h2jwr" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.798077 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-687pt" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.798221 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.798799 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.798822 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.798947 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.798962 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.799087 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.799110 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.799191 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.799277 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.799294 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.799329 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/24298acb-4912-45c7-b28d-f8f7389bb7e6-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-qgf92\" (UID: \"24298acb-4912-45c7-b28d-f8f7389bb7e6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qgf92" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.799363 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a01da421-9bf1-459f-a419-c7cc271bf472-config\") pod \"controller-manager-879f6c89f-f47pf\" (UID: \"a01da421-9bf1-459f-a419-c7cc271bf472\") " pod="openshift-controller-manager/controller-manager-879f6c89f-f47pf" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.799390 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.799495 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.799388 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a01da421-9bf1-459f-a419-c7cc271bf472-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-f47pf\" (UID: \"a01da421-9bf1-459f-a419-c7cc271bf472\") " pod="openshift-controller-manager/controller-manager-879f6c89f-f47pf" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.799651 4903 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a21d7057-af5a-499a-b71f-3cae88b6883e-config\") pod \"route-controller-manager-6576b87f9c-6bs4d\" (UID: \"a21d7057-af5a-499a-b71f-3cae88b6883e\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6bs4d" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.799679 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d528a1a3-29b5-4b67-a9bd-e7102d857426-config\") pod \"openshift-apiserver-operator-796bbdcf4f-s5fmp\" (UID: \"d528a1a3-29b5-4b67-a9bd-e7102d857426\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-s5fmp" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.799726 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/f20fdd99-f4e0-4435-98ef-7140cc1c9ca4-image-import-ca\") pod \"apiserver-76f77b778f-fcqtt\" (UID: \"f20fdd99-f4e0-4435-98ef-7140cc1c9ca4\") " pod="openshift-apiserver/apiserver-76f77b778f-fcqtt" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.799747 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9k9kf\" (UniqueName: \"kubernetes.io/projected/d528a1a3-29b5-4b67-a9bd-e7102d857426-kube-api-access-9k9kf\") pod \"openshift-apiserver-operator-796bbdcf4f-s5fmp\" (UID: \"d528a1a3-29b5-4b67-a9bd-e7102d857426\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-s5fmp" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.799767 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/f20fdd99-f4e0-4435-98ef-7140cc1c9ca4-etcd-serving-ca\") pod \"apiserver-76f77b778f-fcqtt\" (UID: \"f20fdd99-f4e0-4435-98ef-7140cc1c9ca4\") " pod="openshift-apiserver/apiserver-76f77b778f-fcqtt" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.799792 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4218cb22-8dd6-45fd-8091-635dd8a305bf-serving-cert\") pod \"authentication-operator-69f744f599-qq7hw\" (UID: \"4218cb22-8dd6-45fd-8091-635dd8a305bf\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-qq7hw" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.799810 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/548a9dd8-86b6-4fca-b0f8-29ac9a0edb72-config\") pod \"machine-approver-56656f9798-xzjq9\" (UID: \"548a9dd8-86b6-4fca-b0f8-29ac9a0edb72\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xzjq9" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.799827 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a21d7057-af5a-499a-b71f-3cae88b6883e-serving-cert\") pod \"route-controller-manager-6576b87f9c-6bs4d\" (UID: \"a21d7057-af5a-499a-b71f-3cae88b6883e\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6bs4d" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.799853 4903 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/24298acb-4912-45c7-b28d-f8f7389bb7e6-audit-dir\") pod \"apiserver-7bbb656c7d-qgf92\" (UID: \"24298acb-4912-45c7-b28d-f8f7389bb7e6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qgf92" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.799877 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a01da421-9bf1-459f-a419-c7cc271bf472-client-ca\") pod \"controller-manager-879f6c89f-f47pf\" (UID: \"a01da421-9bf1-459f-a419-c7cc271bf472\") " pod="openshift-controller-manager/controller-manager-879f6c89f-f47pf" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.799894 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/572cb149-e6e4-4d1b-ab27-145239b82d1c-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-mx9nb\" (UID: \"572cb149-e6e4-4d1b-ab27-145239b82d1c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-mx9nb" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.799915 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a21d7057-af5a-499a-b71f-3cae88b6883e-client-ca\") pod \"route-controller-manager-6576b87f9c-6bs4d\" (UID: \"a21d7057-af5a-499a-b71f-3cae88b6883e\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6bs4d" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.799938 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a01da421-9bf1-459f-a419-c7cc271bf472-serving-cert\") pod \"controller-manager-879f6c89f-f47pf\" (UID: \"a01da421-9bf1-459f-a419-c7cc271bf472\") " pod="openshift-controller-manager/controller-manager-879f6c89f-f47pf" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.799956 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fsrjz\" (UniqueName: \"kubernetes.io/projected/a21d7057-af5a-499a-b71f-3cae88b6883e-kube-api-access-fsrjz\") pod \"route-controller-manager-6576b87f9c-6bs4d\" (UID: \"a21d7057-af5a-499a-b71f-3cae88b6883e\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6bs4d" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.799973 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f20fdd99-f4e0-4435-98ef-7140cc1c9ca4-config\") pod \"apiserver-76f77b778f-fcqtt\" (UID: \"f20fdd99-f4e0-4435-98ef-7140cc1c9ca4\") " pod="openshift-apiserver/apiserver-76f77b778f-fcqtt" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.799996 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fe5d5814-39d5-43a9-9352-f47c80603c75-serving-cert\") pod \"openshift-config-operator-7777fb866f-d5tc5\" (UID: \"fe5d5814-39d5-43a9-9352-f47c80603c75\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-d5tc5" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.800015 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/4218cb22-8dd6-45fd-8091-635dd8a305bf-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-qq7hw\" (UID: \"4218cb22-8dd6-45fd-8091-635dd8a305bf\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-qq7hw" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.800055 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/548a9dd8-86b6-4fca-b0f8-29ac9a0edb72-auth-proxy-config\") pod \"machine-approver-56656f9798-xzjq9\" (UID: \"548a9dd8-86b6-4fca-b0f8-29ac9a0edb72\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xzjq9" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.800082 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/f20fdd99-f4e0-4435-98ef-7140cc1c9ca4-etcd-client\") pod \"apiserver-76f77b778f-fcqtt\" (UID: \"f20fdd99-f4e0-4435-98ef-7140cc1c9ca4\") " pod="openshift-apiserver/apiserver-76f77b778f-fcqtt" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.800103 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wg8dw\" (UniqueName: \"kubernetes.io/projected/fe5d5814-39d5-43a9-9352-f47c80603c75-kube-api-access-wg8dw\") pod \"openshift-config-operator-7777fb866f-d5tc5\" (UID: \"fe5d5814-39d5-43a9-9352-f47c80603c75\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-d5tc5" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.800126 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/24298acb-4912-45c7-b28d-f8f7389bb7e6-encryption-config\") pod \"apiserver-7bbb656c7d-qgf92\" (UID: \"24298acb-4912-45c7-b28d-f8f7389bb7e6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qgf92" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.800149 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/24298acb-4912-45c7-b28d-f8f7389bb7e6-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-qgf92\" (UID: \"24298acb-4912-45c7-b28d-f8f7389bb7e6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qgf92" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.800166 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4218cb22-8dd6-45fd-8091-635dd8a305bf-config\") pod \"authentication-operator-69f744f599-qq7hw\" (UID: \"4218cb22-8dd6-45fd-8091-635dd8a305bf\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-qq7hw" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.800198 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vsbhv\" (UniqueName: \"kubernetes.io/projected/24298acb-4912-45c7-b28d-f8f7389bb7e6-kube-api-access-vsbhv\") pod \"apiserver-7bbb656c7d-qgf92\" (UID: \"24298acb-4912-45c7-b28d-f8f7389bb7e6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qgf92" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.800219 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d528a1a3-29b5-4b67-a9bd-e7102d857426-serving-cert\") pod 
\"openshift-apiserver-operator-796bbdcf4f-s5fmp\" (UID: \"d528a1a3-29b5-4b67-a9bd-e7102d857426\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-s5fmp" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.800259 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2pm7f\" (UniqueName: \"kubernetes.io/projected/a01da421-9bf1-459f-a419-c7cc271bf472-kube-api-access-2pm7f\") pod \"controller-manager-879f6c89f-f47pf\" (UID: \"a01da421-9bf1-459f-a419-c7cc271bf472\") " pod="openshift-controller-manager/controller-manager-879f6c89f-f47pf" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.800277 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k266g\" (UniqueName: \"kubernetes.io/projected/f20fdd99-f4e0-4435-98ef-7140cc1c9ca4-kube-api-access-k266g\") pod \"apiserver-76f77b778f-fcqtt\" (UID: \"f20fdd99-f4e0-4435-98ef-7140cc1c9ca4\") " pod="openshift-apiserver/apiserver-76f77b778f-fcqtt" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.800300 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/572cb149-e6e4-4d1b-ab27-145239b82d1c-images\") pod \"machine-api-operator-5694c8668f-mx9nb\" (UID: \"572cb149-e6e4-4d1b-ab27-145239b82d1c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-mx9nb" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.800316 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zp7v8\" (UniqueName: \"kubernetes.io/projected/548a9dd8-86b6-4fca-b0f8-29ac9a0edb72-kube-api-access-zp7v8\") pod \"machine-approver-56656f9798-xzjq9\" (UID: \"548a9dd8-86b6-4fca-b0f8-29ac9a0edb72\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xzjq9" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.800346 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/24298acb-4912-45c7-b28d-f8f7389bb7e6-audit-policies\") pod \"apiserver-7bbb656c7d-qgf92\" (UID: \"24298acb-4912-45c7-b28d-f8f7389bb7e6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qgf92" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.800367 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/548a9dd8-86b6-4fca-b0f8-29ac9a0edb72-machine-approver-tls\") pod \"machine-approver-56656f9798-xzjq9\" (UID: \"548a9dd8-86b6-4fca-b0f8-29ac9a0edb72\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xzjq9" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.800380 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/f20fdd99-f4e0-4435-98ef-7140cc1c9ca4-audit\") pod \"apiserver-76f77b778f-fcqtt\" (UID: \"f20fdd99-f4e0-4435-98ef-7140cc1c9ca4\") " pod="openshift-apiserver/apiserver-76f77b778f-fcqtt" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.800403 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/24298acb-4912-45c7-b28d-f8f7389bb7e6-serving-cert\") pod \"apiserver-7bbb656c7d-qgf92\" (UID: \"24298acb-4912-45c7-b28d-f8f7389bb7e6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qgf92" Mar 20 
08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.800420 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/572cb149-e6e4-4d1b-ab27-145239b82d1c-config\") pod \"machine-api-operator-5694c8668f-mx9nb\" (UID: \"572cb149-e6e4-4d1b-ab27-145239b82d1c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-mx9nb" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.800435 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/fe5d5814-39d5-43a9-9352-f47c80603c75-available-featuregates\") pod \"openshift-config-operator-7777fb866f-d5tc5\" (UID: \"fe5d5814-39d5-43a9-9352-f47c80603c75\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-d5tc5" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.800451 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f20fdd99-f4e0-4435-98ef-7140cc1c9ca4-serving-cert\") pod \"apiserver-76f77b778f-fcqtt\" (UID: \"f20fdd99-f4e0-4435-98ef-7140cc1c9ca4\") " pod="openshift-apiserver/apiserver-76f77b778f-fcqtt" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.800470 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x26p9\" (UniqueName: \"kubernetes.io/projected/572cb149-e6e4-4d1b-ab27-145239b82d1c-kube-api-access-x26p9\") pod \"machine-api-operator-5694c8668f-mx9nb\" (UID: \"572cb149-e6e4-4d1b-ab27-145239b82d1c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-mx9nb" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.800488 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4218cb22-8dd6-45fd-8091-635dd8a305bf-service-ca-bundle\") pod \"authentication-operator-69f744f599-qq7hw\" (UID: \"4218cb22-8dd6-45fd-8091-635dd8a305bf\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-qq7hw" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.800509 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/f20fdd99-f4e0-4435-98ef-7140cc1c9ca4-node-pullsecrets\") pod \"apiserver-76f77b778f-fcqtt\" (UID: \"f20fdd99-f4e0-4435-98ef-7140cc1c9ca4\") " pod="openshift-apiserver/apiserver-76f77b778f-fcqtt" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.800524 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/24298acb-4912-45c7-b28d-f8f7389bb7e6-etcd-client\") pod \"apiserver-7bbb656c7d-qgf92\" (UID: \"24298acb-4912-45c7-b28d-f8f7389bb7e6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qgf92" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.800541 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f20fdd99-f4e0-4435-98ef-7140cc1c9ca4-audit-dir\") pod \"apiserver-76f77b778f-fcqtt\" (UID: \"f20fdd99-f4e0-4435-98ef-7140cc1c9ca4\") " pod="openshift-apiserver/apiserver-76f77b778f-fcqtt" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.800559 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-2wxzz\" (UniqueName: \"kubernetes.io/projected/4218cb22-8dd6-45fd-8091-635dd8a305bf-kube-api-access-2wxzz\") pod \"authentication-operator-69f744f599-qq7hw\" (UID: \"4218cb22-8dd6-45fd-8091-635dd8a305bf\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-qq7hw" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.800573 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/f20fdd99-f4e0-4435-98ef-7140cc1c9ca4-encryption-config\") pod \"apiserver-76f77b778f-fcqtt\" (UID: \"f20fdd99-f4e0-4435-98ef-7140cc1c9ca4\") " pod="openshift-apiserver/apiserver-76f77b778f-fcqtt" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.800595 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f20fdd99-f4e0-4435-98ef-7140cc1c9ca4-trusted-ca-bundle\") pod \"apiserver-76f77b778f-fcqtt\" (UID: \"f20fdd99-f4e0-4435-98ef-7140cc1c9ca4\") " pod="openshift-apiserver/apiserver-76f77b778f-fcqtt" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.800708 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/24298acb-4912-45c7-b28d-f8f7389bb7e6-audit-dir\") pod \"apiserver-7bbb656c7d-qgf92\" (UID: \"24298acb-4912-45c7-b28d-f8f7389bb7e6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qgf92" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.802933 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9hf5p"] Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.803449 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-p8nlt"] Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.806466 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/24298acb-4912-45c7-b28d-f8f7389bb7e6-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-qgf92\" (UID: \"24298acb-4912-45c7-b28d-f8f7389bb7e6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qgf92" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.799541 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.799591 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.806311 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.806781 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.808888 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.809292 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Mar 20 08:25:13 crc 
kubenswrapper[4903]: I0320 08:25:13.809491 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.809554 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/572cb149-e6e4-4d1b-ab27-145239b82d1c-images\") pod \"machine-api-operator-5694c8668f-mx9nb\" (UID: \"572cb149-e6e4-4d1b-ab27-145239b82d1c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-mx9nb" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.809662 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.809776 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.810109 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/24298acb-4912-45c7-b28d-f8f7389bb7e6-audit-policies\") pod \"apiserver-7bbb656c7d-qgf92\" (UID: \"24298acb-4912-45c7-b28d-f8f7389bb7e6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qgf92" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.810190 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-622vk"] Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.810207 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.830288 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7gchs" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.830730 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9hf5p" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.832495 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a01da421-9bf1-459f-a419-c7cc271bf472-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-f47pf\" (UID: \"a01da421-9bf1-459f-a419-c7cc271bf472\") " pod="openshift-controller-manager/controller-manager-879f6c89f-f47pf" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.833544 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/24298acb-4912-45c7-b28d-f8f7389bb7e6-encryption-config\") pod \"apiserver-7bbb656c7d-qgf92\" (UID: \"24298acb-4912-45c7-b28d-f8f7389bb7e6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qgf92" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.837140 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a01da421-9bf1-459f-a419-c7cc271bf472-config\") pod \"controller-manager-879f6c89f-f47pf\" (UID: \"a01da421-9bf1-459f-a419-c7cc271bf472\") " pod="openshift-controller-manager/controller-manager-879f6c89f-f47pf" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.837939 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/24298acb-4912-45c7-b28d-f8f7389bb7e6-serving-cert\") pod \"apiserver-7bbb656c7d-qgf92\" (UID: \"24298acb-4912-45c7-b28d-f8f7389bb7e6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qgf92" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.840025 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-cc4f6"] Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.840828 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/24298acb-4912-45c7-b28d-f8f7389bb7e6-etcd-client\") pod \"apiserver-7bbb656c7d-qgf92\" (UID: \"24298acb-4912-45c7-b28d-f8f7389bb7e6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qgf92" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.841460 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/572cb149-e6e4-4d1b-ab27-145239b82d1c-config\") pod \"machine-api-operator-5694c8668f-mx9nb\" (UID: \"572cb149-e6e4-4d1b-ab27-145239b82d1c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-mx9nb" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.841534 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-k27wn"] Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.843393 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-p8nlt" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.844730 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/24298acb-4912-45c7-b28d-f8f7389bb7e6-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-qgf92\" (UID: \"24298acb-4912-45c7-b28d-f8f7389bb7e6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qgf92" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.846136 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-622vk" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.846182 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-skg8s"] Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.846337 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-k27wn" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.846549 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-cc4f6" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.861285 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-x8c9s"] Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.861942 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-x8c9s" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.862202 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-skg8s" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.862618 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a01da421-9bf1-459f-a419-c7cc271bf472-serving-cert\") pod \"controller-manager-879f6c89f-f47pf\" (UID: \"a01da421-9bf1-459f-a419-c7cc271bf472\") " pod="openshift-controller-manager/controller-manager-879f6c89f-f47pf" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.862694 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-t7qmg"] Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.862857 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/572cb149-e6e4-4d1b-ab27-145239b82d1c-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-mx9nb\" (UID: \"572cb149-e6e4-4d1b-ab27-145239b82d1c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-mx9nb" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.863198 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.863304 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-t7qmg" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.863705 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.864135 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fhkp9"] Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.864466 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.864764 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fhkp9" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.869948 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-5v8xb"] Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.870733 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-zbxgq"] Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.871253 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-zbxgq" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.871412 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-5v8xb" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.871511 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-qqtv5"] Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.871546 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.872280 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a01da421-9bf1-459f-a419-c7cc271bf472-client-ca\") pod \"controller-manager-879f6c89f-f47pf\" (UID: \"a01da421-9bf1-459f-a419-c7cc271bf472\") " pod="openshift-controller-manager/controller-manager-879f6c89f-f47pf" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.872338 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-9zwz2"] Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.872433 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qqtv5" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.872710 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.872822 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-9zwz2" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.872853 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.873062 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-sjg9x"] Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.873147 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.873750 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-sjg9x" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.874130 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.874582 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-lw2n7"] Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.875449 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-sq6kr"] Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.875700 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-lw2n7" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.876305 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-5pjw9"] Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.876443 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-sq6kr" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.876709 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-5pjw9" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.877235 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566575-t4mvb"] Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.877791 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566575-t4mvb" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.878386 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-qgf92"] Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.879112 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-mx9nb"] Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.880487 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-fcqtt"] Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.880991 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-f47pf"] Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.882053 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-qq7hw"] Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.883128 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-s5fmp"] Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.883460 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.885051 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-6bs4d"] Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.885770 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-vc9fr"] Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.886824 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-vc9fr" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.886946 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-mmtf2"] Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.888021 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-d5tc5"] Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.888859 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-9wjwh"] Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.890093 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-spjl5"] Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.891066 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-spjl5" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.892766 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-5dpx6"] Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.893915 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-687pt"] Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.895232 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9hf5p"] Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.898179 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2hc2d"] Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.901129 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.902642 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-p8nlt"] Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.902746 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5a7114dc-66fa-4519-a0b1-83e34b999f9b-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-p8nlt\" (UID: \"5a7114dc-66fa-4519-a0b1-83e34b999f9b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-p8nlt" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.902793 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6a89ca6c-6d44-4fa5-b728-09aa58890f99-config\") pod \"service-ca-operator-777779d784-t7qmg\" (UID: \"6a89ca6c-6d44-4fa5-b728-09aa58890f99\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-t7qmg" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.902838 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f20fdd99-f4e0-4435-98ef-7140cc1c9ca4-trusted-ca-bundle\") pod \"apiserver-76f77b778f-fcqtt\" (UID: \"f20fdd99-f4e0-4435-98ef-7140cc1c9ca4\") " pod="openshift-apiserver/apiserver-76f77b778f-fcqtt" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.902941 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a21d7057-af5a-499a-b71f-3cae88b6883e-config\") pod \"route-controller-manager-6576b87f9c-6bs4d\" (UID: \"a21d7057-af5a-499a-b71f-3cae88b6883e\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6bs4d" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.902971 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d528a1a3-29b5-4b67-a9bd-e7102d857426-config\") pod \"openshift-apiserver-operator-796bbdcf4f-s5fmp\" (UID: \"d528a1a3-29b5-4b67-a9bd-e7102d857426\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-s5fmp" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.903006 4903 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/f20fdd99-f4e0-4435-98ef-7140cc1c9ca4-etcd-serving-ca\") pod \"apiserver-76f77b778f-fcqtt\" (UID: \"f20fdd99-f4e0-4435-98ef-7140cc1c9ca4\") " pod="openshift-apiserver/apiserver-76f77b778f-fcqtt" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.903167 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/82c317fb-5d87-47b3-849c-58b0bab4d3ef-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-5dpx6\" (UID: \"82c317fb-5d87-47b3-849c-58b0bab4d3ef\") " pod="openshift-authentication/oauth-openshift-558db77b4-5dpx6" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.903224 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/ce1c215a-f629-4ec9-ac9b-100699560ce1-profile-collector-cert\") pod \"olm-operator-6b444d44fb-k27wn\" (UID: \"ce1c215a-f629-4ec9-ac9b-100699560ce1\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-k27wn" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.903254 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/548a9dd8-86b6-4fca-b0f8-29ac9a0edb72-config\") pod \"machine-approver-56656f9798-xzjq9\" (UID: \"548a9dd8-86b6-4fca-b0f8-29ac9a0edb72\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xzjq9" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.903474 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0b0221f5-84f8-47b9-bab3-934c1890fb2f-metrics-certs\") pod \"router-default-5444994796-cc4f6\" (UID: \"0b0221f5-84f8-47b9-bab3-934c1890fb2f\") " pod="openshift-ingress/router-default-5444994796-cc4f6" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.903509 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j9jz6\" (UniqueName: \"kubernetes.io/projected/21c3415f-b025-475e-a603-76329416df23-kube-api-access-j9jz6\") pod \"cluster-image-registry-operator-dc59b4c8b-7gchs\" (UID: \"21c3415f-b025-475e-a603-76329416df23\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7gchs" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.903534 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a21d7057-af5a-499a-b71f-3cae88b6883e-client-ca\") pod \"route-controller-manager-6576b87f9c-6bs4d\" (UID: \"a21d7057-af5a-499a-b71f-3cae88b6883e\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6bs4d" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.903766 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/82c317fb-5d87-47b3-849c-58b0bab4d3ef-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-5dpx6\" (UID: \"82c317fb-5d87-47b3-849c-58b0bab4d3ef\") " pod="openshift-authentication/oauth-openshift-558db77b4-5dpx6" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.903805 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6534f377-ba61-4952-b722-1dcfe94df2ea-serving-cert\") pod \"etcd-operator-b45778765-9wjwh\" (UID: \"6534f377-ba61-4952-b722-1dcfe94df2ea\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9wjwh" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.903837 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fsrjz\" (UniqueName: \"kubernetes.io/projected/a21d7057-af5a-499a-b71f-3cae88b6883e-kube-api-access-fsrjz\") pod \"route-controller-manager-6576b87f9c-6bs4d\" (UID: \"a21d7057-af5a-499a-b71f-3cae88b6883e\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6bs4d" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.903865 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/82c317fb-5d87-47b3-849c-58b0bab4d3ef-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-5dpx6\" (UID: \"82c317fb-5d87-47b3-849c-58b0bab4d3ef\") " pod="openshift-authentication/oauth-openshift-558db77b4-5dpx6" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.904174 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/0b0221f5-84f8-47b9-bab3-934c1890fb2f-default-certificate\") pod \"router-default-5444994796-cc4f6\" (UID: \"0b0221f5-84f8-47b9-bab3-934c1890fb2f\") " pod="openshift-ingress/router-default-5444994796-cc4f6" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.904201 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2bjcb\" (UniqueName: \"kubernetes.io/projected/ce1c215a-f629-4ec9-ac9b-100699560ce1-kube-api-access-2bjcb\") pod \"olm-operator-6b444d44fb-k27wn\" (UID: \"ce1c215a-f629-4ec9-ac9b-100699560ce1\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-k27wn" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.904225 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/f20fdd99-f4e0-4435-98ef-7140cc1c9ca4-etcd-client\") pod \"apiserver-76f77b778f-fcqtt\" (UID: \"f20fdd99-f4e0-4435-98ef-7140cc1c9ca4\") " pod="openshift-apiserver/apiserver-76f77b778f-fcqtt" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.904489 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/211b0d80-1cec-40f5-8679-91466edeb1a1-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-622vk\" (UID: \"211b0d80-1cec-40f5-8679-91466edeb1a1\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-622vk" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.904518 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/6534f377-ba61-4952-b722-1dcfe94df2ea-etcd-client\") pod \"etcd-operator-b45778765-9wjwh\" (UID: \"6534f377-ba61-4952-b722-1dcfe94df2ea\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9wjwh" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.904544 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/0b2ccb33-e3d5-46b8-863a-21ec88f7fb49-metrics-tls\") pod \"ingress-operator-5b745b69d9-h2jwr\" (UID: \"0b2ccb33-e3d5-46b8-863a-21ec88f7fb49\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-h2jwr" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.904633 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/f20fdd99-f4e0-4435-98ef-7140cc1c9ca4-etcd-serving-ca\") pod \"apiserver-76f77b778f-fcqtt\" (UID: \"f20fdd99-f4e0-4435-98ef-7140cc1c9ca4\") " pod="openshift-apiserver/apiserver-76f77b778f-fcqtt" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.904721 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f5e9a2fd-a5ab-4db3-9363-e4e94745c6ea-console-oauth-config\") pod \"console-f9d7485db-h4m4s\" (UID: \"f5e9a2fd-a5ab-4db3-9363-e4e94745c6ea\") " pod="openshift-console/console-f9d7485db-h4m4s" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.904759 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5a7114dc-66fa-4519-a0b1-83e34b999f9b-config\") pod \"kube-controller-manager-operator-78b949d7b-p8nlt\" (UID: \"5a7114dc-66fa-4519-a0b1-83e34b999f9b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-p8nlt" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.904815 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4218cb22-8dd6-45fd-8091-635dd8a305bf-config\") pod \"authentication-operator-69f744f599-qq7hw\" (UID: \"4218cb22-8dd6-45fd-8091-635dd8a305bf\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-qq7hw" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.904974 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b9b6ad1a-4ed6-4df4-8f27-a8eb8d6e8511-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-9hf5p\" (UID: \"b9b6ad1a-4ed6-4df4-8f27-a8eb8d6e8511\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9hf5p" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.905012 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6a89ca6c-6d44-4fa5-b728-09aa58890f99-serving-cert\") pod \"service-ca-operator-777779d784-t7qmg\" (UID: \"6a89ca6c-6d44-4fa5-b728-09aa58890f99\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-t7qmg" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.905098 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d528a1a3-29b5-4b67-a9bd-e7102d857426-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-s5fmp\" (UID: \"d528a1a3-29b5-4b67-a9bd-e7102d857426\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-s5fmp" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.905286 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/211b0d80-1cec-40f5-8679-91466edeb1a1-serving-cert\") pod 
\"kube-storage-version-migrator-operator-b67b599dd-622vk\" (UID: \"211b0d80-1cec-40f5-8679-91466edeb1a1\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-622vk" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.905321 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5a7114dc-66fa-4519-a0b1-83e34b999f9b-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-p8nlt\" (UID: \"5a7114dc-66fa-4519-a0b1-83e34b999f9b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-p8nlt" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.905359 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x7gr7\" (UniqueName: \"kubernetes.io/projected/7c273a7a-00f1-4e28-a9a6-69d240f2df29-kube-api-access-x7gr7\") pod \"downloads-7954f5f757-n4g5v\" (UID: \"7c273a7a-00f1-4e28-a9a6-69d240f2df29\") " pod="openshift-console/downloads-7954f5f757-n4g5v" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.905457 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-44n9b\" (UniqueName: \"kubernetes.io/projected/3211e594-bfe4-4eaf-ba13-48dbd7c8cf5e-kube-api-access-44n9b\") pod \"package-server-manager-789f6589d5-x8c9s\" (UID: \"3211e594-bfe4-4eaf-ba13-48dbd7c8cf5e\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-x8c9s" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.905490 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e9e65e64-f0de-46b5-a383-ccb51e989ba1-metrics-tls\") pod \"dns-operator-744455d44c-pnnxx\" (UID: \"e9e65e64-f0de-46b5-a383-ccb51e989ba1\") " pod="openshift-dns-operator/dns-operator-744455d44c-pnnxx" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.905515 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5wn7k\" (UniqueName: \"kubernetes.io/projected/0b0221f5-84f8-47b9-bab3-934c1890fb2f-kube-api-access-5wn7k\") pod \"router-default-5444994796-cc4f6\" (UID: \"0b0221f5-84f8-47b9-bab3-934c1890fb2f\") " pod="openshift-ingress/router-default-5444994796-cc4f6" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.905539 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b45a827-576d-4fbb-87f4-4ae4d8af634e-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-2hc2d\" (UID: \"9b45a827-576d-4fbb-87f4-4ae4d8af634e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2hc2d" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.905651 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zp7v8\" (UniqueName: \"kubernetes.io/projected/548a9dd8-86b6-4fca-b0f8-29ac9a0edb72-kube-api-access-zp7v8\") pod \"machine-approver-56656f9798-xzjq9\" (UID: \"548a9dd8-86b6-4fca-b0f8-29ac9a0edb72\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xzjq9" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.905679 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: 
\"kubernetes.io/configmap/60e17360-c395-4a9b-b371-2aeeccf0b70f-signing-cabundle\") pod \"service-ca-9c57cc56f-5v8xb\" (UID: \"60e17360-c395-4a9b-b371-2aeeccf0b70f\") " pod="openshift-service-ca/service-ca-9c57cc56f-5v8xb" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.905683 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a21d7057-af5a-499a-b71f-3cae88b6883e-config\") pod \"route-controller-manager-6576b87f9c-6bs4d\" (UID: \"a21d7057-af5a-499a-b71f-3cae88b6883e\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6bs4d" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.905704 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b9b6ad1a-4ed6-4df4-8f27-a8eb8d6e8511-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-9hf5p\" (UID: \"b9b6ad1a-4ed6-4df4-8f27-a8eb8d6e8511\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9hf5p" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.905948 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c5dfq\" (UniqueName: \"kubernetes.io/projected/9b45a827-576d-4fbb-87f4-4ae4d8af634e-kube-api-access-c5dfq\") pod \"openshift-controller-manager-operator-756b6f6bc6-2hc2d\" (UID: \"9b45a827-576d-4fbb-87f4-4ae4d8af634e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2hc2d" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.905980 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/21c3415f-b025-475e-a603-76329416df23-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-7gchs\" (UID: \"21c3415f-b025-475e-a603-76329416df23\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7gchs" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.906009 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/fe5d5814-39d5-43a9-9352-f47c80603c75-available-featuregates\") pod \"openshift-config-operator-7777fb866f-d5tc5\" (UID: \"fe5d5814-39d5-43a9-9352-f47c80603c75\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-d5tc5" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.906068 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/82c317fb-5d87-47b3-849c-58b0bab4d3ef-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-5dpx6\" (UID: \"82c317fb-5d87-47b3-849c-58b0bab4d3ef\") " pod="openshift-authentication/oauth-openshift-558db77b4-5dpx6" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.906110 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4218cb22-8dd6-45fd-8091-635dd8a305bf-service-ca-bundle\") pod \"authentication-operator-69f744f599-qq7hw\" (UID: \"4218cb22-8dd6-45fd-8091-635dd8a305bf\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-qq7hw" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.906144 4903 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hfccx\" (UniqueName: \"kubernetes.io/projected/60e17360-c395-4a9b-b371-2aeeccf0b70f-kube-api-access-hfccx\") pod \"service-ca-9c57cc56f-5v8xb\" (UID: \"60e17360-c395-4a9b-b371-2aeeccf0b70f\") " pod="openshift-service-ca/service-ca-9c57cc56f-5v8xb" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.906405 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/f20fdd99-f4e0-4435-98ef-7140cc1c9ca4-node-pullsecrets\") pod \"apiserver-76f77b778f-fcqtt\" (UID: \"f20fdd99-f4e0-4435-98ef-7140cc1c9ca4\") " pod="openshift-apiserver/apiserver-76f77b778f-fcqtt" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.906446 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/96e5b0d7-c2bd-4df6-8b65-81c0ddf71995-apiservice-cert\") pod \"packageserver-d55dfcdfc-fhkp9\" (UID: \"96e5b0d7-c2bd-4df6-8b65-81c0ddf71995\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fhkp9" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.906483 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/82c317fb-5d87-47b3-849c-58b0bab4d3ef-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-5dpx6\" (UID: \"82c317fb-5d87-47b3-849c-58b0bab4d3ef\") " pod="openshift-authentication/oauth-openshift-558db77b4-5dpx6" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.906533 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-6smwd"] Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.906713 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/82c317fb-5d87-47b3-849c-58b0bab4d3ef-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-5dpx6\" (UID: \"82c317fb-5d87-47b3-849c-58b0bab4d3ef\") " pod="openshift-authentication/oauth-openshift-558db77b4-5dpx6" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.906751 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/f20fdd99-f4e0-4435-98ef-7140cc1c9ca4-encryption-config\") pod \"apiserver-76f77b778f-fcqtt\" (UID: \"f20fdd99-f4e0-4435-98ef-7140cc1c9ca4\") " pod="openshift-apiserver/apiserver-76f77b778f-fcqtt" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.906784 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f20fdd99-f4e0-4435-98ef-7140cc1c9ca4-audit-dir\") pod \"apiserver-76f77b778f-fcqtt\" (UID: \"f20fdd99-f4e0-4435-98ef-7140cc1c9ca4\") " pod="openshift-apiserver/apiserver-76f77b778f-fcqtt" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.907047 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f5e9a2fd-a5ab-4db3-9363-e4e94745c6ea-console-serving-cert\") pod \"console-f9d7485db-h4m4s\" (UID: \"f5e9a2fd-a5ab-4db3-9363-e4e94745c6ea\") " pod="openshift-console/console-f9d7485db-h4m4s" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.907090 4903 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g78nn\" (UniqueName: \"kubernetes.io/projected/82c317fb-5d87-47b3-849c-58b0bab4d3ef-kube-api-access-g78nn\") pod \"oauth-openshift-558db77b4-5dpx6\" (UID: \"82c317fb-5d87-47b3-849c-58b0bab4d3ef\") " pod="openshift-authentication/oauth-openshift-558db77b4-5dpx6" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.907125 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0568b2dd-f5ca-44ae-992d-1a5ed2e998fb-config\") pod \"kube-apiserver-operator-766d6c64bb-687pt\" (UID: \"0568b2dd-f5ca-44ae-992d-1a5ed2e998fb\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-687pt" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.906945 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d528a1a3-29b5-4b67-a9bd-e7102d857426-config\") pod \"openshift-apiserver-operator-796bbdcf4f-s5fmp\" (UID: \"d528a1a3-29b5-4b67-a9bd-e7102d857426\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-s5fmp" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.907360 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qwdv4\" (UniqueName: \"kubernetes.io/projected/6534f377-ba61-4952-b722-1dcfe94df2ea-kube-api-access-qwdv4\") pod \"etcd-operator-b45778765-9wjwh\" (UID: \"6534f377-ba61-4952-b722-1dcfe94df2ea\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9wjwh" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.907411 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/96e5b0d7-c2bd-4df6-8b65-81c0ddf71995-tmpfs\") pod \"packageserver-d55dfcdfc-fhkp9\" (UID: \"96e5b0d7-c2bd-4df6-8b65-81c0ddf71995\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fhkp9" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.907446 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0b0221f5-84f8-47b9-bab3-934c1890fb2f-service-ca-bundle\") pod \"router-default-5444994796-cc4f6\" (UID: \"0b0221f5-84f8-47b9-bab3-934c1890fb2f\") " pod="openshift-ingress/router-default-5444994796-cc4f6" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.907473 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/82c317fb-5d87-47b3-849c-58b0bab4d3ef-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-5dpx6\" (UID: \"82c317fb-5d87-47b3-849c-58b0bab4d3ef\") " pod="openshift-authentication/oauth-openshift-558db77b4-5dpx6" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.907728 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0b2ccb33-e3d5-46b8-863a-21ec88f7fb49-bound-sa-token\") pod \"ingress-operator-5b745b69d9-h2jwr\" (UID: \"0b2ccb33-e3d5-46b8-863a-21ec88f7fb49\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-h2jwr" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.907759 4903 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/f20fdd99-f4e0-4435-98ef-7140cc1c9ca4-image-import-ca\") pod \"apiserver-76f77b778f-fcqtt\" (UID: \"f20fdd99-f4e0-4435-98ef-7140cc1c9ca4\") " pod="openshift-apiserver/apiserver-76f77b778f-fcqtt" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.907788 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/0b0221f5-84f8-47b9-bab3-934c1890fb2f-stats-auth\") pod \"router-default-5444994796-cc4f6\" (UID: \"0b0221f5-84f8-47b9-bab3-934c1890fb2f\") " pod="openshift-ingress/router-default-5444994796-cc4f6" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.908122 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f5e9a2fd-a5ab-4db3-9363-e4e94745c6ea-trusted-ca-bundle\") pod \"console-f9d7485db-h4m4s\" (UID: \"f5e9a2fd-a5ab-4db3-9363-e4e94745c6ea\") " pod="openshift-console/console-f9d7485db-h4m4s" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.908160 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f5e9a2fd-a5ab-4db3-9363-e4e94745c6ea-oauth-serving-cert\") pod \"console-f9d7485db-h4m4s\" (UID: \"f5e9a2fd-a5ab-4db3-9363-e4e94745c6ea\") " pod="openshift-console/console-f9d7485db-h4m4s" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.908190 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9k9kf\" (UniqueName: \"kubernetes.io/projected/d528a1a3-29b5-4b67-a9bd-e7102d857426-kube-api-access-9k9kf\") pod \"openshift-apiserver-operator-796bbdcf4f-s5fmp\" (UID: \"d528a1a3-29b5-4b67-a9bd-e7102d857426\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-s5fmp" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.908302 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4218cb22-8dd6-45fd-8091-635dd8a305bf-serving-cert\") pod \"authentication-operator-69f744f599-qq7hw\" (UID: \"4218cb22-8dd6-45fd-8091-635dd8a305bf\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-qq7hw" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.908329 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-69nsw\" (UniqueName: \"kubernetes.io/projected/6a89ca6c-6d44-4fa5-b728-09aa58890f99-kube-api-access-69nsw\") pod \"service-ca-operator-777779d784-t7qmg\" (UID: \"6a89ca6c-6d44-4fa5-b728-09aa58890f99\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-t7qmg" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.908384 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a21d7057-af5a-499a-b71f-3cae88b6883e-serving-cert\") pod \"route-controller-manager-6576b87f9c-6bs4d\" (UID: \"a21d7057-af5a-499a-b71f-3cae88b6883e\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6bs4d" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.908415 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qfhhr\" (UniqueName: 
\"kubernetes.io/projected/f5e9a2fd-a5ab-4db3-9363-e4e94745c6ea-kube-api-access-qfhhr\") pod \"console-f9d7485db-h4m4s\" (UID: \"f5e9a2fd-a5ab-4db3-9363-e4e94745c6ea\") " pod="openshift-console/console-f9d7485db-h4m4s" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.908444 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b9b6ad1a-4ed6-4df4-8f27-a8eb8d6e8511-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-9hf5p\" (UID: \"b9b6ad1a-4ed6-4df4-8f27-a8eb8d6e8511\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9hf5p" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.908490 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/6534f377-ba61-4952-b722-1dcfe94df2ea-etcd-ca\") pod \"etcd-operator-b45778765-9wjwh\" (UID: \"6534f377-ba61-4952-b722-1dcfe94df2ea\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9wjwh" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.908513 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f20fdd99-f4e0-4435-98ef-7140cc1c9ca4-config\") pod \"apiserver-76f77b778f-fcqtt\" (UID: \"f20fdd99-f4e0-4435-98ef-7140cc1c9ca4\") " pod="openshift-apiserver/apiserver-76f77b778f-fcqtt" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.908554 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9b45a827-576d-4fbb-87f4-4ae4d8af634e-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-2hc2d\" (UID: \"9b45a827-576d-4fbb-87f4-4ae4d8af634e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2hc2d" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.908579 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0568b2dd-f5ca-44ae-992d-1a5ed2e998fb-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-687pt\" (UID: \"0568b2dd-f5ca-44ae-992d-1a5ed2e998fb\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-687pt" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.908606 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4218cb22-8dd6-45fd-8091-635dd8a305bf-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-qq7hw\" (UID: \"4218cb22-8dd6-45fd-8091-635dd8a305bf\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-qq7hw" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.908767 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fe5d5814-39d5-43a9-9352-f47c80603c75-serving-cert\") pod \"openshift-config-operator-7777fb866f-d5tc5\" (UID: \"fe5d5814-39d5-43a9-9352-f47c80603c75\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-d5tc5" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.908798 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/21c3415f-b025-475e-a603-76329416df23-bound-sa-token\") pod 
\"cluster-image-registry-operator-dc59b4c8b-7gchs\" (UID: \"21c3415f-b025-475e-a603-76329416df23\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7gchs" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.908826 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/60e17360-c395-4a9b-b371-2aeeccf0b70f-signing-key\") pod \"service-ca-9c57cc56f-5v8xb\" (UID: \"60e17360-c395-4a9b-b371-2aeeccf0b70f\") " pod="openshift-service-ca/service-ca-9c57cc56f-5v8xb" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.909465 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/548a9dd8-86b6-4fca-b0f8-29ac9a0edb72-auth-proxy-config\") pod \"machine-approver-56656f9798-xzjq9\" (UID: \"548a9dd8-86b6-4fca-b0f8-29ac9a0edb72\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xzjq9" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.909518 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/21c3415f-b025-475e-a603-76329416df23-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-7gchs\" (UID: \"21c3415f-b025-475e-a603-76329416df23\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7gchs" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.909556 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wg8dw\" (UniqueName: \"kubernetes.io/projected/fe5d5814-39d5-43a9-9352-f47c80603c75-kube-api-access-wg8dw\") pod \"openshift-config-operator-7777fb866f-d5tc5\" (UID: \"fe5d5814-39d5-43a9-9352-f47c80603c75\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-d5tc5" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.909704 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/82c317fb-5d87-47b3-849c-58b0bab4d3ef-audit-dir\") pod \"oauth-openshift-558db77b4-5dpx6\" (UID: \"82c317fb-5d87-47b3-849c-58b0bab4d3ef\") " pod="openshift-authentication/oauth-openshift-558db77b4-5dpx6" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.909732 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/6534f377-ba61-4952-b722-1dcfe94df2ea-etcd-service-ca\") pod \"etcd-operator-b45778765-9wjwh\" (UID: \"6534f377-ba61-4952-b722-1dcfe94df2ea\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9wjwh" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.909768 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/96e5b0d7-c2bd-4df6-8b65-81c0ddf71995-webhook-cert\") pod \"packageserver-d55dfcdfc-fhkp9\" (UID: \"96e5b0d7-c2bd-4df6-8b65-81c0ddf71995\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fhkp9" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.909902 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/1e7c0553-965b-49eb-8bcd-ad7e2096373c-samples-operator-tls\") pod 
\"cluster-samples-operator-665b6dd947-mmtf2\" (UID: \"1e7c0553-965b-49eb-8bcd-ad7e2096373c\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-mmtf2" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.909937 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m7csq\" (UniqueName: \"kubernetes.io/projected/0b2ccb33-e3d5-46b8-863a-21ec88f7fb49-kube-api-access-m7csq\") pod \"ingress-operator-5b745b69d9-h2jwr\" (UID: \"0b2ccb33-e3d5-46b8-863a-21ec88f7fb49\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-h2jwr" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.909995 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zhmqw\" (UniqueName: \"kubernetes.io/projected/96e5b0d7-c2bd-4df6-8b65-81c0ddf71995-kube-api-access-zhmqw\") pod \"packageserver-d55dfcdfc-fhkp9\" (UID: \"96e5b0d7-c2bd-4df6-8b65-81c0ddf71995\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fhkp9" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.910105 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/82c317fb-5d87-47b3-849c-58b0bab4d3ef-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-5dpx6\" (UID: \"82c317fb-5d87-47b3-849c-58b0bab4d3ef\") " pod="openshift-authentication/oauth-openshift-558db77b4-5dpx6" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.910117 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4218cb22-8dd6-45fd-8091-635dd8a305bf-config\") pod \"authentication-operator-69f744f599-qq7hw\" (UID: \"4218cb22-8dd6-45fd-8091-635dd8a305bf\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-qq7hw" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.910139 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/82c317fb-5d87-47b3-849c-58b0bab4d3ef-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-5dpx6\" (UID: \"82c317fb-5d87-47b3-849c-58b0bab4d3ef\") " pod="openshift-authentication/oauth-openshift-558db77b4-5dpx6" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.910671 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f20fdd99-f4e0-4435-98ef-7140cc1c9ca4-trusted-ca-bundle\") pod \"apiserver-76f77b778f-fcqtt\" (UID: \"f20fdd99-f4e0-4435-98ef-7140cc1c9ca4\") " pod="openshift-apiserver/apiserver-76f77b778f-fcqtt" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.910761 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/fe5d5814-39d5-43a9-9352-f47c80603c75-available-featuregates\") pod \"openshift-config-operator-7777fb866f-d5tc5\" (UID: \"fe5d5814-39d5-43a9-9352-f47c80603c75\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-d5tc5" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.910314 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/3211e594-bfe4-4eaf-ba13-48dbd7c8cf5e-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-x8c9s\" (UID: \"3211e594-bfe4-4eaf-ba13-48dbd7c8cf5e\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-x8c9s" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.911057 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k266g\" (UniqueName: \"kubernetes.io/projected/f20fdd99-f4e0-4435-98ef-7140cc1c9ca4-kube-api-access-k266g\") pod \"apiserver-76f77b778f-fcqtt\" (UID: \"f20fdd99-f4e0-4435-98ef-7140cc1c9ca4\") " pod="openshift-apiserver/apiserver-76f77b778f-fcqtt" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.911104 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wqs4b\" (UniqueName: \"kubernetes.io/projected/1e7c0553-965b-49eb-8bcd-ad7e2096373c-kube-api-access-wqs4b\") pod \"cluster-samples-operator-665b6dd947-mmtf2\" (UID: \"1e7c0553-965b-49eb-8bcd-ad7e2096373c\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-mmtf2" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.911412 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/82c317fb-5d87-47b3-849c-58b0bab4d3ef-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-5dpx6\" (UID: \"82c317fb-5d87-47b3-849c-58b0bab4d3ef\") " pod="openshift-authentication/oauth-openshift-558db77b4-5dpx6" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.911901 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-2q9m5"] Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.912108 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-6smwd" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.912133 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4218cb22-8dd6-45fd-8091-635dd8a305bf-service-ca-bundle\") pod \"authentication-operator-69f744f599-qq7hw\" (UID: \"4218cb22-8dd6-45fd-8091-635dd8a305bf\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-qq7hw" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.912328 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/f20fdd99-f4e0-4435-98ef-7140cc1c9ca4-node-pullsecrets\") pod \"apiserver-76f77b778f-fcqtt\" (UID: \"f20fdd99-f4e0-4435-98ef-7140cc1c9ca4\") " pod="openshift-apiserver/apiserver-76f77b778f-fcqtt" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.912643 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d528a1a3-29b5-4b67-a9bd-e7102d857426-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-s5fmp\" (UID: \"d528a1a3-29b5-4b67-a9bd-e7102d857426\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-s5fmp" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.912892 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-skg8s"] Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.912952 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qpf2g\" (UniqueName: \"kubernetes.io/projected/e9e65e64-f0de-46b5-a383-ccb51e989ba1-kube-api-access-qpf2g\") pod \"dns-operator-744455d44c-pnnxx\" (UID: \"e9e65e64-f0de-46b5-a383-ccb51e989ba1\") " pod="openshift-dns-operator/dns-operator-744455d44c-pnnxx" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.913196 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/548a9dd8-86b6-4fca-b0f8-29ac9a0edb72-config\") pod \"machine-approver-56656f9798-xzjq9\" (UID: \"548a9dd8-86b6-4fca-b0f8-29ac9a0edb72\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xzjq9" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.913421 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f20fdd99-f4e0-4435-98ef-7140cc1c9ca4-audit-dir\") pod \"apiserver-76f77b778f-fcqtt\" (UID: \"f20fdd99-f4e0-4435-98ef-7140cc1c9ca4\") " pod="openshift-apiserver/apiserver-76f77b778f-fcqtt" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.913947 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/82c317fb-5d87-47b3-849c-58b0bab4d3ef-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-5dpx6\" (UID: \"82c317fb-5d87-47b3-849c-58b0bab4d3ef\") " pod="openshift-authentication/oauth-openshift-558db77b4-5dpx6" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.914277 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/548a9dd8-86b6-4fca-b0f8-29ac9a0edb72-machine-approver-tls\") pod \"machine-approver-56656f9798-xzjq9\" (UID: 
\"548a9dd8-86b6-4fca-b0f8-29ac9a0edb72\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xzjq9" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.915094 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/548a9dd8-86b6-4fca-b0f8-29ac9a0edb72-auth-proxy-config\") pod \"machine-approver-56656f9798-xzjq9\" (UID: \"548a9dd8-86b6-4fca-b0f8-29ac9a0edb72\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xzjq9" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.915826 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/f20fdd99-f4e0-4435-98ef-7140cc1c9ca4-encryption-config\") pod \"apiserver-76f77b778f-fcqtt\" (UID: \"f20fdd99-f4e0-4435-98ef-7140cc1c9ca4\") " pod="openshift-apiserver/apiserver-76f77b778f-fcqtt" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.915665 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/f20fdd99-f4e0-4435-98ef-7140cc1c9ca4-audit\") pod \"apiserver-76f77b778f-fcqtt\" (UID: \"f20fdd99-f4e0-4435-98ef-7140cc1c9ca4\") " pod="openshift-apiserver/apiserver-76f77b778f-fcqtt" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.915949 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0b2ccb33-e3d5-46b8-863a-21ec88f7fb49-trusted-ca\") pod \"ingress-operator-5b745b69d9-h2jwr\" (UID: \"0b2ccb33-e3d5-46b8-863a-21ec88f7fb49\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-h2jwr" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.916007 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6534f377-ba61-4952-b722-1dcfe94df2ea-config\") pod \"etcd-operator-b45778765-9wjwh\" (UID: \"6534f377-ba61-4952-b722-1dcfe94df2ea\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9wjwh" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.916386 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f5e9a2fd-a5ab-4db3-9363-e4e94745c6ea-console-config\") pod \"console-f9d7485db-h4m4s\" (UID: \"f5e9a2fd-a5ab-4db3-9363-e4e94745c6ea\") " pod="openshift-console/console-f9d7485db-h4m4s" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.916440 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f20fdd99-f4e0-4435-98ef-7140cc1c9ca4-serving-cert\") pod \"apiserver-76f77b778f-fcqtt\" (UID: \"f20fdd99-f4e0-4435-98ef-7140cc1c9ca4\") " pod="openshift-apiserver/apiserver-76f77b778f-fcqtt" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.918719 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/f20fdd99-f4e0-4435-98ef-7140cc1c9ca4-image-import-ca\") pod \"apiserver-76f77b778f-fcqtt\" (UID: \"f20fdd99-f4e0-4435-98ef-7140cc1c9ca4\") " pod="openshift-apiserver/apiserver-76f77b778f-fcqtt" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.918739 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/fe5d5814-39d5-43a9-9352-f47c80603c75-serving-cert\") pod \"openshift-config-operator-7777fb866f-d5tc5\" (UID: \"fe5d5814-39d5-43a9-9352-f47c80603c75\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-d5tc5" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.917842 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f20fdd99-f4e0-4435-98ef-7140cc1c9ca4-config\") pod \"apiserver-76f77b778f-fcqtt\" (UID: \"f20fdd99-f4e0-4435-98ef-7140cc1c9ca4\") " pod="openshift-apiserver/apiserver-76f77b778f-fcqtt" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.920610 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/f20fdd99-f4e0-4435-98ef-7140cc1c9ca4-audit\") pod \"apiserver-76f77b778f-fcqtt\" (UID: \"f20fdd99-f4e0-4435-98ef-7140cc1c9ca4\") " pod="openshift-apiserver/apiserver-76f77b778f-fcqtt" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.921936 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f20fdd99-f4e0-4435-98ef-7140cc1c9ca4-serving-cert\") pod \"apiserver-76f77b778f-fcqtt\" (UID: \"f20fdd99-f4e0-4435-98ef-7140cc1c9ca4\") " pod="openshift-apiserver/apiserver-76f77b778f-fcqtt" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.922190 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/548a9dd8-86b6-4fca-b0f8-29ac9a0edb72-machine-approver-tls\") pod \"machine-approver-56656f9798-xzjq9\" (UID: \"548a9dd8-86b6-4fca-b0f8-29ac9a0edb72\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xzjq9" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.923975 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/f20fdd99-f4e0-4435-98ef-7140cc1c9ca4-etcd-client\") pod \"apiserver-76f77b778f-fcqtt\" (UID: \"f20fdd99-f4e0-4435-98ef-7140cc1c9ca4\") " pod="openshift-apiserver/apiserver-76f77b778f-fcqtt" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.924303 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a21d7057-af5a-499a-b71f-3cae88b6883e-client-ca\") pod \"route-controller-manager-6576b87f9c-6bs4d\" (UID: \"a21d7057-af5a-499a-b71f-3cae88b6883e\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6bs4d" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.924439 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5z2q9\" (UniqueName: \"kubernetes.io/projected/211b0d80-1cec-40f5-8679-91466edeb1a1-kube-api-access-5z2q9\") pod \"kube-storage-version-migrator-operator-b67b599dd-622vk\" (UID: \"211b0d80-1cec-40f5-8679-91466edeb1a1\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-622vk" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.924586 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a21d7057-af5a-499a-b71f-3cae88b6883e-serving-cert\") pod \"route-controller-manager-6576b87f9c-6bs4d\" (UID: \"a21d7057-af5a-499a-b71f-3cae88b6883e\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6bs4d" Mar 20 08:25:13 crc 
kubenswrapper[4903]: I0320 08:25:13.925332 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/82c317fb-5d87-47b3-849c-58b0bab4d3ef-audit-policies\") pod \"oauth-openshift-558db77b4-5dpx6\" (UID: \"82c317fb-5d87-47b3-849c-58b0bab4d3ef\") " pod="openshift-authentication/oauth-openshift-558db77b4-5dpx6" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.925873 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f5e9a2fd-a5ab-4db3-9363-e4e94745c6ea-service-ca\") pod \"console-f9d7485db-h4m4s\" (UID: \"f5e9a2fd-a5ab-4db3-9363-e4e94745c6ea\") " pod="openshift-console/console-f9d7485db-h4m4s" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.925991 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2wxzz\" (UniqueName: \"kubernetes.io/projected/4218cb22-8dd6-45fd-8091-635dd8a305bf-kube-api-access-2wxzz\") pod \"authentication-operator-69f744f599-qq7hw\" (UID: \"4218cb22-8dd6-45fd-8091-635dd8a305bf\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-qq7hw" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.926171 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/ce1c215a-f629-4ec9-ac9b-100699560ce1-srv-cert\") pod \"olm-operator-6b444d44fb-k27wn\" (UID: \"ce1c215a-f629-4ec9-ac9b-100699560ce1\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-k27wn" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.926296 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0568b2dd-f5ca-44ae-992d-1a5ed2e998fb-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-687pt\" (UID: \"0568b2dd-f5ca-44ae-992d-1a5ed2e998fb\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-687pt" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.927074 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.927635 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-622vk"] Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.929008 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4218cb22-8dd6-45fd-8091-635dd8a305bf-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-qq7hw\" (UID: \"4218cb22-8dd6-45fd-8091-635dd8a305bf\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-qq7hw" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.929440 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-pnnxx"] Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.929934 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4218cb22-8dd6-45fd-8091-635dd8a305bf-serving-cert\") pod \"authentication-operator-69f744f599-qq7hw\" (UID: \"4218cb22-8dd6-45fd-8091-635dd8a305bf\") " 
pod="openshift-authentication-operator/authentication-operator-69f744f599-qq7hw" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.931146 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-h4m4s"] Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.933335 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7gchs"] Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.934170 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-h2jwr"] Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.936152 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fhkp9"] Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.937161 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-x8c9s"] Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.938221 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-k27wn"] Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.939281 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-t7qmg"] Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.940388 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-n4g5v"] Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.940430 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.941379 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-lw2n7"] Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.942811 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-zbxgq"] Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.943840 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-5v8xb"] Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.944543 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566575-t4mvb"] Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.945631 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-sjg9x"] Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.947973 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-9zwz2"] Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.948015 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-5pjw9"] Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.950183 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-vc9fr"] Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.950214 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-qqtv5"] Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.955775 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-sq6kr"] Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.957371 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-4zl69"] Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.958866 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-4zl69"] Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.958900 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-4zl69" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.959881 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-pjn9j"] Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.961245 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-pjn9j" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.965730 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-pjn9j"] Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.966075 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Mar 20 08:25:13 crc kubenswrapper[4903]: I0320 08:25:13.982149 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 20 08:25:14 crc kubenswrapper[4903]: I0320 08:25:14.000123 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 20 08:25:14 crc kubenswrapper[4903]: I0320 08:25:14.020558 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Mar 20 08:25:14 crc kubenswrapper[4903]: I0320 08:25:14.027540 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qfhhr\" (UniqueName: \"kubernetes.io/projected/f5e9a2fd-a5ab-4db3-9363-e4e94745c6ea-kube-api-access-qfhhr\") pod \"console-f9d7485db-h4m4s\" (UID: \"f5e9a2fd-a5ab-4db3-9363-e4e94745c6ea\") " pod="openshift-console/console-f9d7485db-h4m4s" Mar 20 08:25:14 crc kubenswrapper[4903]: I0320 08:25:14.027595 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b9b6ad1a-4ed6-4df4-8f27-a8eb8d6e8511-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-9hf5p\" (UID: \"b9b6ad1a-4ed6-4df4-8f27-a8eb8d6e8511\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9hf5p" Mar 20 08:25:14 crc kubenswrapper[4903]: I0320 08:25:14.027632 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/6534f377-ba61-4952-b722-1dcfe94df2ea-etcd-ca\") pod \"etcd-operator-b45778765-9wjwh\" (UID: \"6534f377-ba61-4952-b722-1dcfe94df2ea\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9wjwh" Mar 20 08:25:14 crc kubenswrapper[4903]: I0320 08:25:14.027662 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9b45a827-576d-4fbb-87f4-4ae4d8af634e-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-2hc2d\" (UID: \"9b45a827-576d-4fbb-87f4-4ae4d8af634e\") " 
pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2hc2d" Mar 20 08:25:14 crc kubenswrapper[4903]: I0320 08:25:14.027694 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0568b2dd-f5ca-44ae-992d-1a5ed2e998fb-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-687pt\" (UID: \"0568b2dd-f5ca-44ae-992d-1a5ed2e998fb\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-687pt" Mar 20 08:25:14 crc kubenswrapper[4903]: I0320 08:25:14.027718 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/21c3415f-b025-475e-a603-76329416df23-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-7gchs\" (UID: \"21c3415f-b025-475e-a603-76329416df23\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7gchs" Mar 20 08:25:14 crc kubenswrapper[4903]: I0320 08:25:14.027743 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/60e17360-c395-4a9b-b371-2aeeccf0b70f-signing-key\") pod \"service-ca-9c57cc56f-5v8xb\" (UID: \"60e17360-c395-4a9b-b371-2aeeccf0b70f\") " pod="openshift-service-ca/service-ca-9c57cc56f-5v8xb" Mar 20 08:25:14 crc kubenswrapper[4903]: I0320 08:25:14.027788 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/21c3415f-b025-475e-a603-76329416df23-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-7gchs\" (UID: \"21c3415f-b025-475e-a603-76329416df23\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7gchs" Mar 20 08:25:14 crc kubenswrapper[4903]: I0320 08:25:14.027846 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/82c317fb-5d87-47b3-849c-58b0bab4d3ef-audit-dir\") pod \"oauth-openshift-558db77b4-5dpx6\" (UID: \"82c317fb-5d87-47b3-849c-58b0bab4d3ef\") " pod="openshift-authentication/oauth-openshift-558db77b4-5dpx6" Mar 20 08:25:14 crc kubenswrapper[4903]: I0320 08:25:14.027869 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/1e7c0553-965b-49eb-8bcd-ad7e2096373c-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-mmtf2\" (UID: \"1e7c0553-965b-49eb-8bcd-ad7e2096373c\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-mmtf2" Mar 20 08:25:14 crc kubenswrapper[4903]: I0320 08:25:14.027895 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/6534f377-ba61-4952-b722-1dcfe94df2ea-etcd-service-ca\") pod \"etcd-operator-b45778765-9wjwh\" (UID: \"6534f377-ba61-4952-b722-1dcfe94df2ea\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9wjwh" Mar 20 08:25:14 crc kubenswrapper[4903]: I0320 08:25:14.027907 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/82c317fb-5d87-47b3-849c-58b0bab4d3ef-audit-dir\") pod \"oauth-openshift-558db77b4-5dpx6\" (UID: \"82c317fb-5d87-47b3-849c-58b0bab4d3ef\") " pod="openshift-authentication/oauth-openshift-558db77b4-5dpx6" Mar 20 08:25:14 crc kubenswrapper[4903]: I0320 08:25:14.027924 4903 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/96e5b0d7-c2bd-4df6-8b65-81c0ddf71995-webhook-cert\") pod \"packageserver-d55dfcdfc-fhkp9\" (UID: \"96e5b0d7-c2bd-4df6-8b65-81c0ddf71995\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fhkp9" Mar 20 08:25:14 crc kubenswrapper[4903]: I0320 08:25:14.027956 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m7csq\" (UniqueName: \"kubernetes.io/projected/0b2ccb33-e3d5-46b8-863a-21ec88f7fb49-kube-api-access-m7csq\") pod \"ingress-operator-5b745b69d9-h2jwr\" (UID: \"0b2ccb33-e3d5-46b8-863a-21ec88f7fb49\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-h2jwr" Mar 20 08:25:14 crc kubenswrapper[4903]: I0320 08:25:14.027988 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3211e594-bfe4-4eaf-ba13-48dbd7c8cf5e-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-x8c9s\" (UID: \"3211e594-bfe4-4eaf-ba13-48dbd7c8cf5e\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-x8c9s" Mar 20 08:25:14 crc kubenswrapper[4903]: I0320 08:25:14.028014 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zhmqw\" (UniqueName: \"kubernetes.io/projected/96e5b0d7-c2bd-4df6-8b65-81c0ddf71995-kube-api-access-zhmqw\") pod \"packageserver-d55dfcdfc-fhkp9\" (UID: \"96e5b0d7-c2bd-4df6-8b65-81c0ddf71995\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fhkp9" Mar 20 08:25:14 crc kubenswrapper[4903]: I0320 08:25:14.028057 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/82c317fb-5d87-47b3-849c-58b0bab4d3ef-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-5dpx6\" (UID: \"82c317fb-5d87-47b3-849c-58b0bab4d3ef\") " pod="openshift-authentication/oauth-openshift-558db77b4-5dpx6" Mar 20 08:25:14 crc kubenswrapper[4903]: I0320 08:25:14.028086 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/82c317fb-5d87-47b3-849c-58b0bab4d3ef-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-5dpx6\" (UID: \"82c317fb-5d87-47b3-849c-58b0bab4d3ef\") " pod="openshift-authentication/oauth-openshift-558db77b4-5dpx6" Mar 20 08:25:14 crc kubenswrapper[4903]: I0320 08:25:14.028125 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wqs4b\" (UniqueName: \"kubernetes.io/projected/1e7c0553-965b-49eb-8bcd-ad7e2096373c-kube-api-access-wqs4b\") pod \"cluster-samples-operator-665b6dd947-mmtf2\" (UID: \"1e7c0553-965b-49eb-8bcd-ad7e2096373c\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-mmtf2" Mar 20 08:25:14 crc kubenswrapper[4903]: I0320 08:25:14.028162 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qpf2g\" (UniqueName: \"kubernetes.io/projected/e9e65e64-f0de-46b5-a383-ccb51e989ba1-kube-api-access-qpf2g\") pod \"dns-operator-744455d44c-pnnxx\" (UID: \"e9e65e64-f0de-46b5-a383-ccb51e989ba1\") " pod="openshift-dns-operator/dns-operator-744455d44c-pnnxx" Mar 20 08:25:14 crc kubenswrapper[4903]: I0320 08:25:14.028189 4903 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/82c317fb-5d87-47b3-849c-58b0bab4d3ef-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-5dpx6\" (UID: \"82c317fb-5d87-47b3-849c-58b0bab4d3ef\") " pod="openshift-authentication/oauth-openshift-558db77b4-5dpx6" Mar 20 08:25:14 crc kubenswrapper[4903]: I0320 08:25:14.028258 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/82c317fb-5d87-47b3-849c-58b0bab4d3ef-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-5dpx6\" (UID: \"82c317fb-5d87-47b3-849c-58b0bab4d3ef\") " pod="openshift-authentication/oauth-openshift-558db77b4-5dpx6" Mar 20 08:25:14 crc kubenswrapper[4903]: I0320 08:25:14.028294 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0b2ccb33-e3d5-46b8-863a-21ec88f7fb49-trusted-ca\") pod \"ingress-operator-5b745b69d9-h2jwr\" (UID: \"0b2ccb33-e3d5-46b8-863a-21ec88f7fb49\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-h2jwr" Mar 20 08:25:14 crc kubenswrapper[4903]: I0320 08:25:14.028322 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6534f377-ba61-4952-b722-1dcfe94df2ea-config\") pod \"etcd-operator-b45778765-9wjwh\" (UID: \"6534f377-ba61-4952-b722-1dcfe94df2ea\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9wjwh" Mar 20 08:25:14 crc kubenswrapper[4903]: I0320 08:25:14.028351 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f5e9a2fd-a5ab-4db3-9363-e4e94745c6ea-console-config\") pod \"console-f9d7485db-h4m4s\" (UID: \"f5e9a2fd-a5ab-4db3-9363-e4e94745c6ea\") " pod="openshift-console/console-f9d7485db-h4m4s" Mar 20 08:25:14 crc kubenswrapper[4903]: I0320 08:25:14.028382 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5z2q9\" (UniqueName: \"kubernetes.io/projected/211b0d80-1cec-40f5-8679-91466edeb1a1-kube-api-access-5z2q9\") pod \"kube-storage-version-migrator-operator-b67b599dd-622vk\" (UID: \"211b0d80-1cec-40f5-8679-91466edeb1a1\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-622vk" Mar 20 08:25:14 crc kubenswrapper[4903]: I0320 08:25:14.028410 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/82c317fb-5d87-47b3-849c-58b0bab4d3ef-audit-policies\") pod \"oauth-openshift-558db77b4-5dpx6\" (UID: \"82c317fb-5d87-47b3-849c-58b0bab4d3ef\") " pod="openshift-authentication/oauth-openshift-558db77b4-5dpx6" Mar 20 08:25:14 crc kubenswrapper[4903]: I0320 08:25:14.028439 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f5e9a2fd-a5ab-4db3-9363-e4e94745c6ea-service-ca\") pod \"console-f9d7485db-h4m4s\" (UID: \"f5e9a2fd-a5ab-4db3-9363-e4e94745c6ea\") " pod="openshift-console/console-f9d7485db-h4m4s" Mar 20 08:25:14 crc kubenswrapper[4903]: I0320 08:25:14.028457 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/6534f377-ba61-4952-b722-1dcfe94df2ea-etcd-service-ca\") pod \"etcd-operator-b45778765-9wjwh\" (UID: 
\"6534f377-ba61-4952-b722-1dcfe94df2ea\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9wjwh" Mar 20 08:25:14 crc kubenswrapper[4903]: I0320 08:25:14.028467 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/ce1c215a-f629-4ec9-ac9b-100699560ce1-srv-cert\") pod \"olm-operator-6b444d44fb-k27wn\" (UID: \"ce1c215a-f629-4ec9-ac9b-100699560ce1\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-k27wn" Mar 20 08:25:14 crc kubenswrapper[4903]: I0320 08:25:14.028499 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/6534f377-ba61-4952-b722-1dcfe94df2ea-etcd-ca\") pod \"etcd-operator-b45778765-9wjwh\" (UID: \"6534f377-ba61-4952-b722-1dcfe94df2ea\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9wjwh" Mar 20 08:25:14 crc kubenswrapper[4903]: I0320 08:25:14.028508 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0568b2dd-f5ca-44ae-992d-1a5ed2e998fb-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-687pt\" (UID: \"0568b2dd-f5ca-44ae-992d-1a5ed2e998fb\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-687pt" Mar 20 08:25:14 crc kubenswrapper[4903]: I0320 08:25:14.028541 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5a7114dc-66fa-4519-a0b1-83e34b999f9b-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-p8nlt\" (UID: \"5a7114dc-66fa-4519-a0b1-83e34b999f9b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-p8nlt" Mar 20 08:25:14 crc kubenswrapper[4903]: I0320 08:25:14.028569 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6a89ca6c-6d44-4fa5-b728-09aa58890f99-config\") pod \"service-ca-operator-777779d784-t7qmg\" (UID: \"6a89ca6c-6d44-4fa5-b728-09aa58890f99\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-t7qmg" Mar 20 08:25:14 crc kubenswrapper[4903]: I0320 08:25:14.028600 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/82c317fb-5d87-47b3-849c-58b0bab4d3ef-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-5dpx6\" (UID: \"82c317fb-5d87-47b3-849c-58b0bab4d3ef\") " pod="openshift-authentication/oauth-openshift-558db77b4-5dpx6" Mar 20 08:25:14 crc kubenswrapper[4903]: I0320 08:25:14.028630 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/ce1c215a-f629-4ec9-ac9b-100699560ce1-profile-collector-cert\") pod \"olm-operator-6b444d44fb-k27wn\" (UID: \"ce1c215a-f629-4ec9-ac9b-100699560ce1\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-k27wn" Mar 20 08:25:14 crc kubenswrapper[4903]: I0320 08:25:14.028698 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0b0221f5-84f8-47b9-bab3-934c1890fb2f-metrics-certs\") pod \"router-default-5444994796-cc4f6\" (UID: \"0b0221f5-84f8-47b9-bab3-934c1890fb2f\") " pod="openshift-ingress/router-default-5444994796-cc4f6" Mar 20 08:25:14 crc kubenswrapper[4903]: I0320 08:25:14.028729 4903 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j9jz6\" (UniqueName: \"kubernetes.io/projected/21c3415f-b025-475e-a603-76329416df23-kube-api-access-j9jz6\") pod \"cluster-image-registry-operator-dc59b4c8b-7gchs\" (UID: \"21c3415f-b025-475e-a603-76329416df23\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7gchs" Mar 20 08:25:14 crc kubenswrapper[4903]: I0320 08:25:14.028787 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/82c317fb-5d87-47b3-849c-58b0bab4d3ef-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-5dpx6\" (UID: \"82c317fb-5d87-47b3-849c-58b0bab4d3ef\") " pod="openshift-authentication/oauth-openshift-558db77b4-5dpx6" Mar 20 08:25:14 crc kubenswrapper[4903]: I0320 08:25:14.028827 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6534f377-ba61-4952-b722-1dcfe94df2ea-serving-cert\") pod \"etcd-operator-b45778765-9wjwh\" (UID: \"6534f377-ba61-4952-b722-1dcfe94df2ea\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9wjwh" Mar 20 08:25:14 crc kubenswrapper[4903]: I0320 08:25:14.028854 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/82c317fb-5d87-47b3-849c-58b0bab4d3ef-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-5dpx6\" (UID: \"82c317fb-5d87-47b3-849c-58b0bab4d3ef\") " pod="openshift-authentication/oauth-openshift-558db77b4-5dpx6" Mar 20 08:25:14 crc kubenswrapper[4903]: I0320 08:25:14.028911 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2bjcb\" (UniqueName: \"kubernetes.io/projected/ce1c215a-f629-4ec9-ac9b-100699560ce1-kube-api-access-2bjcb\") pod \"olm-operator-6b444d44fb-k27wn\" (UID: \"ce1c215a-f629-4ec9-ac9b-100699560ce1\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-k27wn" Mar 20 08:25:14 crc kubenswrapper[4903]: I0320 08:25:14.028944 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/0b0221f5-84f8-47b9-bab3-934c1890fb2f-default-certificate\") pod \"router-default-5444994796-cc4f6\" (UID: \"0b0221f5-84f8-47b9-bab3-934c1890fb2f\") " pod="openshift-ingress/router-default-5444994796-cc4f6" Mar 20 08:25:14 crc kubenswrapper[4903]: I0320 08:25:14.028968 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/82c317fb-5d87-47b3-849c-58b0bab4d3ef-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-5dpx6\" (UID: \"82c317fb-5d87-47b3-849c-58b0bab4d3ef\") " pod="openshift-authentication/oauth-openshift-558db77b4-5dpx6" Mar 20 08:25:14 crc kubenswrapper[4903]: I0320 08:25:14.028973 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0b2ccb33-e3d5-46b8-863a-21ec88f7fb49-metrics-tls\") pod \"ingress-operator-5b745b69d9-h2jwr\" (UID: \"0b2ccb33-e3d5-46b8-863a-21ec88f7fb49\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-h2jwr" Mar 20 08:25:14 crc kubenswrapper[4903]: I0320 08:25:14.029062 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/211b0d80-1cec-40f5-8679-91466edeb1a1-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-622vk\" (UID: \"211b0d80-1cec-40f5-8679-91466edeb1a1\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-622vk" Mar 20 08:25:14 crc kubenswrapper[4903]: I0320 08:25:14.029828 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/82c317fb-5d87-47b3-849c-58b0bab4d3ef-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-5dpx6\" (UID: \"82c317fb-5d87-47b3-849c-58b0bab4d3ef\") " pod="openshift-authentication/oauth-openshift-558db77b4-5dpx6" Mar 20 08:25:14 crc kubenswrapper[4903]: I0320 08:25:14.030315 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6534f377-ba61-4952-b722-1dcfe94df2ea-config\") pod \"etcd-operator-b45778765-9wjwh\" (UID: \"6534f377-ba61-4952-b722-1dcfe94df2ea\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9wjwh" Mar 20 08:25:14 crc kubenswrapper[4903]: I0320 08:25:14.030439 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/82c317fb-5d87-47b3-849c-58b0bab4d3ef-audit-policies\") pod \"oauth-openshift-558db77b4-5dpx6\" (UID: \"82c317fb-5d87-47b3-849c-58b0bab4d3ef\") " pod="openshift-authentication/oauth-openshift-558db77b4-5dpx6" Mar 20 08:25:14 crc kubenswrapper[4903]: I0320 08:25:14.030499 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/6534f377-ba61-4952-b722-1dcfe94df2ea-etcd-client\") pod \"etcd-operator-b45778765-9wjwh\" (UID: \"6534f377-ba61-4952-b722-1dcfe94df2ea\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9wjwh" Mar 20 08:25:14 crc kubenswrapper[4903]: I0320 08:25:14.030526 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f5e9a2fd-a5ab-4db3-9363-e4e94745c6ea-console-config\") pod \"console-f9d7485db-h4m4s\" (UID: \"f5e9a2fd-a5ab-4db3-9363-e4e94745c6ea\") " pod="openshift-console/console-f9d7485db-h4m4s" Mar 20 08:25:14 crc kubenswrapper[4903]: I0320 08:25:14.030558 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f5e9a2fd-a5ab-4db3-9363-e4e94745c6ea-console-oauth-config\") pod \"console-f9d7485db-h4m4s\" (UID: \"f5e9a2fd-a5ab-4db3-9363-e4e94745c6ea\") " pod="openshift-console/console-f9d7485db-h4m4s" Mar 20 08:25:14 crc kubenswrapper[4903]: I0320 08:25:14.030593 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5a7114dc-66fa-4519-a0b1-83e34b999f9b-config\") pod \"kube-controller-manager-operator-78b949d7b-p8nlt\" (UID: \"5a7114dc-66fa-4519-a0b1-83e34b999f9b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-p8nlt" Mar 20 08:25:14 crc kubenswrapper[4903]: I0320 08:25:14.030626 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b9b6ad1a-4ed6-4df4-8f27-a8eb8d6e8511-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-9hf5p\" (UID: \"b9b6ad1a-4ed6-4df4-8f27-a8eb8d6e8511\") " 
pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9hf5p" Mar 20 08:25:14 crc kubenswrapper[4903]: I0320 08:25:14.030655 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6a89ca6c-6d44-4fa5-b728-09aa58890f99-serving-cert\") pod \"service-ca-operator-777779d784-t7qmg\" (UID: \"6a89ca6c-6d44-4fa5-b728-09aa58890f99\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-t7qmg" Mar 20 08:25:14 crc kubenswrapper[4903]: I0320 08:25:14.030721 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f5e9a2fd-a5ab-4db3-9363-e4e94745c6ea-service-ca\") pod \"console-f9d7485db-h4m4s\" (UID: \"f5e9a2fd-a5ab-4db3-9363-e4e94745c6ea\") " pod="openshift-console/console-f9d7485db-h4m4s" Mar 20 08:25:14 crc kubenswrapper[4903]: I0320 08:25:14.030722 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/211b0d80-1cec-40f5-8679-91466edeb1a1-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-622vk\" (UID: \"211b0d80-1cec-40f5-8679-91466edeb1a1\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-622vk" Mar 20 08:25:14 crc kubenswrapper[4903]: I0320 08:25:14.030773 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5a7114dc-66fa-4519-a0b1-83e34b999f9b-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-p8nlt\" (UID: \"5a7114dc-66fa-4519-a0b1-83e34b999f9b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-p8nlt" Mar 20 08:25:14 crc kubenswrapper[4903]: I0320 08:25:14.030787 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/1e7c0553-965b-49eb-8bcd-ad7e2096373c-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-mmtf2\" (UID: \"1e7c0553-965b-49eb-8bcd-ad7e2096373c\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-mmtf2" Mar 20 08:25:14 crc kubenswrapper[4903]: I0320 08:25:14.030804 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-44n9b\" (UniqueName: \"kubernetes.io/projected/3211e594-bfe4-4eaf-ba13-48dbd7c8cf5e-kube-api-access-44n9b\") pod \"package-server-manager-789f6589d5-x8c9s\" (UID: \"3211e594-bfe4-4eaf-ba13-48dbd7c8cf5e\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-x8c9s" Mar 20 08:25:14 crc kubenswrapper[4903]: I0320 08:25:14.030852 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e9e65e64-f0de-46b5-a383-ccb51e989ba1-metrics-tls\") pod \"dns-operator-744455d44c-pnnxx\" (UID: \"e9e65e64-f0de-46b5-a383-ccb51e989ba1\") " pod="openshift-dns-operator/dns-operator-744455d44c-pnnxx" Mar 20 08:25:14 crc kubenswrapper[4903]: I0320 08:25:14.030892 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x7gr7\" (UniqueName: \"kubernetes.io/projected/7c273a7a-00f1-4e28-a9a6-69d240f2df29-kube-api-access-x7gr7\") pod \"downloads-7954f5f757-n4g5v\" (UID: \"7c273a7a-00f1-4e28-a9a6-69d240f2df29\") " pod="openshift-console/downloads-7954f5f757-n4g5v" Mar 20 08:25:14 crc kubenswrapper[4903]: I0320 
08:25:14.030919 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5wn7k\" (UniqueName: \"kubernetes.io/projected/0b0221f5-84f8-47b9-bab3-934c1890fb2f-kube-api-access-5wn7k\") pod \"router-default-5444994796-cc4f6\" (UID: \"0b0221f5-84f8-47b9-bab3-934c1890fb2f\") " pod="openshift-ingress/router-default-5444994796-cc4f6" Mar 20 08:25:14 crc kubenswrapper[4903]: I0320 08:25:14.030943 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b45a827-576d-4fbb-87f4-4ae4d8af634e-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-2hc2d\" (UID: \"9b45a827-576d-4fbb-87f4-4ae4d8af634e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2hc2d" Mar 20 08:25:14 crc kubenswrapper[4903]: I0320 08:25:14.030966 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/60e17360-c395-4a9b-b371-2aeeccf0b70f-signing-cabundle\") pod \"service-ca-9c57cc56f-5v8xb\" (UID: \"60e17360-c395-4a9b-b371-2aeeccf0b70f\") " pod="openshift-service-ca/service-ca-9c57cc56f-5v8xb" Mar 20 08:25:14 crc kubenswrapper[4903]: I0320 08:25:14.030999 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b9b6ad1a-4ed6-4df4-8f27-a8eb8d6e8511-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-9hf5p\" (UID: \"b9b6ad1a-4ed6-4df4-8f27-a8eb8d6e8511\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9hf5p" Mar 20 08:25:14 crc kubenswrapper[4903]: I0320 08:25:14.031018 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c5dfq\" (UniqueName: \"kubernetes.io/projected/9b45a827-576d-4fbb-87f4-4ae4d8af634e-kube-api-access-c5dfq\") pod \"openshift-controller-manager-operator-756b6f6bc6-2hc2d\" (UID: \"9b45a827-576d-4fbb-87f4-4ae4d8af634e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2hc2d" Mar 20 08:25:14 crc kubenswrapper[4903]: I0320 08:25:14.031089 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/21c3415f-b025-475e-a603-76329416df23-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-7gchs\" (UID: \"21c3415f-b025-475e-a603-76329416df23\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7gchs" Mar 20 08:25:14 crc kubenswrapper[4903]: I0320 08:25:14.031114 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/82c317fb-5d87-47b3-849c-58b0bab4d3ef-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-5dpx6\" (UID: \"82c317fb-5d87-47b3-849c-58b0bab4d3ef\") " pod="openshift-authentication/oauth-openshift-558db77b4-5dpx6" Mar 20 08:25:14 crc kubenswrapper[4903]: I0320 08:25:14.031178 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hfccx\" (UniqueName: \"kubernetes.io/projected/60e17360-c395-4a9b-b371-2aeeccf0b70f-kube-api-access-hfccx\") pod \"service-ca-9c57cc56f-5v8xb\" (UID: \"60e17360-c395-4a9b-b371-2aeeccf0b70f\") " pod="openshift-service-ca/service-ca-9c57cc56f-5v8xb" Mar 20 08:25:14 crc kubenswrapper[4903]: I0320 08:25:14.031199 4903 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/96e5b0d7-c2bd-4df6-8b65-81c0ddf71995-apiservice-cert\") pod \"packageserver-d55dfcdfc-fhkp9\" (UID: \"96e5b0d7-c2bd-4df6-8b65-81c0ddf71995\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fhkp9" Mar 20 08:25:14 crc kubenswrapper[4903]: I0320 08:25:14.031219 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/82c317fb-5d87-47b3-849c-58b0bab4d3ef-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-5dpx6\" (UID: \"82c317fb-5d87-47b3-849c-58b0bab4d3ef\") " pod="openshift-authentication/oauth-openshift-558db77b4-5dpx6" Mar 20 08:25:14 crc kubenswrapper[4903]: I0320 08:25:14.031243 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/82c317fb-5d87-47b3-849c-58b0bab4d3ef-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-5dpx6\" (UID: \"82c317fb-5d87-47b3-849c-58b0bab4d3ef\") " pod="openshift-authentication/oauth-openshift-558db77b4-5dpx6" Mar 20 08:25:14 crc kubenswrapper[4903]: I0320 08:25:14.031262 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f5e9a2fd-a5ab-4db3-9363-e4e94745c6ea-console-serving-cert\") pod \"console-f9d7485db-h4m4s\" (UID: \"f5e9a2fd-a5ab-4db3-9363-e4e94745c6ea\") " pod="openshift-console/console-f9d7485db-h4m4s" Mar 20 08:25:14 crc kubenswrapper[4903]: I0320 08:25:14.031279 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g78nn\" (UniqueName: \"kubernetes.io/projected/82c317fb-5d87-47b3-849c-58b0bab4d3ef-kube-api-access-g78nn\") pod \"oauth-openshift-558db77b4-5dpx6\" (UID: \"82c317fb-5d87-47b3-849c-58b0bab4d3ef\") " pod="openshift-authentication/oauth-openshift-558db77b4-5dpx6" Mar 20 08:25:14 crc kubenswrapper[4903]: I0320 08:25:14.031299 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0568b2dd-f5ca-44ae-992d-1a5ed2e998fb-config\") pod \"kube-apiserver-operator-766d6c64bb-687pt\" (UID: \"0568b2dd-f5ca-44ae-992d-1a5ed2e998fb\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-687pt" Mar 20 08:25:14 crc kubenswrapper[4903]: I0320 08:25:14.031319 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qwdv4\" (UniqueName: \"kubernetes.io/projected/6534f377-ba61-4952-b722-1dcfe94df2ea-kube-api-access-qwdv4\") pod \"etcd-operator-b45778765-9wjwh\" (UID: \"6534f377-ba61-4952-b722-1dcfe94df2ea\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9wjwh" Mar 20 08:25:14 crc kubenswrapper[4903]: I0320 08:25:14.031337 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0b0221f5-84f8-47b9-bab3-934c1890fb2f-service-ca-bundle\") pod \"router-default-5444994796-cc4f6\" (UID: \"0b0221f5-84f8-47b9-bab3-934c1890fb2f\") " pod="openshift-ingress/router-default-5444994796-cc4f6" Mar 20 08:25:14 crc kubenswrapper[4903]: I0320 08:25:14.031353 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/96e5b0d7-c2bd-4df6-8b65-81c0ddf71995-tmpfs\") pod 
\"packageserver-d55dfcdfc-fhkp9\" (UID: \"96e5b0d7-c2bd-4df6-8b65-81c0ddf71995\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fhkp9" Mar 20 08:25:14 crc kubenswrapper[4903]: I0320 08:25:14.031373 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/82c317fb-5d87-47b3-849c-58b0bab4d3ef-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-5dpx6\" (UID: \"82c317fb-5d87-47b3-849c-58b0bab4d3ef\") " pod="openshift-authentication/oauth-openshift-558db77b4-5dpx6" Mar 20 08:25:14 crc kubenswrapper[4903]: I0320 08:25:14.031406 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0b2ccb33-e3d5-46b8-863a-21ec88f7fb49-bound-sa-token\") pod \"ingress-operator-5b745b69d9-h2jwr\" (UID: \"0b2ccb33-e3d5-46b8-863a-21ec88f7fb49\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-h2jwr" Mar 20 08:25:14 crc kubenswrapper[4903]: I0320 08:25:14.031438 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/0b0221f5-84f8-47b9-bab3-934c1890fb2f-stats-auth\") pod \"router-default-5444994796-cc4f6\" (UID: \"0b0221f5-84f8-47b9-bab3-934c1890fb2f\") " pod="openshift-ingress/router-default-5444994796-cc4f6" Mar 20 08:25:14 crc kubenswrapper[4903]: I0320 08:25:14.031459 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f5e9a2fd-a5ab-4db3-9363-e4e94745c6ea-trusted-ca-bundle\") pod \"console-f9d7485db-h4m4s\" (UID: \"f5e9a2fd-a5ab-4db3-9363-e4e94745c6ea\") " pod="openshift-console/console-f9d7485db-h4m4s" Mar 20 08:25:14 crc kubenswrapper[4903]: I0320 08:25:14.031475 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f5e9a2fd-a5ab-4db3-9363-e4e94745c6ea-oauth-serving-cert\") pod \"console-f9d7485db-h4m4s\" (UID: \"f5e9a2fd-a5ab-4db3-9363-e4e94745c6ea\") " pod="openshift-console/console-f9d7485db-h4m4s" Mar 20 08:25:14 crc kubenswrapper[4903]: I0320 08:25:14.031496 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-69nsw\" (UniqueName: \"kubernetes.io/projected/6a89ca6c-6d44-4fa5-b728-09aa58890f99-kube-api-access-69nsw\") pod \"service-ca-operator-777779d784-t7qmg\" (UID: \"6a89ca6c-6d44-4fa5-b728-09aa58890f99\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-t7qmg" Mar 20 08:25:14 crc kubenswrapper[4903]: I0320 08:25:14.031970 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b45a827-576d-4fbb-87f4-4ae4d8af634e-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-2hc2d\" (UID: \"9b45a827-576d-4fbb-87f4-4ae4d8af634e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2hc2d" Mar 20 08:25:14 crc kubenswrapper[4903]: I0320 08:25:14.032276 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/82c317fb-5d87-47b3-849c-58b0bab4d3ef-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-5dpx6\" (UID: \"82c317fb-5d87-47b3-849c-58b0bab4d3ef\") " pod="openshift-authentication/oauth-openshift-558db77b4-5dpx6" Mar 20 08:25:14 
crc kubenswrapper[4903]: I0320 08:25:14.032594 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/82c317fb-5d87-47b3-849c-58b0bab4d3ef-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-5dpx6\" (UID: \"82c317fb-5d87-47b3-849c-58b0bab4d3ef\") " pod="openshift-authentication/oauth-openshift-558db77b4-5dpx6" Mar 20 08:25:14 crc kubenswrapper[4903]: I0320 08:25:14.032602 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9b45a827-576d-4fbb-87f4-4ae4d8af634e-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-2hc2d\" (UID: \"9b45a827-576d-4fbb-87f4-4ae4d8af634e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2hc2d" Mar 20 08:25:14 crc kubenswrapper[4903]: I0320 08:25:14.032888 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6534f377-ba61-4952-b722-1dcfe94df2ea-serving-cert\") pod \"etcd-operator-b45778765-9wjwh\" (UID: \"6534f377-ba61-4952-b722-1dcfe94df2ea\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9wjwh" Mar 20 08:25:14 crc kubenswrapper[4903]: I0320 08:25:14.032921 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/6534f377-ba61-4952-b722-1dcfe94df2ea-etcd-client\") pod \"etcd-operator-b45778765-9wjwh\" (UID: \"6534f377-ba61-4952-b722-1dcfe94df2ea\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9wjwh" Mar 20 08:25:14 crc kubenswrapper[4903]: I0320 08:25:14.033096 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/21c3415f-b025-475e-a603-76329416df23-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-7gchs\" (UID: \"21c3415f-b025-475e-a603-76329416df23\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7gchs" Mar 20 08:25:14 crc kubenswrapper[4903]: I0320 08:25:14.033720 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e9e65e64-f0de-46b5-a383-ccb51e989ba1-metrics-tls\") pod \"dns-operator-744455d44c-pnnxx\" (UID: \"e9e65e64-f0de-46b5-a383-ccb51e989ba1\") " pod="openshift-dns-operator/dns-operator-744455d44c-pnnxx" Mar 20 08:25:14 crc kubenswrapper[4903]: I0320 08:25:14.033943 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/96e5b0d7-c2bd-4df6-8b65-81c0ddf71995-tmpfs\") pod \"packageserver-d55dfcdfc-fhkp9\" (UID: \"96e5b0d7-c2bd-4df6-8b65-81c0ddf71995\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fhkp9" Mar 20 08:25:14 crc kubenswrapper[4903]: I0320 08:25:14.034141 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f5e9a2fd-a5ab-4db3-9363-e4e94745c6ea-trusted-ca-bundle\") pod \"console-f9d7485db-h4m4s\" (UID: \"f5e9a2fd-a5ab-4db3-9363-e4e94745c6ea\") " pod="openshift-console/console-f9d7485db-h4m4s" Mar 20 08:25:14 crc kubenswrapper[4903]: I0320 08:25:14.035333 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f5e9a2fd-a5ab-4db3-9363-e4e94745c6ea-oauth-serving-cert\") pod \"console-f9d7485db-h4m4s\" (UID: 
\"f5e9a2fd-a5ab-4db3-9363-e4e94745c6ea\") " pod="openshift-console/console-f9d7485db-h4m4s" Mar 20 08:25:14 crc kubenswrapper[4903]: I0320 08:25:14.035346 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f5e9a2fd-a5ab-4db3-9363-e4e94745c6ea-console-oauth-config\") pod \"console-f9d7485db-h4m4s\" (UID: \"f5e9a2fd-a5ab-4db3-9363-e4e94745c6ea\") " pod="openshift-console/console-f9d7485db-h4m4s" Mar 20 08:25:14 crc kubenswrapper[4903]: I0320 08:25:14.035607 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/82c317fb-5d87-47b3-849c-58b0bab4d3ef-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-5dpx6\" (UID: \"82c317fb-5d87-47b3-849c-58b0bab4d3ef\") " pod="openshift-authentication/oauth-openshift-558db77b4-5dpx6" Mar 20 08:25:14 crc kubenswrapper[4903]: I0320 08:25:14.036101 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/82c317fb-5d87-47b3-849c-58b0bab4d3ef-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-5dpx6\" (UID: \"82c317fb-5d87-47b3-849c-58b0bab4d3ef\") " pod="openshift-authentication/oauth-openshift-558db77b4-5dpx6" Mar 20 08:25:14 crc kubenswrapper[4903]: I0320 08:25:14.036127 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/82c317fb-5d87-47b3-849c-58b0bab4d3ef-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-5dpx6\" (UID: \"82c317fb-5d87-47b3-849c-58b0bab4d3ef\") " pod="openshift-authentication/oauth-openshift-558db77b4-5dpx6" Mar 20 08:25:14 crc kubenswrapper[4903]: I0320 08:25:14.036494 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/82c317fb-5d87-47b3-849c-58b0bab4d3ef-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-5dpx6\" (UID: \"82c317fb-5d87-47b3-849c-58b0bab4d3ef\") " pod="openshift-authentication/oauth-openshift-558db77b4-5dpx6" Mar 20 08:25:14 crc kubenswrapper[4903]: I0320 08:25:14.036809 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f5e9a2fd-a5ab-4db3-9363-e4e94745c6ea-console-serving-cert\") pod \"console-f9d7485db-h4m4s\" (UID: \"f5e9a2fd-a5ab-4db3-9363-e4e94745c6ea\") " pod="openshift-console/console-f9d7485db-h4m4s" Mar 20 08:25:14 crc kubenswrapper[4903]: I0320 08:25:14.039536 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/82c317fb-5d87-47b3-849c-58b0bab4d3ef-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-5dpx6\" (UID: \"82c317fb-5d87-47b3-849c-58b0bab4d3ef\") " pod="openshift-authentication/oauth-openshift-558db77b4-5dpx6" Mar 20 08:25:14 crc kubenswrapper[4903]: I0320 08:25:14.040221 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/82c317fb-5d87-47b3-849c-58b0bab4d3ef-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-5dpx6\" (UID: \"82c317fb-5d87-47b3-849c-58b0bab4d3ef\") " pod="openshift-authentication/oauth-openshift-558db77b4-5dpx6" Mar 20 08:25:14 crc 
kubenswrapper[4903]: I0320 08:25:14.040735 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 20 08:25:14 crc kubenswrapper[4903]: I0320 08:25:14.046386 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/82c317fb-5d87-47b3-849c-58b0bab4d3ef-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-5dpx6\" (UID: \"82c317fb-5d87-47b3-849c-58b0bab4d3ef\") " pod="openshift-authentication/oauth-openshift-558db77b4-5dpx6" Mar 20 08:25:14 crc kubenswrapper[4903]: I0320 08:25:14.051156 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0568b2dd-f5ca-44ae-992d-1a5ed2e998fb-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-687pt\" (UID: \"0568b2dd-f5ca-44ae-992d-1a5ed2e998fb\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-687pt" Mar 20 08:25:14 crc kubenswrapper[4903]: I0320 08:25:14.061909 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 20 08:25:14 crc kubenswrapper[4903]: I0320 08:25:14.064236 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0568b2dd-f5ca-44ae-992d-1a5ed2e998fb-config\") pod \"kube-apiserver-operator-766d6c64bb-687pt\" (UID: \"0568b2dd-f5ca-44ae-992d-1a5ed2e998fb\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-687pt" Mar 20 08:25:14 crc kubenswrapper[4903]: I0320 08:25:14.090297 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 20 08:25:14 crc kubenswrapper[4903]: I0320 08:25:14.091297 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0b2ccb33-e3d5-46b8-863a-21ec88f7fb49-trusted-ca\") pod \"ingress-operator-5b745b69d9-h2jwr\" (UID: \"0b2ccb33-e3d5-46b8-863a-21ec88f7fb49\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-h2jwr" Mar 20 08:25:14 crc kubenswrapper[4903]: I0320 08:25:14.101062 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Mar 20 08:25:14 crc kubenswrapper[4903]: I0320 08:25:14.127096 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Mar 20 08:25:14 crc kubenswrapper[4903]: I0320 08:25:14.141941 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0b2ccb33-e3d5-46b8-863a-21ec88f7fb49-metrics-tls\") pod \"ingress-operator-5b745b69d9-h2jwr\" (UID: \"0b2ccb33-e3d5-46b8-863a-21ec88f7fb49\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-h2jwr" Mar 20 08:25:14 crc kubenswrapper[4903]: I0320 08:25:14.143439 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 20 08:25:14 crc kubenswrapper[4903]: I0320 08:25:14.176628 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x26p9\" (UniqueName: \"kubernetes.io/projected/572cb149-e6e4-4d1b-ab27-145239b82d1c-kube-api-access-x26p9\") pod \"machine-api-operator-5694c8668f-mx9nb\" (UID: \"572cb149-e6e4-4d1b-ab27-145239b82d1c\") " 
pod="openshift-machine-api/machine-api-operator-5694c8668f-mx9nb" Mar 20 08:25:14 crc kubenswrapper[4903]: I0320 08:25:14.180596 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Mar 20 08:25:14 crc kubenswrapper[4903]: I0320 08:25:14.200533 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Mar 20 08:25:14 crc kubenswrapper[4903]: I0320 08:25:14.212634 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/21c3415f-b025-475e-a603-76329416df23-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-7gchs\" (UID: \"21c3415f-b025-475e-a603-76329416df23\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7gchs" Mar 20 08:25:14 crc kubenswrapper[4903]: I0320 08:25:14.220058 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Mar 20 08:25:14 crc kubenswrapper[4903]: I0320 08:25:14.240860 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 20 08:25:14 crc kubenswrapper[4903]: I0320 08:25:14.250752 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b9b6ad1a-4ed6-4df4-8f27-a8eb8d6e8511-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-9hf5p\" (UID: \"b9b6ad1a-4ed6-4df4-8f27-a8eb8d6e8511\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9hf5p" Mar 20 08:25:14 crc kubenswrapper[4903]: I0320 08:25:14.260910 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Mar 20 08:25:14 crc kubenswrapper[4903]: I0320 08:25:14.280971 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 20 08:25:14 crc kubenswrapper[4903]: I0320 08:25:14.282523 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b9b6ad1a-4ed6-4df4-8f27-a8eb8d6e8511-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-9hf5p\" (UID: \"b9b6ad1a-4ed6-4df4-8f27-a8eb8d6e8511\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9hf5p" Mar 20 08:25:14 crc kubenswrapper[4903]: I0320 08:25:14.298804 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-mx9nb" Mar 20 08:25:14 crc kubenswrapper[4903]: I0320 08:25:14.321551 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vsbhv\" (UniqueName: \"kubernetes.io/projected/24298acb-4912-45c7-b28d-f8f7389bb7e6-kube-api-access-vsbhv\") pod \"apiserver-7bbb656c7d-qgf92\" (UID: \"24298acb-4912-45c7-b28d-f8f7389bb7e6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qgf92" Mar 20 08:25:14 crc kubenswrapper[4903]: I0320 08:25:14.337261 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2pm7f\" (UniqueName: \"kubernetes.io/projected/a01da421-9bf1-459f-a419-c7cc271bf472-kube-api-access-2pm7f\") pod \"controller-manager-879f6c89f-f47pf\" (UID: \"a01da421-9bf1-459f-a419-c7cc271bf472\") " pod="openshift-controller-manager/controller-manager-879f6c89f-f47pf" Mar 20 08:25:14 crc kubenswrapper[4903]: I0320 08:25:14.340802 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Mar 20 08:25:14 crc kubenswrapper[4903]: I0320 08:25:14.360558 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Mar 20 08:25:14 crc kubenswrapper[4903]: I0320 08:25:14.380710 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 20 08:25:14 crc kubenswrapper[4903]: I0320 08:25:14.386411 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5a7114dc-66fa-4519-a0b1-83e34b999f9b-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-p8nlt\" (UID: \"5a7114dc-66fa-4519-a0b1-83e34b999f9b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-p8nlt" Mar 20 08:25:14 crc kubenswrapper[4903]: I0320 08:25:14.400835 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 20 08:25:14 crc kubenswrapper[4903]: I0320 08:25:14.402816 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5a7114dc-66fa-4519-a0b1-83e34b999f9b-config\") pod \"kube-controller-manager-operator-78b949d7b-p8nlt\" (UID: \"5a7114dc-66fa-4519-a0b1-83e34b999f9b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-p8nlt" Mar 20 08:25:14 crc kubenswrapper[4903]: I0320 08:25:14.421577 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 20 08:25:14 crc kubenswrapper[4903]: I0320 08:25:14.441564 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 20 08:25:14 crc kubenswrapper[4903]: I0320 08:25:14.455484 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/ce1c215a-f629-4ec9-ac9b-100699560ce1-srv-cert\") pod \"olm-operator-6b444d44fb-k27wn\" (UID: \"ce1c215a-f629-4ec9-ac9b-100699560ce1\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-k27wn" Mar 20 08:25:14 crc kubenswrapper[4903]: I0320 08:25:14.461674 4903 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 20 08:25:14 crc kubenswrapper[4903]: I0320 08:25:14.480948 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Mar 20 08:25:14 crc kubenswrapper[4903]: I0320 08:25:14.495175 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0b0221f5-84f8-47b9-bab3-934c1890fb2f-metrics-certs\") pod \"router-default-5444994796-cc4f6\" (UID: \"0b0221f5-84f8-47b9-bab3-934c1890fb2f\") " pod="openshift-ingress/router-default-5444994796-cc4f6" Mar 20 08:25:14 crc kubenswrapper[4903]: I0320 08:25:14.500634 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-mx9nb"] Mar 20 08:25:14 crc kubenswrapper[4903]: I0320 08:25:14.501246 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Mar 20 08:25:14 crc kubenswrapper[4903]: W0320 08:25:14.510293 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod572cb149_e6e4_4d1b_ab27_145239b82d1c.slice/crio-8f4eb55d6bc4ecb8762366fb5f8d20f01be74bae9c2c85d3c1b827a3a0ec222d WatchSource:0}: Error finding container 8f4eb55d6bc4ecb8762366fb5f8d20f01be74bae9c2c85d3c1b827a3a0ec222d: Status 404 returned error can't find the container with id 8f4eb55d6bc4ecb8762366fb5f8d20f01be74bae9c2c85d3c1b827a3a0ec222d Mar 20 08:25:14 crc kubenswrapper[4903]: I0320 08:25:14.520994 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Mar 20 08:25:14 crc kubenswrapper[4903]: I0320 08:25:14.529738 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0b0221f5-84f8-47b9-bab3-934c1890fb2f-service-ca-bundle\") pod \"router-default-5444994796-cc4f6\" (UID: \"0b0221f5-84f8-47b9-bab3-934c1890fb2f\") " pod="openshift-ingress/router-default-5444994796-cc4f6" Mar 20 08:25:14 crc kubenswrapper[4903]: I0320 08:25:14.542450 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Mar 20 08:25:14 crc kubenswrapper[4903]: I0320 08:25:14.560845 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 20 08:25:14 crc kubenswrapper[4903]: I0320 08:25:14.581172 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 20 08:25:14 crc kubenswrapper[4903]: I0320 08:25:14.587222 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qgf92" Mar 20 08:25:14 crc kubenswrapper[4903]: I0320 08:25:14.591885 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/211b0d80-1cec-40f5-8679-91466edeb1a1-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-622vk\" (UID: \"211b0d80-1cec-40f5-8679-91466edeb1a1\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-622vk" Mar 20 08:25:14 crc kubenswrapper[4903]: I0320 08:25:14.601325 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Mar 20 08:25:14 crc kubenswrapper[4903]: I0320 08:25:14.611833 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-f47pf" Mar 20 08:25:14 crc kubenswrapper[4903]: I0320 08:25:14.621672 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Mar 20 08:25:14 crc kubenswrapper[4903]: I0320 08:25:14.648954 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Mar 20 08:25:14 crc kubenswrapper[4903]: I0320 08:25:14.662756 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Mar 20 08:25:14 crc kubenswrapper[4903]: I0320 08:25:14.676501 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/0b0221f5-84f8-47b9-bab3-934c1890fb2f-default-certificate\") pod \"router-default-5444994796-cc4f6\" (UID: \"0b0221f5-84f8-47b9-bab3-934c1890fb2f\") " pod="openshift-ingress/router-default-5444994796-cc4f6" Mar 20 08:25:14 crc kubenswrapper[4903]: I0320 08:25:14.686814 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Mar 20 08:25:14 crc kubenswrapper[4903]: I0320 08:25:14.700744 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 20 08:25:14 crc kubenswrapper[4903]: I0320 08:25:14.705605 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/0b0221f5-84f8-47b9-bab3-934c1890fb2f-stats-auth\") pod \"router-default-5444994796-cc4f6\" (UID: \"0b0221f5-84f8-47b9-bab3-934c1890fb2f\") " pod="openshift-ingress/router-default-5444994796-cc4f6" Mar 20 08:25:14 crc kubenswrapper[4903]: I0320 08:25:14.723987 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 20 08:25:14 crc kubenswrapper[4903]: I0320 08:25:14.737128 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/211b0d80-1cec-40f5-8679-91466edeb1a1-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-622vk\" (UID: \"211b0d80-1cec-40f5-8679-91466edeb1a1\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-622vk" Mar 20 08:25:14 crc kubenswrapper[4903]: I0320 08:25:14.740977 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 20 08:25:14 crc kubenswrapper[4903]: I0320 
08:25:14.761347 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 20 08:25:14 crc kubenswrapper[4903]: I0320 08:25:14.767683 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3211e594-bfe4-4eaf-ba13-48dbd7c8cf5e-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-x8c9s\" (UID: \"3211e594-bfe4-4eaf-ba13-48dbd7c8cf5e\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-x8c9s" Mar 20 08:25:14 crc kubenswrapper[4903]: I0320 08:25:14.796898 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-qgf92"] Mar 20 08:25:14 crc kubenswrapper[4903]: I0320 08:25:14.801180 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 20 08:25:14 crc kubenswrapper[4903]: I0320 08:25:14.825469 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Mar 20 08:25:14 crc kubenswrapper[4903]: I0320 08:25:14.839941 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-f47pf"] Mar 20 08:25:14 crc kubenswrapper[4903]: I0320 08:25:14.841692 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Mar 20 08:25:14 crc kubenswrapper[4903]: W0320 08:25:14.851350 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda01da421_9bf1_459f_a419_c7cc271bf472.slice/crio-0e76442bc0ec992a73f79f3e063551920fe36cfffd71e68382d99133af69f47a WatchSource:0}: Error finding container 0e76442bc0ec992a73f79f3e063551920fe36cfffd71e68382d99133af69f47a: Status 404 returned error can't find the container with id 0e76442bc0ec992a73f79f3e063551920fe36cfffd71e68382d99133af69f47a Mar 20 08:25:14 crc kubenswrapper[4903]: I0320 08:25:14.858227 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6a89ca6c-6d44-4fa5-b728-09aa58890f99-serving-cert\") pod \"service-ca-operator-777779d784-t7qmg\" (UID: \"6a89ca6c-6d44-4fa5-b728-09aa58890f99\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-t7qmg" Mar 20 08:25:14 crc kubenswrapper[4903]: I0320 08:25:14.860583 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Mar 20 08:25:14 crc kubenswrapper[4903]: I0320 08:25:14.879297 4903 request.go:700] Waited for 1.015632408s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-service-ca-operator/configmaps?fieldSelector=metadata.name%3Dservice-ca-operator-config&limit=500&resourceVersion=0 Mar 20 08:25:14 crc kubenswrapper[4903]: I0320 08:25:14.880692 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 20 08:25:14 crc kubenswrapper[4903]: I0320 08:25:14.890637 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6a89ca6c-6d44-4fa5-b728-09aa58890f99-config\") pod \"service-ca-operator-777779d784-t7qmg\" (UID: 
\"6a89ca6c-6d44-4fa5-b728-09aa58890f99\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-t7qmg" Mar 20 08:25:14 crc kubenswrapper[4903]: I0320 08:25:14.900134 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Mar 20 08:25:14 crc kubenswrapper[4903]: I0320 08:25:14.913218 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/ce1c215a-f629-4ec9-ac9b-100699560ce1-profile-collector-cert\") pod \"olm-operator-6b444d44fb-k27wn\" (UID: \"ce1c215a-f629-4ec9-ac9b-100699560ce1\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-k27wn" Mar 20 08:25:14 crc kubenswrapper[4903]: I0320 08:25:14.921809 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 20 08:25:14 crc kubenswrapper[4903]: I0320 08:25:14.930701 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/96e5b0d7-c2bd-4df6-8b65-81c0ddf71995-apiservice-cert\") pod \"packageserver-d55dfcdfc-fhkp9\" (UID: \"96e5b0d7-c2bd-4df6-8b65-81c0ddf71995\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fhkp9" Mar 20 08:25:14 crc kubenswrapper[4903]: I0320 08:25:14.932585 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/96e5b0d7-c2bd-4df6-8b65-81c0ddf71995-webhook-cert\") pod \"packageserver-d55dfcdfc-fhkp9\" (UID: \"96e5b0d7-c2bd-4df6-8b65-81c0ddf71995\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fhkp9" Mar 20 08:25:14 crc kubenswrapper[4903]: I0320 08:25:14.961641 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 20 08:25:14 crc kubenswrapper[4903]: I0320 08:25:14.988821 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 20 08:25:15 crc kubenswrapper[4903]: I0320 08:25:15.000925 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 20 08:25:15 crc kubenswrapper[4903]: I0320 08:25:15.023207 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Mar 20 08:25:15 crc kubenswrapper[4903]: E0320 08:25:15.028454 4903 secret.go:188] Couldn't get secret openshift-service-ca/signing-key: failed to sync secret cache: timed out waiting for the condition Mar 20 08:25:15 crc kubenswrapper[4903]: E0320 08:25:15.028560 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/60e17360-c395-4a9b-b371-2aeeccf0b70f-signing-key podName:60e17360-c395-4a9b-b371-2aeeccf0b70f nodeName:}" failed. No retries permitted until 2026-03-20 08:25:15.528534961 +0000 UTC m=+140.745435286 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "signing-key" (UniqueName: "kubernetes.io/secret/60e17360-c395-4a9b-b371-2aeeccf0b70f-signing-key") pod "service-ca-9c57cc56f-5v8xb" (UID: "60e17360-c395-4a9b-b371-2aeeccf0b70f") : failed to sync secret cache: timed out waiting for the condition Mar 20 08:25:15 crc kubenswrapper[4903]: E0320 08:25:15.031992 4903 configmap.go:193] Couldn't get configMap openshift-service-ca/signing-cabundle: failed to sync configmap cache: timed out waiting for the condition Mar 20 08:25:15 crc kubenswrapper[4903]: E0320 08:25:15.032155 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/60e17360-c395-4a9b-b371-2aeeccf0b70f-signing-cabundle podName:60e17360-c395-4a9b-b371-2aeeccf0b70f nodeName:}" failed. No retries permitted until 2026-03-20 08:25:15.532112652 +0000 UTC m=+140.749013167 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "signing-cabundle" (UniqueName: "kubernetes.io/configmap/60e17360-c395-4a9b-b371-2aeeccf0b70f-signing-cabundle") pod "service-ca-9c57cc56f-5v8xb" (UID: "60e17360-c395-4a9b-b371-2aeeccf0b70f") : failed to sync configmap cache: timed out waiting for the condition Mar 20 08:25:15 crc kubenswrapper[4903]: I0320 08:25:15.040424 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 20 08:25:15 crc kubenswrapper[4903]: I0320 08:25:15.060844 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 20 08:25:15 crc kubenswrapper[4903]: I0320 08:25:15.081828 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 20 08:25:15 crc kubenswrapper[4903]: I0320 08:25:15.101470 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 20 08:25:15 crc kubenswrapper[4903]: I0320 08:25:15.122524 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 20 08:25:15 crc kubenswrapper[4903]: I0320 08:25:15.141841 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Mar 20 08:25:15 crc kubenswrapper[4903]: I0320 08:25:15.160105 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Mar 20 08:25:15 crc kubenswrapper[4903]: I0320 08:25:15.180211 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Mar 20 08:25:15 crc kubenswrapper[4903]: I0320 08:25:15.200347 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 20 08:25:15 crc kubenswrapper[4903]: I0320 08:25:15.221736 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Mar 20 08:25:15 crc kubenswrapper[4903]: I0320 08:25:15.241202 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 20 08:25:15 crc kubenswrapper[4903]: I0320 08:25:15.261024 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Mar 20 08:25:15 crc kubenswrapper[4903]: I0320 08:25:15.277475 4903 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-mx9nb" event={"ID":"572cb149-e6e4-4d1b-ab27-145239b82d1c","Type":"ContainerStarted","Data":"ab50c5c5a6a03f0d43527910761ad28374e9a7146560775d5e1c0ff506921d48"} Mar 20 08:25:15 crc kubenswrapper[4903]: I0320 08:25:15.277914 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-mx9nb" event={"ID":"572cb149-e6e4-4d1b-ab27-145239b82d1c","Type":"ContainerStarted","Data":"b389fb6fb25ed15a01035c248c68621e51a6cf7d86a62ff599ed0ed703dc5a7f"} Mar 20 08:25:15 crc kubenswrapper[4903]: I0320 08:25:15.278002 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-mx9nb" event={"ID":"572cb149-e6e4-4d1b-ab27-145239b82d1c","Type":"ContainerStarted","Data":"8f4eb55d6bc4ecb8762366fb5f8d20f01be74bae9c2c85d3c1b827a3a0ec222d"} Mar 20 08:25:15 crc kubenswrapper[4903]: I0320 08:25:15.279614 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-f47pf" event={"ID":"a01da421-9bf1-459f-a419-c7cc271bf472","Type":"ContainerStarted","Data":"eb54b02e2ed004f48b9cd02de61ec6fb4644d05cc1e7ea8a63e5e6245023787d"} Mar 20 08:25:15 crc kubenswrapper[4903]: I0320 08:25:15.279675 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-f47pf" event={"ID":"a01da421-9bf1-459f-a419-c7cc271bf472","Type":"ContainerStarted","Data":"0e76442bc0ec992a73f79f3e063551920fe36cfffd71e68382d99133af69f47a"} Mar 20 08:25:15 crc kubenswrapper[4903]: I0320 08:25:15.280581 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-f47pf" Mar 20 08:25:15 crc kubenswrapper[4903]: I0320 08:25:15.281133 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 20 08:25:15 crc kubenswrapper[4903]: I0320 08:25:15.282741 4903 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-f47pf container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" start-of-body= Mar 20 08:25:15 crc kubenswrapper[4903]: I0320 08:25:15.282788 4903 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-f47pf" podUID="a01da421-9bf1-459f-a419-c7cc271bf472" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" Mar 20 08:25:15 crc kubenswrapper[4903]: I0320 08:25:15.283166 4903 generic.go:334] "Generic (PLEG): container finished" podID="24298acb-4912-45c7-b28d-f8f7389bb7e6" containerID="e1a7543926cd96b3cbbbe206eb93dbdb22fe40cc3642749bd1df0813c56005c6" exitCode=0 Mar 20 08:25:15 crc kubenswrapper[4903]: I0320 08:25:15.283230 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qgf92" event={"ID":"24298acb-4912-45c7-b28d-f8f7389bb7e6","Type":"ContainerDied","Data":"e1a7543926cd96b3cbbbe206eb93dbdb22fe40cc3642749bd1df0813c56005c6"} Mar 20 08:25:15 crc kubenswrapper[4903]: I0320 08:25:15.283270 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qgf92" 
event={"ID":"24298acb-4912-45c7-b28d-f8f7389bb7e6","Type":"ContainerStarted","Data":"63652110e27aed8e8b13469a927e60c2209df1bc5e6fef6031007378db18958e"} Mar 20 08:25:15 crc kubenswrapper[4903]: I0320 08:25:15.301123 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Mar 20 08:25:15 crc kubenswrapper[4903]: I0320 08:25:15.322633 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Mar 20 08:25:15 crc kubenswrapper[4903]: I0320 08:25:15.341448 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 20 08:25:15 crc kubenswrapper[4903]: I0320 08:25:15.361629 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 20 08:25:15 crc kubenswrapper[4903]: I0320 08:25:15.380821 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Mar 20 08:25:15 crc kubenswrapper[4903]: I0320 08:25:15.400922 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Mar 20 08:25:15 crc kubenswrapper[4903]: I0320 08:25:15.421426 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Mar 20 08:25:15 crc kubenswrapper[4903]: I0320 08:25:15.441614 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Mar 20 08:25:15 crc kubenswrapper[4903]: I0320 08:25:15.461804 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Mar 20 08:25:15 crc kubenswrapper[4903]: I0320 08:25:15.480840 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Mar 20 08:25:15 crc kubenswrapper[4903]: I0320 08:25:15.491278 4903 scope.go:117] "RemoveContainer" containerID="f756aa5d647538ea78b646d483b3c6e7943baac8d492132d04743f5f99f4ddf4" Mar 20 08:25:15 crc kubenswrapper[4903]: E0320 08:25:15.491487 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 08:25:15 crc kubenswrapper[4903]: I0320 08:25:15.519636 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Mar 20 08:25:15 crc kubenswrapper[4903]: I0320 08:25:15.521970 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 20 08:25:15 crc kubenswrapper[4903]: I0320 08:25:15.540416 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 20 08:25:15 crc kubenswrapper[4903]: I0320 08:25:15.559468 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/60e17360-c395-4a9b-b371-2aeeccf0b70f-signing-key\") pod \"service-ca-9c57cc56f-5v8xb\" (UID: 
\"60e17360-c395-4a9b-b371-2aeeccf0b70f\") " pod="openshift-service-ca/service-ca-9c57cc56f-5v8xb" Mar 20 08:25:15 crc kubenswrapper[4903]: I0320 08:25:15.560520 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/60e17360-c395-4a9b-b371-2aeeccf0b70f-signing-cabundle\") pod \"service-ca-9c57cc56f-5v8xb\" (UID: \"60e17360-c395-4a9b-b371-2aeeccf0b70f\") " pod="openshift-service-ca/service-ca-9c57cc56f-5v8xb" Mar 20 08:25:15 crc kubenswrapper[4903]: I0320 08:25:15.560606 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Mar 20 08:25:15 crc kubenswrapper[4903]: I0320 08:25:15.561605 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/60e17360-c395-4a9b-b371-2aeeccf0b70f-signing-cabundle\") pod \"service-ca-9c57cc56f-5v8xb\" (UID: \"60e17360-c395-4a9b-b371-2aeeccf0b70f\") " pod="openshift-service-ca/service-ca-9c57cc56f-5v8xb" Mar 20 08:25:15 crc kubenswrapper[4903]: I0320 08:25:15.563485 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/60e17360-c395-4a9b-b371-2aeeccf0b70f-signing-key\") pod \"service-ca-9c57cc56f-5v8xb\" (UID: \"60e17360-c395-4a9b-b371-2aeeccf0b70f\") " pod="openshift-service-ca/service-ca-9c57cc56f-5v8xb" Mar 20 08:25:15 crc kubenswrapper[4903]: I0320 08:25:15.581061 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Mar 20 08:25:15 crc kubenswrapper[4903]: I0320 08:25:15.601311 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Mar 20 08:25:15 crc kubenswrapper[4903]: I0320 08:25:15.621379 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-sysctl-allowlist" Mar 20 08:25:15 crc kubenswrapper[4903]: I0320 08:25:15.641866 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 20 08:25:15 crc kubenswrapper[4903]: I0320 08:25:15.661771 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 20 08:25:15 crc kubenswrapper[4903]: I0320 08:25:15.685083 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Mar 20 08:25:15 crc kubenswrapper[4903]: I0320 08:25:15.726636 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fsrjz\" (UniqueName: \"kubernetes.io/projected/a21d7057-af5a-499a-b71f-3cae88b6883e-kube-api-access-fsrjz\") pod \"route-controller-manager-6576b87f9c-6bs4d\" (UID: \"a21d7057-af5a-499a-b71f-3cae88b6883e\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6bs4d" Mar 20 08:25:15 crc kubenswrapper[4903]: I0320 08:25:15.736760 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zp7v8\" (UniqueName: \"kubernetes.io/projected/548a9dd8-86b6-4fca-b0f8-29ac9a0edb72-kube-api-access-zp7v8\") pod \"machine-approver-56656f9798-xzjq9\" (UID: \"548a9dd8-86b6-4fca-b0f8-29ac9a0edb72\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xzjq9" Mar 20 08:25:15 crc kubenswrapper[4903]: I0320 08:25:15.760585 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-wg8dw\" (UniqueName: \"kubernetes.io/projected/fe5d5814-39d5-43a9-9352-f47c80603c75-kube-api-access-wg8dw\") pod \"openshift-config-operator-7777fb866f-d5tc5\" (UID: \"fe5d5814-39d5-43a9-9352-f47c80603c75\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-d5tc5" Mar 20 08:25:15 crc kubenswrapper[4903]: I0320 08:25:15.775957 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9k9kf\" (UniqueName: \"kubernetes.io/projected/d528a1a3-29b5-4b67-a9bd-e7102d857426-kube-api-access-9k9kf\") pod \"openshift-apiserver-operator-796bbdcf4f-s5fmp\" (UID: \"d528a1a3-29b5-4b67-a9bd-e7102d857426\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-s5fmp" Mar 20 08:25:15 crc kubenswrapper[4903]: I0320 08:25:15.796124 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k266g\" (UniqueName: \"kubernetes.io/projected/f20fdd99-f4e0-4435-98ef-7140cc1c9ca4-kube-api-access-k266g\") pod \"apiserver-76f77b778f-fcqtt\" (UID: \"f20fdd99-f4e0-4435-98ef-7140cc1c9ca4\") " pod="openshift-apiserver/apiserver-76f77b778f-fcqtt" Mar 20 08:25:15 crc kubenswrapper[4903]: I0320 08:25:15.819704 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2wxzz\" (UniqueName: \"kubernetes.io/projected/4218cb22-8dd6-45fd-8091-635dd8a305bf-kube-api-access-2wxzz\") pod \"authentication-operator-69f744f599-qq7hw\" (UID: \"4218cb22-8dd6-45fd-8091-635dd8a305bf\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-qq7hw" Mar 20 08:25:15 crc kubenswrapper[4903]: I0320 08:25:15.820387 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Mar 20 08:25:15 crc kubenswrapper[4903]: I0320 08:25:15.841215 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Mar 20 08:25:15 crc kubenswrapper[4903]: I0320 08:25:15.858365 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-fcqtt" Mar 20 08:25:15 crc kubenswrapper[4903]: I0320 08:25:15.861411 4903 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Mar 20 08:25:15 crc kubenswrapper[4903]: I0320 08:25:15.870962 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-qq7hw" Mar 20 08:25:15 crc kubenswrapper[4903]: I0320 08:25:15.879339 4903 request.go:700] Waited for 1.917605365s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-ingress-canary/configmaps?fieldSelector=metadata.name%3Dkube-root-ca.crt&limit=500&resourceVersion=0 Mar 20 08:25:15 crc kubenswrapper[4903]: I0320 08:25:15.881615 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 20 08:25:15 crc kubenswrapper[4903]: I0320 08:25:15.882619 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-s5fmp" Mar 20 08:25:15 crc kubenswrapper[4903]: I0320 08:25:15.896006 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xzjq9" Mar 20 08:25:15 crc kubenswrapper[4903]: I0320 08:25:15.901527 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Mar 20 08:25:15 crc kubenswrapper[4903]: I0320 08:25:15.924095 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Mar 20 08:25:15 crc kubenswrapper[4903]: I0320 08:25:15.942487 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6bs4d" Mar 20 08:25:15 crc kubenswrapper[4903]: I0320 08:25:15.943810 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Mar 20 08:25:15 crc kubenswrapper[4903]: I0320 08:25:15.994208 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-d5tc5" Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.003763 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qfhhr\" (UniqueName: \"kubernetes.io/projected/f5e9a2fd-a5ab-4db3-9363-e4e94745c6ea-kube-api-access-qfhhr\") pod \"console-f9d7485db-h4m4s\" (UID: \"f5e9a2fd-a5ab-4db3-9363-e4e94745c6ea\") " pod="openshift-console/console-f9d7485db-h4m4s" Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.018890 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m7csq\" (UniqueName: \"kubernetes.io/projected/0b2ccb33-e3d5-46b8-863a-21ec88f7fb49-kube-api-access-m7csq\") pod \"ingress-operator-5b745b69d9-h2jwr\" (UID: \"0b2ccb33-e3d5-46b8-863a-21ec88f7fb49\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-h2jwr" Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.023143 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/21c3415f-b025-475e-a603-76329416df23-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-7gchs\" (UID: \"21c3415f-b025-475e-a603-76329416df23\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7gchs" Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.030889 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-h4m4s" Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.036098 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zhmqw\" (UniqueName: \"kubernetes.io/projected/96e5b0d7-c2bd-4df6-8b65-81c0ddf71995-kube-api-access-zhmqw\") pod \"packageserver-d55dfcdfc-fhkp9\" (UID: \"96e5b0d7-c2bd-4df6-8b65-81c0ddf71995\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fhkp9" Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.060129 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wqs4b\" (UniqueName: \"kubernetes.io/projected/1e7c0553-965b-49eb-8bcd-ad7e2096373c-kube-api-access-wqs4b\") pod \"cluster-samples-operator-665b6dd947-mmtf2\" (UID: \"1e7c0553-965b-49eb-8bcd-ad7e2096373c\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-mmtf2" Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.079771 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qpf2g\" (UniqueName: \"kubernetes.io/projected/e9e65e64-f0de-46b5-a383-ccb51e989ba1-kube-api-access-qpf2g\") pod \"dns-operator-744455d44c-pnnxx\" (UID: \"e9e65e64-f0de-46b5-a383-ccb51e989ba1\") " pod="openshift-dns-operator/dns-operator-744455d44c-pnnxx" Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.103752 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5a7114dc-66fa-4519-a0b1-83e34b999f9b-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-p8nlt\" (UID: \"5a7114dc-66fa-4519-a0b1-83e34b999f9b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-p8nlt" Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.111121 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-p8nlt" Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.120232 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5z2q9\" (UniqueName: \"kubernetes.io/projected/211b0d80-1cec-40f5-8679-91466edeb1a1-kube-api-access-5z2q9\") pod \"kube-storage-version-migrator-operator-b67b599dd-622vk\" (UID: \"211b0d80-1cec-40f5-8679-91466edeb1a1\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-622vk" Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.139971 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2bjcb\" (UniqueName: \"kubernetes.io/projected/ce1c215a-f629-4ec9-ac9b-100699560ce1-kube-api-access-2bjcb\") pod \"olm-operator-6b444d44fb-k27wn\" (UID: \"ce1c215a-f629-4ec9-ac9b-100699560ce1\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-k27wn" Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.145597 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-fcqtt"] Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.164053 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-44n9b\" (UniqueName: \"kubernetes.io/projected/3211e594-bfe4-4eaf-ba13-48dbd7c8cf5e-kube-api-access-44n9b\") pod \"package-server-manager-789f6589d5-x8c9s\" (UID: \"3211e594-bfe4-4eaf-ba13-48dbd7c8cf5e\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-x8c9s" Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.168292 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fhkp9" Mar 20 08:25:16 crc kubenswrapper[4903]: W0320 08:25:16.180362 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf20fdd99_f4e0_4435_98ef_7140cc1c9ca4.slice/crio-6d1618740b7dc4a7dfd905f2f916866ce4e035abd4f7c63c5a97ab157932b32f WatchSource:0}: Error finding container 6d1618740b7dc4a7dfd905f2f916866ce4e035abd4f7c63c5a97ab157932b32f: Status 404 returned error can't find the container with id 6d1618740b7dc4a7dfd905f2f916866ce4e035abd4f7c63c5a97ab157932b32f Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.186364 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j9jz6\" (UniqueName: \"kubernetes.io/projected/21c3415f-b025-475e-a603-76329416df23-kube-api-access-j9jz6\") pod \"cluster-image-registry-operator-dc59b4c8b-7gchs\" (UID: \"21c3415f-b025-475e-a603-76329416df23\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7gchs" Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.199415 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b9b6ad1a-4ed6-4df4-8f27-a8eb8d6e8511-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-9hf5p\" (UID: \"b9b6ad1a-4ed6-4df4-8f27-a8eb8d6e8511\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9hf5p" Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.226327 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5wn7k\" (UniqueName: 
\"kubernetes.io/projected/0b0221f5-84f8-47b9-bab3-934c1890fb2f-kube-api-access-5wn7k\") pod \"router-default-5444994796-cc4f6\" (UID: \"0b0221f5-84f8-47b9-bab3-934c1890fb2f\") " pod="openshift-ingress/router-default-5444994796-cc4f6" Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.232047 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-qq7hw"] Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.248145 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x7gr7\" (UniqueName: \"kubernetes.io/projected/7c273a7a-00f1-4e28-a9a6-69d240f2df29-kube-api-access-x7gr7\") pod \"downloads-7954f5f757-n4g5v\" (UID: \"7c273a7a-00f1-4e28-a9a6-69d240f2df29\") " pod="openshift-console/downloads-7954f5f757-n4g5v" Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.264523 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-69nsw\" (UniqueName: \"kubernetes.io/projected/6a89ca6c-6d44-4fa5-b728-09aa58890f99-kube-api-access-69nsw\") pod \"service-ca-operator-777779d784-t7qmg\" (UID: \"6a89ca6c-6d44-4fa5-b728-09aa58890f99\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-t7qmg" Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.287681 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-6bs4d"] Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.288502 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c5dfq\" (UniqueName: \"kubernetes.io/projected/9b45a827-576d-4fbb-87f4-4ae4d8af634e-kube-api-access-c5dfq\") pod \"openshift-controller-manager-operator-756b6f6bc6-2hc2d\" (UID: \"9b45a827-576d-4fbb-87f4-4ae4d8af634e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2hc2d" Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.300326 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-mmtf2" Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.305876 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-qq7hw" event={"ID":"4218cb22-8dd6-45fd-8091-635dd8a305bf","Type":"ContainerStarted","Data":"dae2e0379dbbeaf1bbd46869c30326f07eab25ddc67ab389b1a2581f46b754fd"} Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.308732 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-fcqtt" event={"ID":"f20fdd99-f4e0-4435-98ef-7140cc1c9ca4","Type":"ContainerStarted","Data":"6d1618740b7dc4a7dfd905f2f916866ce4e035abd4f7c63c5a97ab157932b32f"} Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.311275 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2hc2d" Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.314981 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qgf92" event={"ID":"24298acb-4912-45c7-b28d-f8f7389bb7e6","Type":"ContainerStarted","Data":"6028dd6e3f86f487841ad532e17393d76b9dbd93ce8b6df52b09e1641688723d"} Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.322460 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0b2ccb33-e3d5-46b8-863a-21ec88f7fb49-bound-sa-token\") pod \"ingress-operator-5b745b69d9-h2jwr\" (UID: \"0b2ccb33-e3d5-46b8-863a-21ec88f7fb49\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-h2jwr" Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.322658 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xzjq9" event={"ID":"548a9dd8-86b6-4fca-b0f8-29ac9a0edb72","Type":"ContainerStarted","Data":"9c826ef9e8d72fd0171e726c11d4912a29f263b4b818319b924b1370a456b5dc"} Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.331833 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-f47pf" Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.341359 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qwdv4\" (UniqueName: \"kubernetes.io/projected/6534f377-ba61-4952-b722-1dcfe94df2ea-kube-api-access-qwdv4\") pod \"etcd-operator-b45778765-9wjwh\" (UID: \"6534f377-ba61-4952-b722-1dcfe94df2ea\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9wjwh" Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.343751 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-n4g5v" Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.349257 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-9wjwh" Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.361675 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0568b2dd-f5ca-44ae-992d-1a5ed2e998fb-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-687pt\" (UID: \"0568b2dd-f5ca-44ae-992d-1a5ed2e998fb\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-687pt" Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.363472 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-pnnxx" Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.375335 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g78nn\" (UniqueName: \"kubernetes.io/projected/82c317fb-5d87-47b3-849c-58b0bab4d3ef-kube-api-access-g78nn\") pod \"oauth-openshift-558db77b4-5dpx6\" (UID: \"82c317fb-5d87-47b3-849c-58b0bab4d3ef\") " pod="openshift-authentication/oauth-openshift-558db77b4-5dpx6" Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.380886 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-h2jwr" Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.389637 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-687pt" Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.409830 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9hf5p" Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.410374 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hfccx\" (UniqueName: \"kubernetes.io/projected/60e17360-c395-4a9b-b371-2aeeccf0b70f-kube-api-access-hfccx\") pod \"service-ca-9c57cc56f-5v8xb\" (UID: \"60e17360-c395-4a9b-b371-2aeeccf0b70f\") " pod="openshift-service-ca/service-ca-9c57cc56f-5v8xb" Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.411082 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7gchs" Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.419411 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-622vk" Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.424956 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-k27wn" Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.436881 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-cc4f6" Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.438369 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.449060 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-x8c9s" Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.461576 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-t7qmg" Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.483754 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-5v8xb" Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.483988 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zvd5x\" (UniqueName: \"kubernetes.io/projected/5778224c-9b34-45c0-9812-122b95cef431-kube-api-access-zvd5x\") pod \"image-registry-697d97f7c8-2q9m5\" (UID: \"5778224c-9b34-45c0-9812-122b95cef431\") " pod="openshift-image-registry/image-registry-697d97f7c8-2q9m5" Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.484053 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n9xx7\" (UniqueName: \"kubernetes.io/projected/2c53f142-6367-49fd-8577-6141d9f82f73-kube-api-access-n9xx7\") pod \"catalog-operator-68c6474976-skg8s\" (UID: \"2c53f142-6367-49fd-8577-6141d9f82f73\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-skg8s" Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.484111 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5778224c-9b34-45c0-9812-122b95cef431-bound-sa-token\") pod \"image-registry-697d97f7c8-2q9m5\" (UID: \"5778224c-9b34-45c0-9812-122b95cef431\") " pod="openshift-image-registry/image-registry-697d97f7c8-2q9m5" Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.484239 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5778224c-9b34-45c0-9812-122b95cef431-trusted-ca\") pod \"image-registry-697d97f7c8-2q9m5\" (UID: \"5778224c-9b34-45c0-9812-122b95cef431\") " pod="openshift-image-registry/image-registry-697d97f7c8-2q9m5" Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.484273 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/2c53f142-6367-49fd-8577-6141d9f82f73-srv-cert\") pod \"catalog-operator-68c6474976-skg8s\" (UID: \"2c53f142-6367-49fd-8577-6141d9f82f73\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-skg8s" Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.484322 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/5778224c-9b34-45c0-9812-122b95cef431-installation-pull-secrets\") pod \"image-registry-697d97f7c8-2q9m5\" (UID: \"5778224c-9b34-45c0-9812-122b95cef431\") " pod="openshift-image-registry/image-registry-697d97f7c8-2q9m5" Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.484352 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2q9m5\" (UID: \"5778224c-9b34-45c0-9812-122b95cef431\") " pod="openshift-image-registry/image-registry-697d97f7c8-2q9m5" Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.484370 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5778224c-9b34-45c0-9812-122b95cef431-registry-tls\") pod \"image-registry-697d97f7c8-2q9m5\" (UID: \"5778224c-9b34-45c0-9812-122b95cef431\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-2q9m5" Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.484387 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/2c53f142-6367-49fd-8577-6141d9f82f73-profile-collector-cert\") pod \"catalog-operator-68c6474976-skg8s\" (UID: \"2c53f142-6367-49fd-8577-6141d9f82f73\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-skg8s" Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.485864 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/5778224c-9b34-45c0-9812-122b95cef431-registry-certificates\") pod \"image-registry-697d97f7c8-2q9m5\" (UID: \"5778224c-9b34-45c0-9812-122b95cef431\") " pod="openshift-image-registry/image-registry-697d97f7c8-2q9m5" Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.485917 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/5778224c-9b34-45c0-9812-122b95cef431-ca-trust-extracted\") pod \"image-registry-697d97f7c8-2q9m5\" (UID: \"5778224c-9b34-45c0-9812-122b95cef431\") " pod="openshift-image-registry/image-registry-697d97f7c8-2q9m5" Mar 20 08:25:16 crc kubenswrapper[4903]: E0320 08:25:16.486109 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 08:25:16.986091684 +0000 UTC m=+142.202992069 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2q9m5" (UID: "5778224c-9b34-45c0-9812-122b95cef431") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.590671 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.591020 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-5dpx6" Mar 20 08:25:16 crc kubenswrapper[4903]: E0320 08:25:16.590995 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 08:25:17.090969941 +0000 UTC m=+142.307870256 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.591458 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n9xx7\" (UniqueName: \"kubernetes.io/projected/2c53f142-6367-49fd-8577-6141d9f82f73-kube-api-access-n9xx7\") pod \"catalog-operator-68c6474976-skg8s\" (UID: \"2c53f142-6367-49fd-8577-6141d9f82f73\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-skg8s" Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.591501 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-snlc5\" (UniqueName: \"kubernetes.io/projected/32d4db0f-b5bb-41d4-ae0d-0600e38892b1-kube-api-access-snlc5\") pod \"collect-profiles-29566575-t4mvb\" (UID: \"32d4db0f-b5bb-41d4-ae0d-0600e38892b1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566575-t4mvb" Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.591569 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4742c0a9-7786-4b7e-823e-e70630e72495-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-zbxgq\" (UID: \"4742c0a9-7786-4b7e-823e-e70630e72495\") " pod="openshift-marketplace/marketplace-operator-79b997595-zbxgq" Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.593933 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0ba96cb1-4795-45d3-a320-5ef3b87c644f-auth-proxy-config\") pod \"machine-config-operator-74547568cd-sjg9x\" (UID: \"0ba96cb1-4795-45d3-a320-5ef3b87c644f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-sjg9x" Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.594290 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5778224c-9b34-45c0-9812-122b95cef431-bound-sa-token\") pod \"image-registry-697d97f7c8-2q9m5\" (UID: \"5778224c-9b34-45c0-9812-122b95cef431\") " pod="openshift-image-registry/image-registry-697d97f7c8-2q9m5" Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.594313 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/a9d3e49c-9f8d-4fb8-a007-61378316daa8-certs\") pod \"machine-config-server-6smwd\" (UID: \"a9d3e49c-9f8d-4fb8-a007-61378316daa8\") " pod="openshift-machine-config-operator/machine-config-server-6smwd" Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.594350 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5778224c-9b34-45c0-9812-122b95cef431-trusted-ca\") pod \"image-registry-697d97f7c8-2q9m5\" (UID: \"5778224c-9b34-45c0-9812-122b95cef431\") " pod="openshift-image-registry/image-registry-697d97f7c8-2q9m5" Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.594370 4903 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x2rcw\" (UniqueName: \"kubernetes.io/projected/ca2849ec-27bf-41e4-b520-0ffc80e33e99-kube-api-access-x2rcw\") pod \"console-operator-58897d9998-5pjw9\" (UID: \"ca2849ec-27bf-41e4-b520-0ffc80e33e99\") " pod="openshift-console-operator/console-operator-58897d9998-5pjw9" Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.594423 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/2c53f142-6367-49fd-8577-6141d9f82f73-srv-cert\") pod \"catalog-operator-68c6474976-skg8s\" (UID: \"2c53f142-6367-49fd-8577-6141d9f82f73\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-skg8s" Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.594442 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/433bd27c-a67a-4487-b09e-523fd9b34b8f-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-spjl5\" (UID: \"433bd27c-a67a-4487-b09e-523fd9b34b8f\") " pod="openshift-multus/cni-sysctl-allowlist-ds-spjl5" Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.594561 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k5w54\" (UniqueName: \"kubernetes.io/projected/d7385d29-60c8-459c-8d9f-f87f697b4dcb-kube-api-access-k5w54\") pod \"dns-default-vc9fr\" (UID: \"d7385d29-60c8-459c-8d9f-f87f697b4dcb\") " pod="openshift-dns/dns-default-vc9fr" Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.594596 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n52sk\" (UniqueName: \"kubernetes.io/projected/fd2cbd5a-d35b-4911-9748-57911dbb5bbb-kube-api-access-n52sk\") pod \"machine-config-controller-84d6567774-sq6kr\" (UID: \"fd2cbd5a-d35b-4911-9748-57911dbb5bbb\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-sq6kr" Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.594615 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ca2849ec-27bf-41e4-b520-0ffc80e33e99-trusted-ca\") pod \"console-operator-58897d9998-5pjw9\" (UID: \"ca2849ec-27bf-41e4-b520-0ffc80e33e99\") " pod="openshift-console-operator/console-operator-58897d9998-5pjw9" Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.594715 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w2z4h\" (UniqueName: \"kubernetes.io/projected/4742c0a9-7786-4b7e-823e-e70630e72495-kube-api-access-w2z4h\") pod \"marketplace-operator-79b997595-zbxgq\" (UID: \"4742c0a9-7786-4b7e-823e-e70630e72495\") " pod="openshift-marketplace/marketplace-operator-79b997595-zbxgq" Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.594783 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cftvb\" (UniqueName: \"kubernetes.io/projected/8d670e7f-bbda-4168-87a6-baa6ce35177b-kube-api-access-cftvb\") pod \"csi-hostpathplugin-4zl69\" (UID: \"8d670e7f-bbda-4168-87a6-baa6ce35177b\") " pod="hostpath-provisioner/csi-hostpathplugin-4zl69" Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.594812 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/5778224c-9b34-45c0-9812-122b95cef431-installation-pull-secrets\") pod \"image-registry-697d97f7c8-2q9m5\" (UID: \"5778224c-9b34-45c0-9812-122b95cef431\") " pod="openshift-image-registry/image-registry-697d97f7c8-2q9m5" Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.594830 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/8d670e7f-bbda-4168-87a6-baa6ce35177b-registration-dir\") pod \"csi-hostpathplugin-4zl69\" (UID: \"8d670e7f-bbda-4168-87a6-baa6ce35177b\") " pod="hostpath-provisioner/csi-hostpathplugin-4zl69" Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.594883 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/8d670e7f-bbda-4168-87a6-baa6ce35177b-plugins-dir\") pod \"csi-hostpathplugin-4zl69\" (UID: \"8d670e7f-bbda-4168-87a6-baa6ce35177b\") " pod="hostpath-provisioner/csi-hostpathplugin-4zl69" Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.594898 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ca2849ec-27bf-41e4-b520-0ffc80e33e99-serving-cert\") pod \"console-operator-58897d9998-5pjw9\" (UID: \"ca2849ec-27bf-41e4-b520-0ffc80e33e99\") " pod="openshift-console-operator/console-operator-58897d9998-5pjw9" Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.594954 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2q9m5\" (UID: \"5778224c-9b34-45c0-9812-122b95cef431\") " pod="openshift-image-registry/image-registry-697d97f7c8-2q9m5" Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.594972 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5778224c-9b34-45c0-9812-122b95cef431-registry-tls\") pod \"image-registry-697d97f7c8-2q9m5\" (UID: \"5778224c-9b34-45c0-9812-122b95cef431\") " pod="openshift-image-registry/image-registry-697d97f7c8-2q9m5" Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.594990 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jlflt\" (UniqueName: \"kubernetes.io/projected/433bd27c-a67a-4487-b09e-523fd9b34b8f-kube-api-access-jlflt\") pod \"cni-sysctl-allowlist-ds-spjl5\" (UID: \"433bd27c-a67a-4487-b09e-523fd9b34b8f\") " pod="openshift-multus/cni-sysctl-allowlist-ds-spjl5" Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.595022 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/8d670e7f-bbda-4168-87a6-baa6ce35177b-socket-dir\") pod \"csi-hostpathplugin-4zl69\" (UID: \"8d670e7f-bbda-4168-87a6-baa6ce35177b\") " pod="hostpath-provisioner/csi-hostpathplugin-4zl69" Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.595075 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/2c53f142-6367-49fd-8577-6141d9f82f73-profile-collector-cert\") pod \"catalog-operator-68c6474976-skg8s\" (UID: 
\"2c53f142-6367-49fd-8577-6141d9f82f73\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-skg8s" Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.595102 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/433bd27c-a67a-4487-b09e-523fd9b34b8f-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-spjl5\" (UID: \"433bd27c-a67a-4487-b09e-523fd9b34b8f\") " pod="openshift-multus/cni-sysctl-allowlist-ds-spjl5" Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.596825 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5778224c-9b34-45c0-9812-122b95cef431-trusted-ca\") pod \"image-registry-697d97f7c8-2q9m5\" (UID: \"5778224c-9b34-45c0-9812-122b95cef431\") " pod="openshift-image-registry/image-registry-697d97f7c8-2q9m5" Mar 20 08:25:16 crc kubenswrapper[4903]: E0320 08:25:16.601002 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 08:25:17.100981891 +0000 UTC m=+142.317882206 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2q9m5" (UID: "5778224c-9b34-45c0-9812-122b95cef431") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.601597 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rd57p\" (UniqueName: \"kubernetes.io/projected/5bfc26b7-6752-4984-90fa-ed0099d0627c-kube-api-access-rd57p\") pod \"ingress-canary-pjn9j\" (UID: \"5bfc26b7-6752-4984-90fa-ed0099d0627c\") " pod="openshift-ingress-canary/ingress-canary-pjn9j" Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.601679 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0ba96cb1-4795-45d3-a320-5ef3b87c644f-proxy-tls\") pod \"machine-config-operator-74547568cd-sjg9x\" (UID: \"0ba96cb1-4795-45d3-a320-5ef3b87c644f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-sjg9x" Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.601731 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fd2cbd5a-d35b-4911-9748-57911dbb5bbb-proxy-tls\") pod \"machine-config-controller-84d6567774-sq6kr\" (UID: \"fd2cbd5a-d35b-4911-9748-57911dbb5bbb\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-sq6kr" Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.602456 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/0ba96cb1-4795-45d3-a320-5ef3b87c644f-images\") pod \"machine-config-operator-74547568cd-sjg9x\" (UID: \"0ba96cb1-4795-45d3-a320-5ef3b87c644f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-sjg9x" Mar 20 08:25:16 crc kubenswrapper[4903]: 
I0320 08:25:16.602497 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/8d670e7f-bbda-4168-87a6-baa6ce35177b-mountpoint-dir\") pod \"csi-hostpathplugin-4zl69\" (UID: \"8d670e7f-bbda-4168-87a6-baa6ce35177b\") " pod="hostpath-provisioner/csi-hostpathplugin-4zl69" Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.602517 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x9jhv\" (UniqueName: \"kubernetes.io/projected/80577d9f-8d2f-4cd9-8a6b-305ec6e2a612-kube-api-access-x9jhv\") pod \"migrator-59844c95c7-qqtv5\" (UID: \"80577d9f-8d2f-4cd9-8a6b-305ec6e2a612\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qqtv5" Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.602753 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/5778224c-9b34-45c0-9812-122b95cef431-registry-certificates\") pod \"image-registry-697d97f7c8-2q9m5\" (UID: \"5778224c-9b34-45c0-9812-122b95cef431\") " pod="openshift-image-registry/image-registry-697d97f7c8-2q9m5" Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.602842 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/433bd27c-a67a-4487-b09e-523fd9b34b8f-ready\") pod \"cni-sysctl-allowlist-ds-spjl5\" (UID: \"433bd27c-a67a-4487-b09e-523fd9b34b8f\") " pod="openshift-multus/cni-sysctl-allowlist-ds-spjl5" Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.603041 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/5778224c-9b34-45c0-9812-122b95cef431-ca-trust-extracted\") pod \"image-registry-697d97f7c8-2q9m5\" (UID: \"5778224c-9b34-45c0-9812-122b95cef431\") " pod="openshift-image-registry/image-registry-697d97f7c8-2q9m5" Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.603107 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-98wq4\" (UniqueName: \"kubernetes.io/projected/c2c16cc7-f269-4f15-b84f-9b46b14cd853-kube-api-access-98wq4\") pod \"multus-admission-controller-857f4d67dd-lw2n7\" (UID: \"c2c16cc7-f269-4f15-b84f-9b46b14cd853\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-lw2n7" Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.603300 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c8pqb\" (UniqueName: \"kubernetes.io/projected/a9d3e49c-9f8d-4fb8-a007-61378316daa8-kube-api-access-c8pqb\") pod \"machine-config-server-6smwd\" (UID: \"a9d3e49c-9f8d-4fb8-a007-61378316daa8\") " pod="openshift-machine-config-operator/machine-config-server-6smwd" Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.603460 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5bfc26b7-6752-4984-90fa-ed0099d0627c-cert\") pod \"ingress-canary-pjn9j\" (UID: \"5bfc26b7-6752-4984-90fa-ed0099d0627c\") " pod="openshift-ingress-canary/ingress-canary-pjn9j" Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.603489 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: 
\"kubernetes.io/secret/c2c16cc7-f269-4f15-b84f-9b46b14cd853-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-lw2n7\" (UID: \"c2c16cc7-f269-4f15-b84f-9b46b14cd853\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-lw2n7" Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.603888 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d7385d29-60c8-459c-8d9f-f87f697b4dcb-metrics-tls\") pod \"dns-default-vc9fr\" (UID: \"d7385d29-60c8-459c-8d9f-f87f697b4dcb\") " pod="openshift-dns/dns-default-vc9fr" Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.603926 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/5778224c-9b34-45c0-9812-122b95cef431-registry-certificates\") pod \"image-registry-697d97f7c8-2q9m5\" (UID: \"5778224c-9b34-45c0-9812-122b95cef431\") " pod="openshift-image-registry/image-registry-697d97f7c8-2q9m5" Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.604189 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dnthm\" (UniqueName: \"kubernetes.io/projected/0ba96cb1-4795-45d3-a320-5ef3b87c644f-kube-api-access-dnthm\") pod \"machine-config-operator-74547568cd-sjg9x\" (UID: \"0ba96cb1-4795-45d3-a320-5ef3b87c644f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-sjg9x" Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.604231 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/5778224c-9b34-45c0-9812-122b95cef431-ca-trust-extracted\") pod \"image-registry-697d97f7c8-2q9m5\" (UID: \"5778224c-9b34-45c0-9812-122b95cef431\") " pod="openshift-image-registry/image-registry-697d97f7c8-2q9m5" Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.604308 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b5g2z\" (UniqueName: \"kubernetes.io/projected/faa861a0-c46b-46d2-8f2f-6c8e70f403ec-kube-api-access-b5g2z\") pod \"control-plane-machine-set-operator-78cbb6b69f-9zwz2\" (UID: \"faa861a0-c46b-46d2-8f2f-6c8e70f403ec\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-9zwz2" Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.604363 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ca2849ec-27bf-41e4-b520-0ffc80e33e99-config\") pod \"console-operator-58897d9998-5pjw9\" (UID: \"ca2849ec-27bf-41e4-b520-0ffc80e33e99\") " pod="openshift-console-operator/console-operator-58897d9998-5pjw9" Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.604403 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/a9d3e49c-9f8d-4fb8-a007-61378316daa8-node-bootstrap-token\") pod \"machine-config-server-6smwd\" (UID: \"a9d3e49c-9f8d-4fb8-a007-61378316daa8\") " pod="openshift-machine-config-operator/machine-config-server-6smwd" Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.604422 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: 
\"kubernetes.io/secret/4742c0a9-7786-4b7e-823e-e70630e72495-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-zbxgq\" (UID: \"4742c0a9-7786-4b7e-823e-e70630e72495\") " pod="openshift-marketplace/marketplace-operator-79b997595-zbxgq" Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.604466 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/32d4db0f-b5bb-41d4-ae0d-0600e38892b1-secret-volume\") pod \"collect-profiles-29566575-t4mvb\" (UID: \"32d4db0f-b5bb-41d4-ae0d-0600e38892b1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566575-t4mvb" Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.604487 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fd2cbd5a-d35b-4911-9748-57911dbb5bbb-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-sq6kr\" (UID: \"fd2cbd5a-d35b-4911-9748-57911dbb5bbb\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-sq6kr" Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.606946 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5778224c-9b34-45c0-9812-122b95cef431-registry-tls\") pod \"image-registry-697d97f7c8-2q9m5\" (UID: \"5778224c-9b34-45c0-9812-122b95cef431\") " pod="openshift-image-registry/image-registry-697d97f7c8-2q9m5" Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.607973 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/5778224c-9b34-45c0-9812-122b95cef431-installation-pull-secrets\") pod \"image-registry-697d97f7c8-2q9m5\" (UID: \"5778224c-9b34-45c0-9812-122b95cef431\") " pod="openshift-image-registry/image-registry-697d97f7c8-2q9m5" Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.608259 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/faa861a0-c46b-46d2-8f2f-6c8e70f403ec-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-9zwz2\" (UID: \"faa861a0-c46b-46d2-8f2f-6c8e70f403ec\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-9zwz2" Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.608310 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zvd5x\" (UniqueName: \"kubernetes.io/projected/5778224c-9b34-45c0-9812-122b95cef431-kube-api-access-zvd5x\") pod \"image-registry-697d97f7c8-2q9m5\" (UID: \"5778224c-9b34-45c0-9812-122b95cef431\") " pod="openshift-image-registry/image-registry-697d97f7c8-2q9m5" Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.608350 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/32d4db0f-b5bb-41d4-ae0d-0600e38892b1-config-volume\") pod \"collect-profiles-29566575-t4mvb\" (UID: \"32d4db0f-b5bb-41d4-ae0d-0600e38892b1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566575-t4mvb" Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.608371 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/d7385d29-60c8-459c-8d9f-f87f697b4dcb-config-volume\") pod \"dns-default-vc9fr\" (UID: \"d7385d29-60c8-459c-8d9f-f87f697b4dcb\") " pod="openshift-dns/dns-default-vc9fr" Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.608393 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/8d670e7f-bbda-4168-87a6-baa6ce35177b-csi-data-dir\") pod \"csi-hostpathplugin-4zl69\" (UID: \"8d670e7f-bbda-4168-87a6-baa6ce35177b\") " pod="hostpath-provisioner/csi-hostpathplugin-4zl69" Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.613276 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n9xx7\" (UniqueName: \"kubernetes.io/projected/2c53f142-6367-49fd-8577-6141d9f82f73-kube-api-access-n9xx7\") pod \"catalog-operator-68c6474976-skg8s\" (UID: \"2c53f142-6367-49fd-8577-6141d9f82f73\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-skg8s" Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.615890 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/2c53f142-6367-49fd-8577-6141d9f82f73-srv-cert\") pod \"catalog-operator-68c6474976-skg8s\" (UID: \"2c53f142-6367-49fd-8577-6141d9f82f73\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-skg8s" Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.629499 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/2c53f142-6367-49fd-8577-6141d9f82f73-profile-collector-cert\") pod \"catalog-operator-68c6474976-skg8s\" (UID: \"2c53f142-6367-49fd-8577-6141d9f82f73\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-skg8s" Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.648817 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5778224c-9b34-45c0-9812-122b95cef431-bound-sa-token\") pod \"image-registry-697d97f7c8-2q9m5\" (UID: \"5778224c-9b34-45c0-9812-122b95cef431\") " pod="openshift-image-registry/image-registry-697d97f7c8-2q9m5" Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.700630 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-s5fmp"] Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.708656 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zvd5x\" (UniqueName: \"kubernetes.io/projected/5778224c-9b34-45c0-9812-122b95cef431-kube-api-access-zvd5x\") pod \"image-registry-697d97f7c8-2q9m5\" (UID: \"5778224c-9b34-45c0-9812-122b95cef431\") " pod="openshift-image-registry/image-registry-697d97f7c8-2q9m5" Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.713821 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 08:25:16 crc kubenswrapper[4903]: E0320 08:25:16.713938 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b 
nodeName:}" failed. No retries permitted until 2026-03-20 08:25:17.213914404 +0000 UTC m=+142.430814719 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.714190 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/8d670e7f-bbda-4168-87a6-baa6ce35177b-plugins-dir\") pod \"csi-hostpathplugin-4zl69\" (UID: \"8d670e7f-bbda-4168-87a6-baa6ce35177b\") " pod="hostpath-provisioner/csi-hostpathplugin-4zl69" Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.714215 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ca2849ec-27bf-41e4-b520-0ffc80e33e99-serving-cert\") pod \"console-operator-58897d9998-5pjw9\" (UID: \"ca2849ec-27bf-41e4-b520-0ffc80e33e99\") " pod="openshift-console-operator/console-operator-58897d9998-5pjw9" Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.714246 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2q9m5\" (UID: \"5778224c-9b34-45c0-9812-122b95cef431\") " pod="openshift-image-registry/image-registry-697d97f7c8-2q9m5" Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.714268 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jlflt\" (UniqueName: \"kubernetes.io/projected/433bd27c-a67a-4487-b09e-523fd9b34b8f-kube-api-access-jlflt\") pod \"cni-sysctl-allowlist-ds-spjl5\" (UID: \"433bd27c-a67a-4487-b09e-523fd9b34b8f\") " pod="openshift-multus/cni-sysctl-allowlist-ds-spjl5" Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.714295 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/8d670e7f-bbda-4168-87a6-baa6ce35177b-socket-dir\") pod \"csi-hostpathplugin-4zl69\" (UID: \"8d670e7f-bbda-4168-87a6-baa6ce35177b\") " pod="hostpath-provisioner/csi-hostpathplugin-4zl69" Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.714607 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/8d670e7f-bbda-4168-87a6-baa6ce35177b-plugins-dir\") pod \"csi-hostpathplugin-4zl69\" (UID: \"8d670e7f-bbda-4168-87a6-baa6ce35177b\") " pod="hostpath-provisioner/csi-hostpathplugin-4zl69" Mar 20 08:25:16 crc kubenswrapper[4903]: E0320 08:25:16.714620 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 08:25:17.214612473 +0000 UTC m=+142.431512788 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2q9m5" (UID: "5778224c-9b34-45c0-9812-122b95cef431") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.715145 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/433bd27c-a67a-4487-b09e-523fd9b34b8f-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-spjl5\" (UID: \"433bd27c-a67a-4487-b09e-523fd9b34b8f\") " pod="openshift-multus/cni-sysctl-allowlist-ds-spjl5" Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.715185 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rd57p\" (UniqueName: \"kubernetes.io/projected/5bfc26b7-6752-4984-90fa-ed0099d0627c-kube-api-access-rd57p\") pod \"ingress-canary-pjn9j\" (UID: \"5bfc26b7-6752-4984-90fa-ed0099d0627c\") " pod="openshift-ingress-canary/ingress-canary-pjn9j" Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.715198 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/8d670e7f-bbda-4168-87a6-baa6ce35177b-socket-dir\") pod \"csi-hostpathplugin-4zl69\" (UID: \"8d670e7f-bbda-4168-87a6-baa6ce35177b\") " pod="hostpath-provisioner/csi-hostpathplugin-4zl69" Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.715210 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0ba96cb1-4795-45d3-a320-5ef3b87c644f-proxy-tls\") pod \"machine-config-operator-74547568cd-sjg9x\" (UID: \"0ba96cb1-4795-45d3-a320-5ef3b87c644f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-sjg9x" Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.715237 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fd2cbd5a-d35b-4911-9748-57911dbb5bbb-proxy-tls\") pod \"machine-config-controller-84d6567774-sq6kr\" (UID: \"fd2cbd5a-d35b-4911-9748-57911dbb5bbb\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-sq6kr" Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.715259 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/0ba96cb1-4795-45d3-a320-5ef3b87c644f-images\") pod \"machine-config-operator-74547568cd-sjg9x\" (UID: \"0ba96cb1-4795-45d3-a320-5ef3b87c644f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-sjg9x" Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.715279 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/8d670e7f-bbda-4168-87a6-baa6ce35177b-mountpoint-dir\") pod \"csi-hostpathplugin-4zl69\" (UID: \"8d670e7f-bbda-4168-87a6-baa6ce35177b\") " pod="hostpath-provisioner/csi-hostpathplugin-4zl69" Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.715299 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x9jhv\" (UniqueName: 
\"kubernetes.io/projected/80577d9f-8d2f-4cd9-8a6b-305ec6e2a612-kube-api-access-x9jhv\") pod \"migrator-59844c95c7-qqtv5\" (UID: \"80577d9f-8d2f-4cd9-8a6b-305ec6e2a612\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qqtv5" Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.715320 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/433bd27c-a67a-4487-b09e-523fd9b34b8f-ready\") pod \"cni-sysctl-allowlist-ds-spjl5\" (UID: \"433bd27c-a67a-4487-b09e-523fd9b34b8f\") " pod="openshift-multus/cni-sysctl-allowlist-ds-spjl5" Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.715340 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-98wq4\" (UniqueName: \"kubernetes.io/projected/c2c16cc7-f269-4f15-b84f-9b46b14cd853-kube-api-access-98wq4\") pod \"multus-admission-controller-857f4d67dd-lw2n7\" (UID: \"c2c16cc7-f269-4f15-b84f-9b46b14cd853\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-lw2n7" Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.715369 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c8pqb\" (UniqueName: \"kubernetes.io/projected/a9d3e49c-9f8d-4fb8-a007-61378316daa8-kube-api-access-c8pqb\") pod \"machine-config-server-6smwd\" (UID: \"a9d3e49c-9f8d-4fb8-a007-61378316daa8\") " pod="openshift-machine-config-operator/machine-config-server-6smwd" Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.715391 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/c2c16cc7-f269-4f15-b84f-9b46b14cd853-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-lw2n7\" (UID: \"c2c16cc7-f269-4f15-b84f-9b46b14cd853\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-lw2n7" Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.715412 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5bfc26b7-6752-4984-90fa-ed0099d0627c-cert\") pod \"ingress-canary-pjn9j\" (UID: \"5bfc26b7-6752-4984-90fa-ed0099d0627c\") " pod="openshift-ingress-canary/ingress-canary-pjn9j" Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.715433 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d7385d29-60c8-459c-8d9f-f87f697b4dcb-metrics-tls\") pod \"dns-default-vc9fr\" (UID: \"d7385d29-60c8-459c-8d9f-f87f697b4dcb\") " pod="openshift-dns/dns-default-vc9fr" Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.715453 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dnthm\" (UniqueName: \"kubernetes.io/projected/0ba96cb1-4795-45d3-a320-5ef3b87c644f-kube-api-access-dnthm\") pod \"machine-config-operator-74547568cd-sjg9x\" (UID: \"0ba96cb1-4795-45d3-a320-5ef3b87c644f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-sjg9x" Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.715474 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b5g2z\" (UniqueName: \"kubernetes.io/projected/faa861a0-c46b-46d2-8f2f-6c8e70f403ec-kube-api-access-b5g2z\") pod \"control-plane-machine-set-operator-78cbb6b69f-9zwz2\" (UID: \"faa861a0-c46b-46d2-8f2f-6c8e70f403ec\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-9zwz2" 
Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.715496 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ca2849ec-27bf-41e4-b520-0ffc80e33e99-config\") pod \"console-operator-58897d9998-5pjw9\" (UID: \"ca2849ec-27bf-41e4-b520-0ffc80e33e99\") " pod="openshift-console-operator/console-operator-58897d9998-5pjw9" Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.715516 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/4742c0a9-7786-4b7e-823e-e70630e72495-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-zbxgq\" (UID: \"4742c0a9-7786-4b7e-823e-e70630e72495\") " pod="openshift-marketplace/marketplace-operator-79b997595-zbxgq" Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.715535 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/a9d3e49c-9f8d-4fb8-a007-61378316daa8-node-bootstrap-token\") pod \"machine-config-server-6smwd\" (UID: \"a9d3e49c-9f8d-4fb8-a007-61378316daa8\") " pod="openshift-machine-config-operator/machine-config-server-6smwd" Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.715555 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fd2cbd5a-d35b-4911-9748-57911dbb5bbb-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-sq6kr\" (UID: \"fd2cbd5a-d35b-4911-9748-57911dbb5bbb\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-sq6kr" Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.715583 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/32d4db0f-b5bb-41d4-ae0d-0600e38892b1-secret-volume\") pod \"collect-profiles-29566575-t4mvb\" (UID: \"32d4db0f-b5bb-41d4-ae0d-0600e38892b1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566575-t4mvb" Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.715620 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d7385d29-60c8-459c-8d9f-f87f697b4dcb-config-volume\") pod \"dns-default-vc9fr\" (UID: \"d7385d29-60c8-459c-8d9f-f87f697b4dcb\") " pod="openshift-dns/dns-default-vc9fr" Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.715638 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/faa861a0-c46b-46d2-8f2f-6c8e70f403ec-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-9zwz2\" (UID: \"faa861a0-c46b-46d2-8f2f-6c8e70f403ec\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-9zwz2" Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.715659 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/32d4db0f-b5bb-41d4-ae0d-0600e38892b1-config-volume\") pod \"collect-profiles-29566575-t4mvb\" (UID: \"32d4db0f-b5bb-41d4-ae0d-0600e38892b1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566575-t4mvb" Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.715677 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/8d670e7f-bbda-4168-87a6-baa6ce35177b-csi-data-dir\") pod \"csi-hostpathplugin-4zl69\" (UID: \"8d670e7f-bbda-4168-87a6-baa6ce35177b\") " pod="hostpath-provisioner/csi-hostpathplugin-4zl69" Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.715699 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-snlc5\" (UniqueName: \"kubernetes.io/projected/32d4db0f-b5bb-41d4-ae0d-0600e38892b1-kube-api-access-snlc5\") pod \"collect-profiles-29566575-t4mvb\" (UID: \"32d4db0f-b5bb-41d4-ae0d-0600e38892b1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566575-t4mvb" Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.715717 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4742c0a9-7786-4b7e-823e-e70630e72495-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-zbxgq\" (UID: \"4742c0a9-7786-4b7e-823e-e70630e72495\") " pod="openshift-marketplace/marketplace-operator-79b997595-zbxgq" Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.715740 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0ba96cb1-4795-45d3-a320-5ef3b87c644f-auth-proxy-config\") pod \"machine-config-operator-74547568cd-sjg9x\" (UID: \"0ba96cb1-4795-45d3-a320-5ef3b87c644f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-sjg9x" Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.715768 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/a9d3e49c-9f8d-4fb8-a007-61378316daa8-certs\") pod \"machine-config-server-6smwd\" (UID: \"a9d3e49c-9f8d-4fb8-a007-61378316daa8\") " pod="openshift-machine-config-operator/machine-config-server-6smwd" Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.715787 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x2rcw\" (UniqueName: \"kubernetes.io/projected/ca2849ec-27bf-41e4-b520-0ffc80e33e99-kube-api-access-x2rcw\") pod \"console-operator-58897d9998-5pjw9\" (UID: \"ca2849ec-27bf-41e4-b520-0ffc80e33e99\") " pod="openshift-console-operator/console-operator-58897d9998-5pjw9" Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.715805 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/433bd27c-a67a-4487-b09e-523fd9b34b8f-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-spjl5\" (UID: \"433bd27c-a67a-4487-b09e-523fd9b34b8f\") " pod="openshift-multus/cni-sysctl-allowlist-ds-spjl5" Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.715839 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k5w54\" (UniqueName: \"kubernetes.io/projected/d7385d29-60c8-459c-8d9f-f87f697b4dcb-kube-api-access-k5w54\") pod \"dns-default-vc9fr\" (UID: \"d7385d29-60c8-459c-8d9f-f87f697b4dcb\") " pod="openshift-dns/dns-default-vc9fr" Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.715858 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n52sk\" (UniqueName: \"kubernetes.io/projected/fd2cbd5a-d35b-4911-9748-57911dbb5bbb-kube-api-access-n52sk\") pod \"machine-config-controller-84d6567774-sq6kr\" (UID: \"fd2cbd5a-d35b-4911-9748-57911dbb5bbb\") " 
pod="openshift-machine-config-operator/machine-config-controller-84d6567774-sq6kr" Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.715858 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/433bd27c-a67a-4487-b09e-523fd9b34b8f-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-spjl5\" (UID: \"433bd27c-a67a-4487-b09e-523fd9b34b8f\") " pod="openshift-multus/cni-sysctl-allowlist-ds-spjl5" Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.715878 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ca2849ec-27bf-41e4-b520-0ffc80e33e99-trusted-ca\") pod \"console-operator-58897d9998-5pjw9\" (UID: \"ca2849ec-27bf-41e4-b520-0ffc80e33e99\") " pod="openshift-console-operator/console-operator-58897d9998-5pjw9" Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.715903 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w2z4h\" (UniqueName: \"kubernetes.io/projected/4742c0a9-7786-4b7e-823e-e70630e72495-kube-api-access-w2z4h\") pod \"marketplace-operator-79b997595-zbxgq\" (UID: \"4742c0a9-7786-4b7e-823e-e70630e72495\") " pod="openshift-marketplace/marketplace-operator-79b997595-zbxgq" Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.715930 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cftvb\" (UniqueName: \"kubernetes.io/projected/8d670e7f-bbda-4168-87a6-baa6ce35177b-kube-api-access-cftvb\") pod \"csi-hostpathplugin-4zl69\" (UID: \"8d670e7f-bbda-4168-87a6-baa6ce35177b\") " pod="hostpath-provisioner/csi-hostpathplugin-4zl69" Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.715951 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/8d670e7f-bbda-4168-87a6-baa6ce35177b-registration-dir\") pod \"csi-hostpathplugin-4zl69\" (UID: \"8d670e7f-bbda-4168-87a6-baa6ce35177b\") " pod="hostpath-provisioner/csi-hostpathplugin-4zl69" Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.716059 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/8d670e7f-bbda-4168-87a6-baa6ce35177b-registration-dir\") pod \"csi-hostpathplugin-4zl69\" (UID: \"8d670e7f-bbda-4168-87a6-baa6ce35177b\") " pod="hostpath-provisioner/csi-hostpathplugin-4zl69" Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.716942 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0ba96cb1-4795-45d3-a320-5ef3b87c644f-auth-proxy-config\") pod \"machine-config-operator-74547568cd-sjg9x\" (UID: \"0ba96cb1-4795-45d3-a320-5ef3b87c644f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-sjg9x" Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.731654 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/8d670e7f-bbda-4168-87a6-baa6ce35177b-mountpoint-dir\") pod \"csi-hostpathplugin-4zl69\" (UID: \"8d670e7f-bbda-4168-87a6-baa6ce35177b\") " pod="hostpath-provisioner/csi-hostpathplugin-4zl69" Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.736741 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: 
\"kubernetes.io/secret/4742c0a9-7786-4b7e-823e-e70630e72495-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-zbxgq\" (UID: \"4742c0a9-7786-4b7e-823e-e70630e72495\") " pod="openshift-marketplace/marketplace-operator-79b997595-zbxgq" Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.736893 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fd2cbd5a-d35b-4911-9748-57911dbb5bbb-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-sq6kr\" (UID: \"fd2cbd5a-d35b-4911-9748-57911dbb5bbb\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-sq6kr" Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.737672 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/433bd27c-a67a-4487-b09e-523fd9b34b8f-ready\") pod \"cni-sysctl-allowlist-ds-spjl5\" (UID: \"433bd27c-a67a-4487-b09e-523fd9b34b8f\") " pod="openshift-multus/cni-sysctl-allowlist-ds-spjl5" Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.737807 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/8d670e7f-bbda-4168-87a6-baa6ce35177b-csi-data-dir\") pod \"csi-hostpathplugin-4zl69\" (UID: \"8d670e7f-bbda-4168-87a6-baa6ce35177b\") " pod="hostpath-provisioner/csi-hostpathplugin-4zl69" Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.738490 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/433bd27c-a67a-4487-b09e-523fd9b34b8f-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-spjl5\" (UID: \"433bd27c-a67a-4487-b09e-523fd9b34b8f\") " pod="openshift-multus/cni-sysctl-allowlist-ds-spjl5" Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.739012 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/0ba96cb1-4795-45d3-a320-5ef3b87c644f-images\") pod \"machine-config-operator-74547568cd-sjg9x\" (UID: \"0ba96cb1-4795-45d3-a320-5ef3b87c644f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-sjg9x" Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.739733 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4742c0a9-7786-4b7e-823e-e70630e72495-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-zbxgq\" (UID: \"4742c0a9-7786-4b7e-823e-e70630e72495\") " pod="openshift-marketplace/marketplace-operator-79b997595-zbxgq" Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.740108 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ca2849ec-27bf-41e4-b520-0ffc80e33e99-trusted-ca\") pod \"console-operator-58897d9998-5pjw9\" (UID: \"ca2849ec-27bf-41e4-b520-0ffc80e33e99\") " pod="openshift-console-operator/console-operator-58897d9998-5pjw9" Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.740817 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/32d4db0f-b5bb-41d4-ae0d-0600e38892b1-config-volume\") pod \"collect-profiles-29566575-t4mvb\" (UID: \"32d4db0f-b5bb-41d4-ae0d-0600e38892b1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566575-t4mvb" Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 
08:25:16.743344 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0ba96cb1-4795-45d3-a320-5ef3b87c644f-proxy-tls\") pod \"machine-config-operator-74547568cd-sjg9x\" (UID: \"0ba96cb1-4795-45d3-a320-5ef3b87c644f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-sjg9x" Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.753676 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-skg8s" Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.760691 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d7385d29-60c8-459c-8d9f-f87f697b4dcb-metrics-tls\") pod \"dns-default-vc9fr\" (UID: \"d7385d29-60c8-459c-8d9f-f87f697b4dcb\") " pod="openshift-dns/dns-default-vc9fr" Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.761584 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jlflt\" (UniqueName: \"kubernetes.io/projected/433bd27c-a67a-4487-b09e-523fd9b34b8f-kube-api-access-jlflt\") pod \"cni-sysctl-allowlist-ds-spjl5\" (UID: \"433bd27c-a67a-4487-b09e-523fd9b34b8f\") " pod="openshift-multus/cni-sysctl-allowlist-ds-spjl5" Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.764897 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/32d4db0f-b5bb-41d4-ae0d-0600e38892b1-secret-volume\") pod \"collect-profiles-29566575-t4mvb\" (UID: \"32d4db0f-b5bb-41d4-ae0d-0600e38892b1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566575-t4mvb" Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.772600 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d7385d29-60c8-459c-8d9f-f87f697b4dcb-config-volume\") pod \"dns-default-vc9fr\" (UID: \"d7385d29-60c8-459c-8d9f-f87f697b4dcb\") " pod="openshift-dns/dns-default-vc9fr" Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.773674 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ca2849ec-27bf-41e4-b520-0ffc80e33e99-serving-cert\") pod \"console-operator-58897d9998-5pjw9\" (UID: \"ca2849ec-27bf-41e4-b520-0ffc80e33e99\") " pod="openshift-console-operator/console-operator-58897d9998-5pjw9" Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.775204 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ca2849ec-27bf-41e4-b520-0ffc80e33e99-config\") pod \"console-operator-58897d9998-5pjw9\" (UID: \"ca2849ec-27bf-41e4-b520-0ffc80e33e99\") " pod="openshift-console-operator/console-operator-58897d9998-5pjw9" Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.795942 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/faa861a0-c46b-46d2-8f2f-6c8e70f403ec-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-9zwz2\" (UID: \"faa861a0-c46b-46d2-8f2f-6c8e70f403ec\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-9zwz2" Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.805886 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/5bfc26b7-6752-4984-90fa-ed0099d0627c-cert\") pod \"ingress-canary-pjn9j\" (UID: \"5bfc26b7-6752-4984-90fa-ed0099d0627c\") " pod="openshift-ingress-canary/ingress-canary-pjn9j" Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.806130 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n52sk\" (UniqueName: \"kubernetes.io/projected/fd2cbd5a-d35b-4911-9748-57911dbb5bbb-kube-api-access-n52sk\") pod \"machine-config-controller-84d6567774-sq6kr\" (UID: \"fd2cbd5a-d35b-4911-9748-57911dbb5bbb\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-sq6kr" Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.816704 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 08:25:16 crc kubenswrapper[4903]: E0320 08:25:16.817414 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 08:25:17.317389421 +0000 UTC m=+142.534289746 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.842361 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w2z4h\" (UniqueName: \"kubernetes.io/projected/4742c0a9-7786-4b7e-823e-e70630e72495-kube-api-access-w2z4h\") pod \"marketplace-operator-79b997595-zbxgq\" (UID: \"4742c0a9-7786-4b7e-823e-e70630e72495\") " pod="openshift-marketplace/marketplace-operator-79b997595-zbxgq" Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.844525 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/a9d3e49c-9f8d-4fb8-a007-61378316daa8-certs\") pod \"machine-config-server-6smwd\" (UID: \"a9d3e49c-9f8d-4fb8-a007-61378316daa8\") " pod="openshift-machine-config-operator/machine-config-server-6smwd" Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.844699 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/c2c16cc7-f269-4f15-b84f-9b46b14cd853-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-lw2n7\" (UID: \"c2c16cc7-f269-4f15-b84f-9b46b14cd853\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-lw2n7" Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.852840 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/a9d3e49c-9f8d-4fb8-a007-61378316daa8-node-bootstrap-token\") pod \"machine-config-server-6smwd\" (UID: \"a9d3e49c-9f8d-4fb8-a007-61378316daa8\") " pod="openshift-machine-config-operator/machine-config-server-6smwd" Mar 20 08:25:16 crc 
kubenswrapper[4903]: I0320 08:25:16.870765 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-spjl5" Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.872426 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-h4m4s"] Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.874930 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fd2cbd5a-d35b-4911-9748-57911dbb5bbb-proxy-tls\") pod \"machine-config-controller-84d6567774-sq6kr\" (UID: \"fd2cbd5a-d35b-4911-9748-57911dbb5bbb\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-sq6kr" Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.901869 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cftvb\" (UniqueName: \"kubernetes.io/projected/8d670e7f-bbda-4168-87a6-baa6ce35177b-kube-api-access-cftvb\") pod \"csi-hostpathplugin-4zl69\" (UID: \"8d670e7f-bbda-4168-87a6-baa6ce35177b\") " pod="hostpath-provisioner/csi-hostpathplugin-4zl69" Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.905912 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-98wq4\" (UniqueName: \"kubernetes.io/projected/c2c16cc7-f269-4f15-b84f-9b46b14cd853-kube-api-access-98wq4\") pod \"multus-admission-controller-857f4d67dd-lw2n7\" (UID: \"c2c16cc7-f269-4f15-b84f-9b46b14cd853\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-lw2n7" Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.909975 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fhkp9"] Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.917838 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-d5tc5"] Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.919413 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2q9m5\" (UID: \"5778224c-9b34-45c0-9812-122b95cef431\") " pod="openshift-image-registry/image-registry-697d97f7c8-2q9m5" Mar 20 08:25:16 crc kubenswrapper[4903]: E0320 08:25:16.919943 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 08:25:17.419921152 +0000 UTC m=+142.636821467 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2q9m5" (UID: "5778224c-9b34-45c0-9812-122b95cef431") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.920409 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k5w54\" (UniqueName: \"kubernetes.io/projected/d7385d29-60c8-459c-8d9f-f87f697b4dcb-kube-api-access-k5w54\") pod \"dns-default-vc9fr\" (UID: \"d7385d29-60c8-459c-8d9f-f87f697b4dcb\") " pod="openshift-dns/dns-default-vc9fr" Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.932384 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rd57p\" (UniqueName: \"kubernetes.io/projected/5bfc26b7-6752-4984-90fa-ed0099d0627c-kube-api-access-rd57p\") pod \"ingress-canary-pjn9j\" (UID: \"5bfc26b7-6752-4984-90fa-ed0099d0627c\") " pod="openshift-ingress-canary/ingress-canary-pjn9j" Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.938868 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b5g2z\" (UniqueName: \"kubernetes.io/projected/faa861a0-c46b-46d2-8f2f-6c8e70f403ec-kube-api-access-b5g2z\") pod \"control-plane-machine-set-operator-78cbb6b69f-9zwz2\" (UID: \"faa861a0-c46b-46d2-8f2f-6c8e70f403ec\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-9zwz2" Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.942295 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c8pqb\" (UniqueName: \"kubernetes.io/projected/a9d3e49c-9f8d-4fb8-a007-61378316daa8-kube-api-access-c8pqb\") pod \"machine-config-server-6smwd\" (UID: \"a9d3e49c-9f8d-4fb8-a007-61378316daa8\") " pod="openshift-machine-config-operator/machine-config-server-6smwd" Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.950133 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-p8nlt"] Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.973427 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x9jhv\" (UniqueName: \"kubernetes.io/projected/80577d9f-8d2f-4cd9-8a6b-305ec6e2a612-kube-api-access-x9jhv\") pod \"migrator-59844c95c7-qqtv5\" (UID: \"80577d9f-8d2f-4cd9-8a6b-305ec6e2a612\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qqtv5" Mar 20 08:25:16 crc kubenswrapper[4903]: I0320 08:25:16.991946 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-snlc5\" (UniqueName: \"kubernetes.io/projected/32d4db0f-b5bb-41d4-ae0d-0600e38892b1-kube-api-access-snlc5\") pod \"collect-profiles-29566575-t4mvb\" (UID: \"32d4db0f-b5bb-41d4-ae0d-0600e38892b1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566575-t4mvb" Mar 20 08:25:17 crc kubenswrapper[4903]: I0320 08:25:17.008583 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x2rcw\" (UniqueName: \"kubernetes.io/projected/ca2849ec-27bf-41e4-b520-0ffc80e33e99-kube-api-access-x2rcw\") pod \"console-operator-58897d9998-5pjw9\" (UID: \"ca2849ec-27bf-41e4-b520-0ffc80e33e99\") " 
pod="openshift-console-operator/console-operator-58897d9998-5pjw9" Mar 20 08:25:17 crc kubenswrapper[4903]: I0320 08:25:17.021655 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 08:25:17 crc kubenswrapper[4903]: E0320 08:25:17.022185 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 08:25:17.522160655 +0000 UTC m=+142.739060970 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 08:25:17 crc kubenswrapper[4903]: I0320 08:25:17.077133 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-zbxgq" Mar 20 08:25:17 crc kubenswrapper[4903]: I0320 08:25:17.091677 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qqtv5" Mar 20 08:25:17 crc kubenswrapper[4903]: I0320 08:25:17.105796 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-9zwz2" Mar 20 08:25:17 crc kubenswrapper[4903]: I0320 08:25:17.112464 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dnthm\" (UniqueName: \"kubernetes.io/projected/0ba96cb1-4795-45d3-a320-5ef3b87c644f-kube-api-access-dnthm\") pod \"machine-config-operator-74547568cd-sjg9x\" (UID: \"0ba96cb1-4795-45d3-a320-5ef3b87c644f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-sjg9x" Mar 20 08:25:17 crc kubenswrapper[4903]: I0320 08:25:17.121577 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-lw2n7" Mar 20 08:25:17 crc kubenswrapper[4903]: I0320 08:25:17.124241 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2q9m5\" (UID: \"5778224c-9b34-45c0-9812-122b95cef431\") " pod="openshift-image-registry/image-registry-697d97f7c8-2q9m5" Mar 20 08:25:17 crc kubenswrapper[4903]: E0320 08:25:17.124796 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 08:25:17.624779128 +0000 UTC m=+142.841679443 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2q9m5" (UID: "5778224c-9b34-45c0-9812-122b95cef431") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 08:25:17 crc kubenswrapper[4903]: I0320 08:25:17.140396 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-sq6kr" Mar 20 08:25:17 crc kubenswrapper[4903]: I0320 08:25:17.150890 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-5pjw9" Mar 20 08:25:17 crc kubenswrapper[4903]: I0320 08:25:17.155457 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566575-t4mvb" Mar 20 08:25:17 crc kubenswrapper[4903]: I0320 08:25:17.162522 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-vc9fr" Mar 20 08:25:17 crc kubenswrapper[4903]: I0320 08:25:17.177233 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-6smwd" Mar 20 08:25:17 crc kubenswrapper[4903]: W0320 08:25:17.197353 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfe5d5814_39d5_43a9_9352_f47c80603c75.slice/crio-443b027f92dd8ab3f3cefe2ba8784707d08def0390b46ad54e547e4d885af637 WatchSource:0}: Error finding container 443b027f92dd8ab3f3cefe2ba8784707d08def0390b46ad54e547e4d885af637: Status 404 returned error can't find the container with id 443b027f92dd8ab3f3cefe2ba8784707d08def0390b46ad54e547e4d885af637 Mar 20 08:25:17 crc kubenswrapper[4903]: I0320 08:25:17.197424 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-4zl69" Mar 20 08:25:17 crc kubenswrapper[4903]: I0320 08:25:17.206320 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-pjn9j" Mar 20 08:25:17 crc kubenswrapper[4903]: I0320 08:25:17.258943 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 08:25:17 crc kubenswrapper[4903]: E0320 08:25:17.259308 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 08:25:17.759287734 +0000 UTC m=+142.976188049 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 08:25:17 crc kubenswrapper[4903]: I0320 08:25:17.330685 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-qq7hw" event={"ID":"4218cb22-8dd6-45fd-8091-635dd8a305bf","Type":"ContainerStarted","Data":"d14d6b02d3820c8156cd2d7c1f966f5b145638658c04e28c3c00b3d996a352c3"} Mar 20 08:25:17 crc kubenswrapper[4903]: I0320 08:25:17.345785 4903 generic.go:334] "Generic (PLEG): container finished" podID="f20fdd99-f4e0-4435-98ef-7140cc1c9ca4" containerID="9964f1f73aa34437ddbed4fb392ef58c4655d9d6885ee69c9448f23679ada512" exitCode=0 Mar 20 08:25:17 crc kubenswrapper[4903]: I0320 08:25:17.345877 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-fcqtt" event={"ID":"f20fdd99-f4e0-4435-98ef-7140cc1c9ca4","Type":"ContainerDied","Data":"9964f1f73aa34437ddbed4fb392ef58c4655d9d6885ee69c9448f23679ada512"} Mar 20 08:25:17 crc kubenswrapper[4903]: I0320 08:25:17.348643 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-s5fmp" event={"ID":"d528a1a3-29b5-4b67-a9bd-e7102d857426","Type":"ContainerStarted","Data":"3af3b3fb7518adc8be9cb98355c5d8a9a46dea236c49beb327af23d15ab61bcf"} Mar 20 08:25:17 crc kubenswrapper[4903]: I0320 08:25:17.349661 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-cc4f6" event={"ID":"0b0221f5-84f8-47b9-bab3-934c1890fb2f","Type":"ContainerStarted","Data":"b3718dcc2dddd0505c8353d359d56c87c6ccbf2981ff5612da460e1b2d04e8fd"} Mar 20 08:25:17 crc kubenswrapper[4903]: I0320 08:25:17.352335 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xzjq9" event={"ID":"548a9dd8-86b6-4fca-b0f8-29ac9a0edb72","Type":"ContainerStarted","Data":"8e78cb0510469de0a4496aa7639634f9d9953553dc6cad9917745eceeff05a5c"} Mar 20 08:25:17 crc kubenswrapper[4903]: I0320 08:25:17.356332 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-p8nlt" event={"ID":"5a7114dc-66fa-4519-a0b1-83e34b999f9b","Type":"ContainerStarted","Data":"ce95157f131f95cf7f4ff6c7f1ef6a45ccaa4293e7d825c881397f7722b2643d"} Mar 20 08:25:17 crc kubenswrapper[4903]: I0320 08:25:17.358678 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-h4m4s" event={"ID":"f5e9a2fd-a5ab-4db3-9363-e4e94745c6ea","Type":"ContainerStarted","Data":"5198347c957344dd43acf7dfce9028028af1666b0835e4447e857ba5daad5ac6"} Mar 20 08:25:17 crc kubenswrapper[4903]: I0320 08:25:17.359977 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fhkp9" event={"ID":"96e5b0d7-c2bd-4df6-8b65-81c0ddf71995","Type":"ContainerStarted","Data":"683ea695447d8644511187f030c36fbc12732542ae02ce15170dc40f3b8ad04f"} Mar 20 08:25:17 crc kubenswrapper[4903]: I0320 08:25:17.360852 4903 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2q9m5\" (UID: \"5778224c-9b34-45c0-9812-122b95cef431\") " pod="openshift-image-registry/image-registry-697d97f7c8-2q9m5" Mar 20 08:25:17 crc kubenswrapper[4903]: I0320 08:25:17.361735 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-d5tc5" event={"ID":"fe5d5814-39d5-43a9-9352-f47c80603c75","Type":"ContainerStarted","Data":"443b027f92dd8ab3f3cefe2ba8784707d08def0390b46ad54e547e4d885af637"} Mar 20 08:25:17 crc kubenswrapper[4903]: E0320 08:25:17.363387 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 08:25:17.863364849 +0000 UTC m=+143.080265164 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2q9m5" (UID: "5778224c-9b34-45c0-9812-122b95cef431") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 08:25:17 crc kubenswrapper[4903]: I0320 08:25:17.370601 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6bs4d" event={"ID":"a21d7057-af5a-499a-b71f-3cae88b6883e","Type":"ContainerStarted","Data":"f3359995e67a9748b6ec381b04f20f3ab3d6e184d8a99cd1d169a4187f66bd79"} Mar 20 08:25:17 crc kubenswrapper[4903]: I0320 08:25:17.406740 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-sjg9x" Mar 20 08:25:17 crc kubenswrapper[4903]: I0320 08:25:17.461730 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 08:25:17 crc kubenswrapper[4903]: E0320 08:25:17.462884 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 08:25:17.962851645 +0000 UTC m=+143.179751960 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 08:25:17 crc kubenswrapper[4903]: I0320 08:25:17.463143 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2q9m5\" (UID: \"5778224c-9b34-45c0-9812-122b95cef431\") " pod="openshift-image-registry/image-registry-697d97f7c8-2q9m5" Mar 20 08:25:17 crc kubenswrapper[4903]: E0320 08:25:17.463748 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 08:25:17.963738289 +0000 UTC m=+143.180638594 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2q9m5" (UID: "5778224c-9b34-45c0-9812-122b95cef431") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 08:25:17 crc kubenswrapper[4903]: I0320 08:25:17.567846 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 08:25:17 crc kubenswrapper[4903]: E0320 08:25:17.568465 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 08:25:18.068432771 +0000 UTC m=+143.285333086 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 08:25:17 crc kubenswrapper[4903]: I0320 08:25:17.672870 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2q9m5\" (UID: \"5778224c-9b34-45c0-9812-122b95cef431\") " pod="openshift-image-registry/image-registry-697d97f7c8-2q9m5" Mar 20 08:25:17 crc kubenswrapper[4903]: E0320 08:25:17.673345 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 08:25:18.173329078 +0000 UTC m=+143.390229393 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2q9m5" (UID: "5778224c-9b34-45c0-9812-122b95cef431") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 08:25:17 crc kubenswrapper[4903]: W0320 08:25:17.689269 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda9d3e49c_9f8d_4fb8_a007_61378316daa8.slice/crio-578e6fe41221a0d6b6f16fd456fa5d2bba5a57191a30d1c204b5b8671fd4e8bf WatchSource:0}: Error finding container 578e6fe41221a0d6b6f16fd456fa5d2bba5a57191a30d1c204b5b8671fd4e8bf: Status 404 returned error can't find the container with id 578e6fe41221a0d6b6f16fd456fa5d2bba5a57191a30d1c204b5b8671fd4e8bf Mar 20 08:25:17 crc kubenswrapper[4903]: I0320 08:25:17.755002 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=1.754975554 podStartE2EDuration="1.754975554s" podCreationTimestamp="2026-03-20 08:25:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:25:17.753025429 +0000 UTC m=+142.969925764" watchObservedRunningTime="2026-03-20 08:25:17.754975554 +0000 UTC m=+142.971875869" Mar 20 08:25:17 crc kubenswrapper[4903]: I0320 08:25:17.774497 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 08:25:17 crc kubenswrapper[4903]: E0320 08:25:17.775095 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-20 08:25:18.275071747 +0000 UTC m=+143.491972052 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 08:25:17 crc kubenswrapper[4903]: I0320 08:25:17.894381 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2q9m5\" (UID: \"5778224c-9b34-45c0-9812-122b95cef431\") " pod="openshift-image-registry/image-registry-697d97f7c8-2q9m5" Mar 20 08:25:17 crc kubenswrapper[4903]: E0320 08:25:17.895297 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 08:25:18.395277903 +0000 UTC m=+143.612178218 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2q9m5" (UID: "5778224c-9b34-45c0-9812-122b95cef431") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 08:25:17 crc kubenswrapper[4903]: I0320 08:25:17.900866 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qgf92" podStartSLOduration=80.900841579 podStartE2EDuration="1m20.900841579s" podCreationTimestamp="2026-03-20 08:23:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:25:17.880778156 +0000 UTC m=+143.097678471" watchObservedRunningTime="2026-03-20 08:25:17.900841579 +0000 UTC m=+143.117741894" Mar 20 08:25:17 crc kubenswrapper[4903]: I0320 08:25:17.930616 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-qq7hw" podStartSLOduration=81.930593342 podStartE2EDuration="1m21.930593342s" podCreationTimestamp="2026-03-20 08:23:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:25:17.928895044 +0000 UTC m=+143.145795359" watchObservedRunningTime="2026-03-20 08:25:17.930593342 +0000 UTC m=+143.147493657" Mar 20 08:25:18 crc kubenswrapper[4903]: I0320 08:25:18.006309 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 08:25:18 crc kubenswrapper[4903]: E0320 08:25:18.006897 4903 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 08:25:18.506873207 +0000 UTC m=+143.723773532 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 08:25:18 crc kubenswrapper[4903]: I0320 08:25:18.098949 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-mx9nb" podStartSLOduration=81.098928025 podStartE2EDuration="1m21.098928025s" podCreationTimestamp="2026-03-20 08:23:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:25:18.098419531 +0000 UTC m=+143.315319856" watchObservedRunningTime="2026-03-20 08:25:18.098928025 +0000 UTC m=+143.315828340" Mar 20 08:25:18 crc kubenswrapper[4903]: I0320 08:25:18.108650 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2q9m5\" (UID: \"5778224c-9b34-45c0-9812-122b95cef431\") " pod="openshift-image-registry/image-registry-697d97f7c8-2q9m5" Mar 20 08:25:18 crc kubenswrapper[4903]: E0320 08:25:18.109923 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 08:25:18.609887362 +0000 UTC m=+143.826787677 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2q9m5" (UID: "5778224c-9b34-45c0-9812-122b95cef431") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 08:25:18 crc kubenswrapper[4903]: I0320 08:25:18.209591 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 08:25:18 crc kubenswrapper[4903]: E0320 08:25:18.210387 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 08:25:18.710350985 +0000 UTC m=+143.927251300 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 08:25:18 crc kubenswrapper[4903]: I0320 08:25:18.312229 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2q9m5\" (UID: \"5778224c-9b34-45c0-9812-122b95cef431\") " pod="openshift-image-registry/image-registry-697d97f7c8-2q9m5" Mar 20 08:25:18 crc kubenswrapper[4903]: E0320 08:25:18.312555 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 08:25:18.812542507 +0000 UTC m=+144.029442822 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2q9m5" (UID: "5778224c-9b34-45c0-9812-122b95cef431") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 08:25:18 crc kubenswrapper[4903]: I0320 08:25:18.421243 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 08:25:18 crc kubenswrapper[4903]: E0320 08:25:18.421659 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 08:25:18.921641801 +0000 UTC m=+144.138542116 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 08:25:18 crc kubenswrapper[4903]: I0320 08:25:18.422572 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-f47pf" podStartSLOduration=81.422549256 podStartE2EDuration="1m21.422549256s" podCreationTimestamp="2026-03-20 08:23:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:25:18.420368596 +0000 UTC m=+143.637268911" watchObservedRunningTime="2026-03-20 08:25:18.422549256 +0000 UTC m=+143.639449571" Mar 20 08:25:18 crc kubenswrapper[4903]: I0320 08:25:18.427683 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-h4m4s" event={"ID":"f5e9a2fd-a5ab-4db3-9363-e4e94745c6ea","Type":"ContainerStarted","Data":"2d1f32e81e75520ed2f0204f266642cc821238ba59691faa6f7278ed8f9124dd"} Mar 20 08:25:18 crc kubenswrapper[4903]: I0320 08:25:18.446694 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fhkp9" event={"ID":"96e5b0d7-c2bd-4df6-8b65-81c0ddf71995","Type":"ContainerStarted","Data":"344d11320b0dcf48600bf8dd0d69e1005a520f20dd4d9a1c85263df0504bcc2e"} Mar 20 08:25:18 crc kubenswrapper[4903]: I0320 08:25:18.447321 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fhkp9" Mar 20 08:25:18 crc kubenswrapper[4903]: I0320 08:25:18.448132 4903 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-fhkp9 container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.27:5443/healthz\": dial tcp 10.217.0.27:5443: connect: connection refused" start-of-body= Mar 20 08:25:18 crc kubenswrapper[4903]: I0320 08:25:18.448166 4903 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fhkp9" podUID="96e5b0d7-c2bd-4df6-8b65-81c0ddf71995" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.27:5443/healthz\": dial tcp 10.217.0.27:5443: connect: connection refused" Mar 20 08:25:18 crc kubenswrapper[4903]: I0320 08:25:18.449446 4903 generic.go:334] "Generic (PLEG): container finished" podID="fe5d5814-39d5-43a9-9352-f47c80603c75" containerID="061faa452d915549b71d1126efedbca91cad2adcbefcd36ccc9f2aa96cdd5efc" exitCode=0 Mar 20 08:25:18 crc kubenswrapper[4903]: I0320 08:25:18.449490 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-d5tc5" event={"ID":"fe5d5814-39d5-43a9-9352-f47c80603c75","Type":"ContainerDied","Data":"061faa452d915549b71d1126efedbca91cad2adcbefcd36ccc9f2aa96cdd5efc"} Mar 20 08:25:18 crc kubenswrapper[4903]: I0320 08:25:18.509153 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-s5fmp" 
event={"ID":"d528a1a3-29b5-4b67-a9bd-e7102d857426","Type":"ContainerStarted","Data":"ac63957c5efd8378874b2c6e1788befee56e2cc9d9d7355183889511eac51da7"} Mar 20 08:25:18 crc kubenswrapper[4903]: I0320 08:25:18.523310 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2q9m5\" (UID: \"5778224c-9b34-45c0-9812-122b95cef431\") " pod="openshift-image-registry/image-registry-697d97f7c8-2q9m5" Mar 20 08:25:18 crc kubenswrapper[4903]: I0320 08:25:18.543137 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xzjq9" event={"ID":"548a9dd8-86b6-4fca-b0f8-29ac9a0edb72","Type":"ContainerStarted","Data":"677850ad4df508d73491023df5960711f92299e6409305fde1ba8575299786ba"} Mar 20 08:25:18 crc kubenswrapper[4903]: E0320 08:25:18.543477 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 08:25:19.043454572 +0000 UTC m=+144.260354887 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2q9m5" (UID: "5778224c-9b34-45c0-9812-122b95cef431") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 08:25:18 crc kubenswrapper[4903]: I0320 08:25:18.549795 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-p8nlt" event={"ID":"5a7114dc-66fa-4519-a0b1-83e34b999f9b","Type":"ContainerStarted","Data":"ad1c7a9ce750917e5f9ce9ecb40610b213df25e4d10e414703265f2b78ea6b62"} Mar 20 08:25:18 crc kubenswrapper[4903]: I0320 08:25:18.574884 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6bs4d" event={"ID":"a21d7057-af5a-499a-b71f-3cae88b6883e","Type":"ContainerStarted","Data":"bcca73e2164d5fb033e0c7e0fd42f509f1de1aa3bffad8d68a6f2d89b3abfab2"} Mar 20 08:25:18 crc kubenswrapper[4903]: I0320 08:25:18.609823 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-6smwd" event={"ID":"a9d3e49c-9f8d-4fb8-a007-61378316daa8","Type":"ContainerStarted","Data":"e632ffcd1e1fc584870d861c3e1ea98f70af21998f9e3cd084212d09ffc7c518"} Mar 20 08:25:18 crc kubenswrapper[4903]: I0320 08:25:18.609909 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-6smwd" event={"ID":"a9d3e49c-9f8d-4fb8-a007-61378316daa8","Type":"ContainerStarted","Data":"578e6fe41221a0d6b6f16fd456fa5d2bba5a57191a30d1c204b5b8671fd4e8bf"} Mar 20 08:25:18 crc kubenswrapper[4903]: I0320 08:25:18.614747 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-fcqtt" event={"ID":"f20fdd99-f4e0-4435-98ef-7140cc1c9ca4","Type":"ContainerStarted","Data":"dc28259bef3fd89713cf68185532209f05aa405dc5189e35c04157408cdecccc"} Mar 20 08:25:18 crc kubenswrapper[4903]: I0320 08:25:18.627375 4903 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-cc4f6" event={"ID":"0b0221f5-84f8-47b9-bab3-934c1890fb2f","Type":"ContainerStarted","Data":"2be9a1767246c9af1813c5cdad78445fbdb3b51b7cbd0d36812e6c073022458b"} Mar 20 08:25:18 crc kubenswrapper[4903]: I0320 08:25:18.628252 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 08:25:18 crc kubenswrapper[4903]: E0320 08:25:18.630925 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 08:25:19.130899461 +0000 UTC m=+144.347799776 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 08:25:18 crc kubenswrapper[4903]: I0320 08:25:18.655458 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-spjl5" event={"ID":"433bd27c-a67a-4487-b09e-523fd9b34b8f","Type":"ContainerStarted","Data":"61ad0ea74b98989627f624305ad7c0747e9d1d6db3ce941d9a353588a1af634c"} Mar 20 08:25:18 crc kubenswrapper[4903]: I0320 08:25:18.657922 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-multus/cni-sysctl-allowlist-ds-spjl5" Mar 20 08:25:18 crc kubenswrapper[4903]: I0320 08:25:18.678951 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-s5fmp" podStartSLOduration=82.678929315 podStartE2EDuration="1m22.678929315s" podCreationTimestamp="2026-03-20 08:23:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:25:18.655443087 +0000 UTC m=+143.872343402" watchObservedRunningTime="2026-03-20 08:25:18.678929315 +0000 UTC m=+143.895829630" Mar 20 08:25:18 crc kubenswrapper[4903]: I0320 08:25:18.680193 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-6smwd" podStartSLOduration=5.6801873910000005 podStartE2EDuration="5.680187391s" podCreationTimestamp="2026-03-20 08:25:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:25:18.678133834 +0000 UTC m=+143.895034169" watchObservedRunningTime="2026-03-20 08:25:18.680187391 +0000 UTC m=+143.897087706" Mar 20 08:25:18 crc kubenswrapper[4903]: I0320 08:25:18.723196 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6bs4d" podStartSLOduration=81.723170484 podStartE2EDuration="1m21.723170484s" podCreationTimestamp="2026-03-20 08:23:57 +0000 
UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:25:18.72158531 +0000 UTC m=+143.938485625" watchObservedRunningTime="2026-03-20 08:25:18.723170484 +0000 UTC m=+143.940070799" Mar 20 08:25:18 crc kubenswrapper[4903]: I0320 08:25:18.730372 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2q9m5\" (UID: \"5778224c-9b34-45c0-9812-122b95cef431\") " pod="openshift-image-registry/image-registry-697d97f7c8-2q9m5" Mar 20 08:25:18 crc kubenswrapper[4903]: E0320 08:25:18.733157 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 08:25:19.233130913 +0000 UTC m=+144.450031408 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2q9m5" (UID: "5778224c-9b34-45c0-9812-122b95cef431") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 08:25:18 crc kubenswrapper[4903]: I0320 08:25:18.756829 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-h4m4s" podStartSLOduration=82.756809136 podStartE2EDuration="1m22.756809136s" podCreationTimestamp="2026-03-20 08:23:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:25:18.755569782 +0000 UTC m=+143.972470097" watchObservedRunningTime="2026-03-20 08:25:18.756809136 +0000 UTC m=+143.973709441" Mar 20 08:25:18 crc kubenswrapper[4903]: I0320 08:25:18.803493 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xzjq9" podStartSLOduration=82.803282617 podStartE2EDuration="1m22.803282617s" podCreationTimestamp="2026-03-20 08:23:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:25:18.802509996 +0000 UTC m=+144.019410311" watchObservedRunningTime="2026-03-20 08:25:18.803282617 +0000 UTC m=+144.020182932" Mar 20 08:25:18 crc kubenswrapper[4903]: I0320 08:25:18.831737 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 08:25:18 crc kubenswrapper[4903]: E0320 08:25:18.832208 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 08:25:19.332187147 +0000 UTC m=+144.549087462 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 08:25:18 crc kubenswrapper[4903]: I0320 08:25:18.874820 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-p8nlt" podStartSLOduration=81.87479456 podStartE2EDuration="1m21.87479456s" podCreationTimestamp="2026-03-20 08:23:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:25:18.874321797 +0000 UTC m=+144.091222112" watchObservedRunningTime="2026-03-20 08:25:18.87479456 +0000 UTC m=+144.091694875" Mar 20 08:25:18 crc kubenswrapper[4903]: I0320 08:25:18.921372 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fhkp9" podStartSLOduration=81.921344773 podStartE2EDuration="1m21.921344773s" podCreationTimestamp="2026-03-20 08:23:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:25:18.919119211 +0000 UTC m=+144.136019526" watchObservedRunningTime="2026-03-20 08:25:18.921344773 +0000 UTC m=+144.138245088" Mar 20 08:25:18 crc kubenswrapper[4903]: I0320 08:25:18.935605 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2q9m5\" (UID: \"5778224c-9b34-45c0-9812-122b95cef431\") " pod="openshift-image-registry/image-registry-697d97f7c8-2q9m5" Mar 20 08:25:18 crc kubenswrapper[4903]: E0320 08:25:18.936107 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 08:25:19.436078186 +0000 UTC m=+144.652978501 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2q9m5" (UID: "5778224c-9b34-45c0-9812-122b95cef431") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 08:25:18 crc kubenswrapper[4903]: I0320 08:25:18.988486 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-cc4f6" podStartSLOduration=81.988467053 podStartE2EDuration="1m21.988467053s" podCreationTimestamp="2026-03-20 08:23:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:25:18.960499729 +0000 UTC m=+144.177400044" watchObservedRunningTime="2026-03-20 08:25:18.988467053 +0000 UTC m=+144.205367368" Mar 20 08:25:18 crc kubenswrapper[4903]: I0320 08:25:18.992860 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-687pt"] Mar 20 08:25:19 crc kubenswrapper[4903]: I0320 08:25:19.001946 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/cni-sysctl-allowlist-ds-spjl5" podStartSLOduration=6.00191846 podStartE2EDuration="6.00191846s" podCreationTimestamp="2026-03-20 08:25:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:25:19.000470739 +0000 UTC m=+144.217371064" watchObservedRunningTime="2026-03-20 08:25:19.00191846 +0000 UTC m=+144.218818775" Mar 20 08:25:19 crc kubenswrapper[4903]: W0320 08:25:19.016628 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0568b2dd_f5ca_44ae_992d_1a5ed2e998fb.slice/crio-c19b99b0393ba810baaa77eceec4afa882f42fd461af0640ad8bc805032f2037 WatchSource:0}: Error finding container c19b99b0393ba810baaa77eceec4afa882f42fd461af0640ad8bc805032f2037: Status 404 returned error can't find the container with id c19b99b0393ba810baaa77eceec4afa882f42fd461af0640ad8bc805032f2037 Mar 20 08:25:19 crc kubenswrapper[4903]: I0320 08:25:19.020839 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-mmtf2"] Mar 20 08:25:19 crc kubenswrapper[4903]: I0320 08:25:19.036560 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 08:25:19 crc kubenswrapper[4903]: E0320 08:25:19.037084 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 08:25:19.537062914 +0000 UTC m=+144.753963229 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 08:25:19 crc kubenswrapper[4903]: I0320 08:25:19.046150 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-9wjwh"] Mar 20 08:25:19 crc kubenswrapper[4903]: I0320 08:25:19.061721 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-h2jwr"] Mar 20 08:25:19 crc kubenswrapper[4903]: I0320 08:25:19.111215 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2hc2d"] Mar 20 08:25:19 crc kubenswrapper[4903]: I0320 08:25:19.141898 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2q9m5\" (UID: \"5778224c-9b34-45c0-9812-122b95cef431\") " pod="openshift-image-registry/image-registry-697d97f7c8-2q9m5" Mar 20 08:25:19 crc kubenswrapper[4903]: E0320 08:25:19.142552 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 08:25:19.642535197 +0000 UTC m=+144.859435512 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2q9m5" (UID: "5778224c-9b34-45c0-9812-122b95cef431") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 08:25:19 crc kubenswrapper[4903]: I0320 08:25:19.174656 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-pnnxx"] Mar 20 08:25:19 crc kubenswrapper[4903]: I0320 08:25:19.250512 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-622vk"] Mar 20 08:25:19 crc kubenswrapper[4903]: I0320 08:25:19.251665 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 08:25:19 crc kubenswrapper[4903]: E0320 08:25:19.252289 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 08:25:19.75226991 +0000 UTC m=+144.969170225 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 08:25:19 crc kubenswrapper[4903]: I0320 08:25:19.302267 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7gchs"] Mar 20 08:25:19 crc kubenswrapper[4903]: W0320 08:25:19.322466 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3211e594_bfe4_4eaf_ba13_48dbd7c8cf5e.slice/crio-ba6b35e3e51b010a76e2fe9abd43dc3315920b1fcdf0eba8aa59184104d833ff WatchSource:0}: Error finding container ba6b35e3e51b010a76e2fe9abd43dc3315920b1fcdf0eba8aa59184104d833ff: Status 404 returned error can't find the container with id ba6b35e3e51b010a76e2fe9abd43dc3315920b1fcdf0eba8aa59184104d833ff Mar 20 08:25:19 crc kubenswrapper[4903]: I0320 08:25:19.328100 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-n4g5v"] Mar 20 08:25:19 crc kubenswrapper[4903]: I0320 08:25:19.344943 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-x8c9s"] Mar 20 08:25:19 crc kubenswrapper[4903]: I0320 08:25:19.371014 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9hf5p"] Mar 20 08:25:19 crc kubenswrapper[4903]: I0320 08:25:19.391960 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-5dpx6"] Mar 20 08:25:19 crc kubenswrapper[4903]: I0320 08:25:19.372290 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2q9m5\" (UID: \"5778224c-9b34-45c0-9812-122b95cef431\") " pod="openshift-image-registry/image-registry-697d97f7c8-2q9m5" Mar 20 08:25:19 crc kubenswrapper[4903]: E0320 08:25:19.371703 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 08:25:19.871684533 +0000 UTC m=+145.088584848 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2q9m5" (UID: "5778224c-9b34-45c0-9812-122b95cef431") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 08:25:19 crc kubenswrapper[4903]: I0320 08:25:19.445466 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-cc4f6" Mar 20 08:25:19 crc kubenswrapper[4903]: I0320 08:25:19.462926 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-5pjw9"] Mar 20 08:25:19 crc kubenswrapper[4903]: I0320 08:25:19.473710 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-t7qmg"] Mar 20 08:25:19 crc kubenswrapper[4903]: I0320 08:25:19.473780 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-5v8xb"] Mar 20 08:25:19 crc kubenswrapper[4903]: I0320 08:25:19.473792 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-pjn9j"] Mar 20 08:25:19 crc kubenswrapper[4903]: I0320 08:25:19.475698 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-skg8s"] Mar 20 08:25:19 crc kubenswrapper[4903]: I0320 08:25:19.481441 4903 patch_prober.go:28] interesting pod/router-default-5444994796-cc4f6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 08:25:19 crc kubenswrapper[4903]: [-]has-synced failed: reason withheld Mar 20 08:25:19 crc kubenswrapper[4903]: [+]process-running ok Mar 20 08:25:19 crc kubenswrapper[4903]: healthz check failed Mar 20 08:25:19 crc kubenswrapper[4903]: I0320 08:25:19.481553 4903 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-cc4f6" podUID="0b0221f5-84f8-47b9-bab3-934c1890fb2f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 08:25:19 crc kubenswrapper[4903]: W0320 08:25:19.489956 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb9b6ad1a_4ed6_4df4_8f27_a8eb8d6e8511.slice/crio-2ede381a991a3226e4cbaff5e7d336925e58727a99bb806e3d0b5f819dd17cd4 WatchSource:0}: Error finding container 2ede381a991a3226e4cbaff5e7d336925e58727a99bb806e3d0b5f819dd17cd4: Status 404 returned error can't find the container with id 2ede381a991a3226e4cbaff5e7d336925e58727a99bb806e3d0b5f819dd17cd4 Mar 20 08:25:19 crc kubenswrapper[4903]: I0320 08:25:19.493497 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 08:25:19 crc kubenswrapper[4903]: E0320 08:25:19.493999 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 08:25:19.993978057 +0000 UTC m=+145.210878372 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 08:25:19 crc kubenswrapper[4903]: I0320 08:25:19.520405 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-qqtv5"] Mar 20 08:25:19 crc kubenswrapper[4903]: I0320 08:25:19.588742 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qgf92" Mar 20 08:25:19 crc kubenswrapper[4903]: I0320 08:25:19.588803 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qgf92" Mar 20 08:25:19 crc kubenswrapper[4903]: I0320 08:25:19.606861 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2q9m5\" (UID: \"5778224c-9b34-45c0-9812-122b95cef431\") " pod="openshift-image-registry/image-registry-697d97f7c8-2q9m5" Mar 20 08:25:19 crc kubenswrapper[4903]: E0320 08:25:19.607364 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 08:25:20.107348852 +0000 UTC m=+145.324249167 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2q9m5" (UID: "5778224c-9b34-45c0-9812-122b95cef431") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 08:25:19 crc kubenswrapper[4903]: I0320 08:25:19.616871 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qgf92" Mar 20 08:25:19 crc kubenswrapper[4903]: I0320 08:25:19.617809 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-k27wn"] Mar 20 08:25:19 crc kubenswrapper[4903]: W0320 08:25:19.618881 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5bfc26b7_6752_4984_90fa_ed0099d0627c.slice/crio-1f33cda93ba9ea6f59c4ad6c618056740c682b6e26719e6eb229ee77fdb494ff WatchSource:0}: Error finding container 1f33cda93ba9ea6f59c4ad6c618056740c682b6e26719e6eb229ee77fdb494ff: Status 404 returned error can't find the container with id 1f33cda93ba9ea6f59c4ad6c618056740c682b6e26719e6eb229ee77fdb494ff Mar 20 08:25:19 crc kubenswrapper[4903]: I0320 08:25:19.641830 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566575-t4mvb"] Mar 20 08:25:19 crc kubenswrapper[4903]: W0320 08:25:19.683079 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2c53f142_6367_49fd_8577_6141d9f82f73.slice/crio-fcdd20bc8b2d982dde49332a4f34a4001d98bd9e5cb7c0b1aae5e99a37238567 WatchSource:0}: Error finding container fcdd20bc8b2d982dde49332a4f34a4001d98bd9e5cb7c0b1aae5e99a37238567: Status 404 returned error can't find the container with id fcdd20bc8b2d982dde49332a4f34a4001d98bd9e5cb7c0b1aae5e99a37238567 Mar 20 08:25:19 crc kubenswrapper[4903]: I0320 08:25:19.709864 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 08:25:19 crc kubenswrapper[4903]: E0320 08:25:19.711560 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 08:25:20.21153338 +0000 UTC m=+145.428433695 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 08:25:19 crc kubenswrapper[4903]: I0320 08:25:19.711848 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-sjg9x"] Mar 20 08:25:19 crc kubenswrapper[4903]: I0320 08:25:19.721786 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-sq6kr"] Mar 20 08:25:19 crc kubenswrapper[4903]: I0320 08:25:19.722768 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-fcqtt" event={"ID":"f20fdd99-f4e0-4435-98ef-7140cc1c9ca4","Type":"ContainerStarted","Data":"3715337c04bbd4d0a100d322b7ed1ee2a6e30aea702f6a64b738e847eaa96eaf"} Mar 20 08:25:19 crc kubenswrapper[4903]: I0320 08:25:19.728372 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-lw2n7"] Mar 20 08:25:19 crc kubenswrapper[4903]: I0320 08:25:19.730495 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9hf5p" event={"ID":"b9b6ad1a-4ed6-4df4-8f27-a8eb8d6e8511","Type":"ContainerStarted","Data":"2ede381a991a3226e4cbaff5e7d336925e58727a99bb806e3d0b5f819dd17cd4"} Mar 20 08:25:19 crc kubenswrapper[4903]: I0320 08:25:19.740157 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7gchs" event={"ID":"21c3415f-b025-475e-a603-76329416df23","Type":"ContainerStarted","Data":"c4073327f1c67de36847f18b9aa2cc7709da710776f335a99377de9ec327080c"} Mar 20 08:25:19 crc kubenswrapper[4903]: I0320 08:25:19.741898 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-9zwz2"] Mar 20 08:25:19 crc kubenswrapper[4903]: I0320 08:25:19.751261 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-zbxgq"] Mar 20 08:25:19 crc kubenswrapper[4903]: I0320 08:25:19.758124 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-4zl69"] Mar 20 08:25:19 crc kubenswrapper[4903]: I0320 08:25:19.764843 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-spjl5" event={"ID":"433bd27c-a67a-4487-b09e-523fd9b34b8f","Type":"ContainerStarted","Data":"1ba01392693cb27f963e41acb1cb1af6da61169459877cc07eb4972d80391867"} Mar 20 08:25:19 crc kubenswrapper[4903]: I0320 08:25:19.777830 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2hc2d" event={"ID":"9b45a827-576d-4fbb-87f4-4ae4d8af634e","Type":"ContainerStarted","Data":"ff4eb6543b97edeefa3a188df73e944d80f02aca13d5f030c53e00075ede9703"} Mar 20 08:25:19 crc kubenswrapper[4903]: I0320 08:25:19.777885 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2hc2d" 
event={"ID":"9b45a827-576d-4fbb-87f4-4ae4d8af634e","Type":"ContainerStarted","Data":"3a55783d9824d66fe0ce930f418f6340e22ff57af43956a01392aab3fa839cea"} Mar 20 08:25:19 crc kubenswrapper[4903]: I0320 08:25:19.780537 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-9wjwh" event={"ID":"6534f377-ba61-4952-b722-1dcfe94df2ea","Type":"ContainerStarted","Data":"e97682dec1fef5b9cbf31be7b8d856d1842497af550774cff829487f1771944f"} Mar 20 08:25:19 crc kubenswrapper[4903]: I0320 08:25:19.785796 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-d5tc5" event={"ID":"fe5d5814-39d5-43a9-9352-f47c80603c75","Type":"ContainerStarted","Data":"4685b4a63ea3fcaea8cdf43ed8fa2877ea3a7c755a6e0f3061ac4555cd07b37b"} Mar 20 08:25:19 crc kubenswrapper[4903]: I0320 08:25:19.786175 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-d5tc5" Mar 20 08:25:19 crc kubenswrapper[4903]: I0320 08:25:19.787321 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-pjn9j" event={"ID":"5bfc26b7-6752-4984-90fa-ed0099d0627c","Type":"ContainerStarted","Data":"1f33cda93ba9ea6f59c4ad6c618056740c682b6e26719e6eb229ee77fdb494ff"} Mar 20 08:25:19 crc kubenswrapper[4903]: I0320 08:25:19.788444 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-5dpx6" event={"ID":"82c317fb-5d87-47b3-849c-58b0bab4d3ef","Type":"ContainerStarted","Data":"ff8ab62dfbbb5883b56be3fa4ade580d22532bfa4158fdd5493bf5701843757e"} Mar 20 08:25:19 crc kubenswrapper[4903]: I0320 08:25:19.790023 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-687pt" event={"ID":"0568b2dd-f5ca-44ae-992d-1a5ed2e998fb","Type":"ContainerStarted","Data":"c19b99b0393ba810baaa77eceec4afa882f42fd461af0640ad8bc805032f2037"} Mar 20 08:25:19 crc kubenswrapper[4903]: I0320 08:25:19.801237 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-h2jwr" event={"ID":"0b2ccb33-e3d5-46b8-863a-21ec88f7fb49","Type":"ContainerStarted","Data":"5e03236a626bd89723d3c33f5e0a6224922d46d8b2f5ecaa079b8ed8d5a5ab23"} Mar 20 08:25:19 crc kubenswrapper[4903]: I0320 08:25:19.801329 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-h2jwr" event={"ID":"0b2ccb33-e3d5-46b8-863a-21ec88f7fb49","Type":"ContainerStarted","Data":"d60ec3f99077693bce3262da7b1f2d0051be5ddaafa1c8d4fe0cce45974235ce"} Mar 20 08:25:19 crc kubenswrapper[4903]: I0320 08:25:19.804133 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2hc2d" podStartSLOduration=83.804111772 podStartE2EDuration="1m23.804111772s" podCreationTimestamp="2026-03-20 08:23:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:25:19.801493958 +0000 UTC m=+145.018394273" watchObservedRunningTime="2026-03-20 08:25:19.804111772 +0000 UTC m=+145.021012077" Mar 20 08:25:19 crc kubenswrapper[4903]: I0320 08:25:19.804344 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-fcqtt" 
podStartSLOduration=83.804338488 podStartE2EDuration="1m23.804338488s" podCreationTimestamp="2026-03-20 08:23:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:25:19.763858794 +0000 UTC m=+144.980759109" watchObservedRunningTime="2026-03-20 08:25:19.804338488 +0000 UTC m=+145.021238803" Mar 20 08:25:19 crc kubenswrapper[4903]: I0320 08:25:19.811222 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-5pjw9" event={"ID":"ca2849ec-27bf-41e4-b520-0ffc80e33e99","Type":"ContainerStarted","Data":"947b77cd26780d0714f875e2cf34d545f7f89169028f163f9e837b8ea7c37b9d"} Mar 20 08:25:19 crc kubenswrapper[4903]: I0320 08:25:19.812161 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2q9m5\" (UID: \"5778224c-9b34-45c0-9812-122b95cef431\") " pod="openshift-image-registry/image-registry-697d97f7c8-2q9m5" Mar 20 08:25:19 crc kubenswrapper[4903]: E0320 08:25:19.813320 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 08:25:20.313306419 +0000 UTC m=+145.530206734 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2q9m5" (UID: "5778224c-9b34-45c0-9812-122b95cef431") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 08:25:19 crc kubenswrapper[4903]: I0320 08:25:19.832646 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-multus/cni-sysctl-allowlist-ds-spjl5" Mar 20 08:25:19 crc kubenswrapper[4903]: I0320 08:25:19.846599 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-pnnxx" event={"ID":"e9e65e64-f0de-46b5-a383-ccb51e989ba1","Type":"ContainerStarted","Data":"90231d7422fa616c767e46e7e9a57834a23b59e80e45c063e6108a80a1a25052"} Mar 20 08:25:19 crc kubenswrapper[4903]: I0320 08:25:19.856344 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-d5tc5" podStartSLOduration=83.856319993 podStartE2EDuration="1m23.856319993s" podCreationTimestamp="2026-03-20 08:23:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:25:19.847180818 +0000 UTC m=+145.064081143" watchObservedRunningTime="2026-03-20 08:25:19.856319993 +0000 UTC m=+145.073220298" Mar 20 08:25:19 crc kubenswrapper[4903]: I0320 08:25:19.856744 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-vc9fr"] Mar 20 08:25:19 crc kubenswrapper[4903]: I0320 08:25:19.867849 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-mmtf2" 
event={"ID":"1e7c0553-965b-49eb-8bcd-ad7e2096373c","Type":"ContainerStarted","Data":"427d6a83d47aef3ba886dbdd4149acb211d0168a4632d2225cca589c96184c26"} Mar 20 08:25:19 crc kubenswrapper[4903]: I0320 08:25:19.867896 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-mmtf2" event={"ID":"1e7c0553-965b-49eb-8bcd-ad7e2096373c","Type":"ContainerStarted","Data":"e0cb78333e81a44a3a3c46e9424c2aff08589b32e88133207a2b057c23bcc082"} Mar 20 08:25:19 crc kubenswrapper[4903]: I0320 08:25:19.918236 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 08:25:19 crc kubenswrapper[4903]: E0320 08:25:19.918330 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 08:25:20.418296049 +0000 UTC m=+145.635196364 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 08:25:19 crc kubenswrapper[4903]: I0320 08:25:19.918628 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-622vk" event={"ID":"211b0d80-1cec-40f5-8679-91466edeb1a1","Type":"ContainerStarted","Data":"21550cce00ff2bf8cafdb921988931f93083428bc1e6fd486a028c8d694fee27"} Mar 20 08:25:19 crc kubenswrapper[4903]: I0320 08:25:19.919445 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-622vk" event={"ID":"211b0d80-1cec-40f5-8679-91466edeb1a1","Type":"ContainerStarted","Data":"e50111d4e05c1091331b79708bb48b284df1fce740f96b00193f4309262b1efe"} Mar 20 08:25:19 crc kubenswrapper[4903]: I0320 08:25:19.920115 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2q9m5\" (UID: \"5778224c-9b34-45c0-9812-122b95cef431\") " pod="openshift-image-registry/image-registry-697d97f7c8-2q9m5" Mar 20 08:25:19 crc kubenswrapper[4903]: E0320 08:25:19.920734 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 08:25:20.420681846 +0000 UTC m=+145.637582161 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2q9m5" (UID: "5778224c-9b34-45c0-9812-122b95cef431") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 08:25:19 crc kubenswrapper[4903]: I0320 08:25:19.937778 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-x8c9s" event={"ID":"3211e594-bfe4-4eaf-ba13-48dbd7c8cf5e","Type":"ContainerStarted","Data":"ba6b35e3e51b010a76e2fe9abd43dc3315920b1fcdf0eba8aa59184104d833ff"} Mar 20 08:25:19 crc kubenswrapper[4903]: I0320 08:25:19.945393 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-n4g5v" event={"ID":"7c273a7a-00f1-4e28-a9a6-69d240f2df29","Type":"ContainerStarted","Data":"4081a27369613f559287527e19982a74480066981904f089769b39a3e52d33c4"} Mar 20 08:25:19 crc kubenswrapper[4903]: I0320 08:25:19.970823 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qqtv5" event={"ID":"80577d9f-8d2f-4cd9-8a6b-305ec6e2a612","Type":"ContainerStarted","Data":"b991288bb99d58ca4894f4bd4a0b72b4a159e83f077337e3ed433a493284a976"} Mar 20 08:25:19 crc kubenswrapper[4903]: I0320 08:25:19.972267 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-622vk" podStartSLOduration=82.972241839 podStartE2EDuration="1m22.972241839s" podCreationTimestamp="2026-03-20 08:23:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:25:19.941450357 +0000 UTC m=+145.158350672" watchObservedRunningTime="2026-03-20 08:25:19.972241839 +0000 UTC m=+145.189142154" Mar 20 08:25:20 crc kubenswrapper[4903]: I0320 08:25:20.020420 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-t7qmg" event={"ID":"6a89ca6c-6d44-4fa5-b728-09aa58890f99","Type":"ContainerStarted","Data":"4c76356e1c78873eeb4520f85995a1d7831145f80ad9673bcc257674ba9bc504"} Mar 20 08:25:20 crc kubenswrapper[4903]: I0320 08:25:20.022699 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 08:25:20 crc kubenswrapper[4903]: E0320 08:25:20.024242 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 08:25:20.524215695 +0000 UTC m=+145.741116000 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 08:25:20 crc kubenswrapper[4903]: I0320 08:25:20.028785 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6bs4d" Mar 20 08:25:20 crc kubenswrapper[4903]: I0320 08:25:20.038834 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6bs4d" Mar 20 08:25:20 crc kubenswrapper[4903]: I0320 08:25:20.048158 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qgf92" Mar 20 08:25:20 crc kubenswrapper[4903]: I0320 08:25:20.050963 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fhkp9" Mar 20 08:25:20 crc kubenswrapper[4903]: I0320 08:25:20.130363 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2q9m5\" (UID: \"5778224c-9b34-45c0-9812-122b95cef431\") " pod="openshift-image-registry/image-registry-697d97f7c8-2q9m5" Mar 20 08:25:20 crc kubenswrapper[4903]: E0320 08:25:20.136307 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 08:25:20.636281683 +0000 UTC m=+145.853182068 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2q9m5" (UID: "5778224c-9b34-45c0-9812-122b95cef431") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 08:25:20 crc kubenswrapper[4903]: I0320 08:25:20.239603 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 08:25:20 crc kubenswrapper[4903]: E0320 08:25:20.239821 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 08:25:20.739783171 +0000 UTC m=+145.956683486 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 08:25:20 crc kubenswrapper[4903]: I0320 08:25:20.239877 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2q9m5\" (UID: \"5778224c-9b34-45c0-9812-122b95cef431\") " pod="openshift-image-registry/image-registry-697d97f7c8-2q9m5" Mar 20 08:25:20 crc kubenswrapper[4903]: E0320 08:25:20.240630 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 08:25:20.740622065 +0000 UTC m=+145.957522370 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2q9m5" (UID: "5778224c-9b34-45c0-9812-122b95cef431") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 08:25:20 crc kubenswrapper[4903]: I0320 08:25:20.344281 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 08:25:20 crc kubenswrapper[4903]: E0320 08:25:20.344674 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 08:25:20.844656277 +0000 UTC m=+146.061556592 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 08:25:20 crc kubenswrapper[4903]: I0320 08:25:20.444306 4903 patch_prober.go:28] interesting pod/router-default-5444994796-cc4f6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 08:25:20 crc kubenswrapper[4903]: [-]has-synced failed: reason withheld Mar 20 08:25:20 crc kubenswrapper[4903]: [+]process-running ok Mar 20 08:25:20 crc kubenswrapper[4903]: healthz check failed Mar 20 08:25:20 crc kubenswrapper[4903]: I0320 08:25:20.444814 4903 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-cc4f6" podUID="0b0221f5-84f8-47b9-bab3-934c1890fb2f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 08:25:20 crc kubenswrapper[4903]: I0320 08:25:20.446159 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2q9m5\" (UID: \"5778224c-9b34-45c0-9812-122b95cef431\") " pod="openshift-image-registry/image-registry-697d97f7c8-2q9m5" Mar 20 08:25:20 crc kubenswrapper[4903]: E0320 08:25:20.446892 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 08:25:20.94687628 +0000 UTC m=+146.163776595 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2q9m5" (UID: "5778224c-9b34-45c0-9812-122b95cef431") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 08:25:20 crc kubenswrapper[4903]: I0320 08:25:20.548429 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 08:25:20 crc kubenswrapper[4903]: E0320 08:25:20.548835 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 08:25:21.048816114 +0000 UTC m=+146.265716419 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 08:25:20 crc kubenswrapper[4903]: I0320 08:25:20.655806 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2q9m5\" (UID: \"5778224c-9b34-45c0-9812-122b95cef431\") " pod="openshift-image-registry/image-registry-697d97f7c8-2q9m5" Mar 20 08:25:20 crc kubenswrapper[4903]: E0320 08:25:20.656271 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 08:25:21.156257872 +0000 UTC m=+146.373158187 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2q9m5" (UID: "5778224c-9b34-45c0-9812-122b95cef431") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 08:25:20 crc kubenswrapper[4903]: I0320 08:25:20.758373 4903 ???:1] "http: TLS handshake error from 192.168.126.11:53094: no serving certificate available for the kubelet" Mar 20 08:25:20 crc kubenswrapper[4903]: I0320 08:25:20.759595 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 08:25:20 crc kubenswrapper[4903]: E0320 08:25:20.760102 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 08:25:21.26007572 +0000 UTC m=+146.476976035 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 08:25:20 crc kubenswrapper[4903]: I0320 08:25:20.798027 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-spjl5"] Mar 20 08:25:20 crc kubenswrapper[4903]: I0320 08:25:20.864018 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2q9m5\" (UID: \"5778224c-9b34-45c0-9812-122b95cef431\") " pod="openshift-image-registry/image-registry-697d97f7c8-2q9m5" Mar 20 08:25:20 crc kubenswrapper[4903]: E0320 08:25:20.864732 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 08:25:21.364716309 +0000 UTC m=+146.581616624 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2q9m5" (UID: "5778224c-9b34-45c0-9812-122b95cef431") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 08:25:20 crc kubenswrapper[4903]: I0320 08:25:20.865157 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-fcqtt" Mar 20 08:25:20 crc kubenswrapper[4903]: I0320 08:25:20.865191 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-fcqtt" Mar 20 08:25:20 crc kubenswrapper[4903]: I0320 08:25:20.873134 4903 ???:1] "http: TLS handshake error from 192.168.126.11:53104: no serving certificate available for the kubelet" Mar 20 08:25:20 crc kubenswrapper[4903]: I0320 08:25:20.966740 4903 ???:1] "http: TLS handshake error from 192.168.126.11:53106: no serving certificate available for the kubelet" Mar 20 08:25:20 crc kubenswrapper[4903]: I0320 08:25:20.976010 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 08:25:20 crc kubenswrapper[4903]: E0320 08:25:20.976481 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 08:25:21.476464008 +0000 UTC m=+146.693364313 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 08:25:21 crc kubenswrapper[4903]: I0320 08:25:21.050531 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-pnnxx" event={"ID":"e9e65e64-f0de-46b5-a383-ccb51e989ba1","Type":"ContainerStarted","Data":"93ac7fb63d0b32b7c488a820f8d7527fd7e9f07fb97da4c04de71cb7b98ebc71"} Mar 20 08:25:21 crc kubenswrapper[4903]: I0320 08:25:21.055747 4903 ???:1] "http: TLS handshake error from 192.168.126.11:53122: no serving certificate available for the kubelet" Mar 20 08:25:21 crc kubenswrapper[4903]: I0320 08:25:21.067704 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-h2jwr" event={"ID":"0b2ccb33-e3d5-46b8-863a-21ec88f7fb49","Type":"ContainerStarted","Data":"89f9a108fda8da1d726b19aff391b78b9fb8aa03243f2d9e823f7c122e176b1b"} Mar 20 08:25:21 crc kubenswrapper[4903]: I0320 08:25:21.077273 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-9zwz2" event={"ID":"faa861a0-c46b-46d2-8f2f-6c8e70f403ec","Type":"ContainerStarted","Data":"f5af0f3f3b38b938d1998ee30c7d0a6c09e4d680c3f7296df98627e08852b4bb"} Mar 20 08:25:21 crc kubenswrapper[4903]: I0320 08:25:21.078245 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2q9m5\" (UID: \"5778224c-9b34-45c0-9812-122b95cef431\") " pod="openshift-image-registry/image-registry-697d97f7c8-2q9m5" Mar 20 08:25:21 crc kubenswrapper[4903]: E0320 08:25:21.078651 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 08:25:21.578637159 +0000 UTC m=+146.795537474 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2q9m5" (UID: "5778224c-9b34-45c0-9812-122b95cef431") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 08:25:21 crc kubenswrapper[4903]: I0320 08:25:21.084236 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-sq6kr" event={"ID":"fd2cbd5a-d35b-4911-9748-57911dbb5bbb","Type":"ContainerStarted","Data":"c8907d9e202e5bb521b8527e0557604da54407aae07d1e6cce397a9edc80573f"} Mar 20 08:25:21 crc kubenswrapper[4903]: I0320 08:25:21.094932 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-zbxgq" event={"ID":"4742c0a9-7786-4b7e-823e-e70630e72495","Type":"ContainerStarted","Data":"8bf7329c316f1e0a3a18b0ff0b487e59c79f3209a3557ab8ba91b201541751e3"} Mar 20 08:25:21 crc kubenswrapper[4903]: I0320 08:25:21.105809 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-h2jwr" podStartSLOduration=84.10578488 podStartE2EDuration="1m24.10578488s" podCreationTimestamp="2026-03-20 08:23:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:25:21.104678639 +0000 UTC m=+146.321578954" watchObservedRunningTime="2026-03-20 08:25:21.10578488 +0000 UTC m=+146.322685195" Mar 20 08:25:21 crc kubenswrapper[4903]: I0320 08:25:21.111368 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-k27wn" event={"ID":"ce1c215a-f629-4ec9-ac9b-100699560ce1","Type":"ContainerStarted","Data":"48cacdeeba3ee6ac3da74666dfbd70d9fe7461e8079ed73877f57e76392c45b4"} Mar 20 08:25:21 crc kubenswrapper[4903]: I0320 08:25:21.141128 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-m6k77" Mar 20 08:25:21 crc kubenswrapper[4903]: I0320 08:25:21.148765 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qqtv5" event={"ID":"80577d9f-8d2f-4cd9-8a6b-305ec6e2a612","Type":"ContainerStarted","Data":"d55fbe68051a71bcc21451cfc993a18ac5831aad5d320a3387a8a18df0d10c02"} Mar 20 08:25:21 crc kubenswrapper[4903]: I0320 08:25:21.154385 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-sjg9x" event={"ID":"0ba96cb1-4795-45d3-a320-5ef3b87c644f","Type":"ContainerStarted","Data":"4179852b6f188b75fb889c4b33fc3753b2f6f88a6556d3437c82b71f182f5463"} Mar 20 08:25:21 crc kubenswrapper[4903]: I0320 08:25:21.156719 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-4zl69" event={"ID":"8d670e7f-bbda-4168-87a6-baa6ce35177b","Type":"ContainerStarted","Data":"8181bbea5e60eaa500042c9f412dea99f4106fbb83bbfddccd02c47ac123466c"} Mar 20 08:25:21 crc kubenswrapper[4903]: I0320 08:25:21.172135 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-skg8s" 
event={"ID":"2c53f142-6367-49fd-8577-6141d9f82f73","Type":"ContainerStarted","Data":"fcdd20bc8b2d982dde49332a4f34a4001d98bd9e5cb7c0b1aae5e99a37238567"} Mar 20 08:25:21 crc kubenswrapper[4903]: I0320 08:25:21.186982 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 08:25:21 crc kubenswrapper[4903]: E0320 08:25:21.187588 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 08:25:21.687509578 +0000 UTC m=+146.904409893 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 08:25:21 crc kubenswrapper[4903]: I0320 08:25:21.188467 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2q9m5\" (UID: \"5778224c-9b34-45c0-9812-122b95cef431\") " pod="openshift-image-registry/image-registry-697d97f7c8-2q9m5" Mar 20 08:25:21 crc kubenswrapper[4903]: E0320 08:25:21.189855 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 08:25:21.689842374 +0000 UTC m=+146.906742689 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2q9m5" (UID: "5778224c-9b34-45c0-9812-122b95cef431") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 08:25:21 crc kubenswrapper[4903]: I0320 08:25:21.190798 4903 ???:1] "http: TLS handshake error from 192.168.126.11:53134: no serving certificate available for the kubelet" Mar 20 08:25:21 crc kubenswrapper[4903]: I0320 08:25:21.202245 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9hf5p" event={"ID":"b9b6ad1a-4ed6-4df4-8f27-a8eb8d6e8511","Type":"ContainerStarted","Data":"ec1caa682364facdc3073764c95d01e7cc8f50d0e646838c39d7cab28b98a701"} Mar 20 08:25:21 crc kubenswrapper[4903]: I0320 08:25:21.209471 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-lw2n7" event={"ID":"c2c16cc7-f269-4f15-b84f-9b46b14cd853","Type":"ContainerStarted","Data":"08eb2258a814deefd9f8ef62e0052ea67817e4e3b54ed153f265675d95818610"} Mar 20 08:25:21 crc kubenswrapper[4903]: I0320 08:25:21.278662 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-mmtf2" event={"ID":"1e7c0553-965b-49eb-8bcd-ad7e2096373c","Type":"ContainerStarted","Data":"62fa7a76c9f00b93670558a06e82e3c3c142ffddb1a6e10964c7b9f3b7b322f8"} Mar 20 08:25:21 crc kubenswrapper[4903]: I0320 08:25:21.289173 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 08:25:21 crc kubenswrapper[4903]: E0320 08:25:21.292896 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 08:25:21.792868558 +0000 UTC m=+147.009768863 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 08:25:21 crc kubenswrapper[4903]: I0320 08:25:21.293763 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9hf5p" podStartSLOduration=84.293741283 podStartE2EDuration="1m24.293741283s" podCreationTimestamp="2026-03-20 08:23:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:25:21.291122459 +0000 UTC m=+146.508022784" watchObservedRunningTime="2026-03-20 08:25:21.293741283 +0000 UTC m=+146.510641588" Mar 20 08:25:21 crc kubenswrapper[4903]: I0320 08:25:21.300145 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-5v8xb" event={"ID":"60e17360-c395-4a9b-b371-2aeeccf0b70f","Type":"ContainerStarted","Data":"891f783f2157614ecc431263a79c654fdecc96a1ea51dff733915d0d28339685"} Mar 20 08:25:21 crc kubenswrapper[4903]: I0320 08:25:21.302501 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-9wjwh" event={"ID":"6534f377-ba61-4952-b722-1dcfe94df2ea","Type":"ContainerStarted","Data":"cafa7cc9cd9260b20b460dc9508925e7f0de68f60a681fc72f265bab3e3a0fcd"} Mar 20 08:25:21 crc kubenswrapper[4903]: I0320 08:25:21.306518 4903 ???:1] "http: TLS handshake error from 192.168.126.11:53138: no serving certificate available for the kubelet" Mar 20 08:25:21 crc kubenswrapper[4903]: I0320 08:25:21.307937 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566575-t4mvb" event={"ID":"32d4db0f-b5bb-41d4-ae0d-0600e38892b1","Type":"ContainerStarted","Data":"86c5da429e3a63be1fb74c86e28e65220803d49d3c527722cb15b379e05853ac"} Mar 20 08:25:21 crc kubenswrapper[4903]: I0320 08:25:21.323660 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7gchs" event={"ID":"21c3415f-b025-475e-a603-76329416df23","Type":"ContainerStarted","Data":"9d227f353d1f7c12fad50f0f5e73e0ae8f57ecc07a44818f38ba45569659d272"} Mar 20 08:25:21 crc kubenswrapper[4903]: I0320 08:25:21.349190 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-5v8xb" podStartSLOduration=84.349166094 podStartE2EDuration="1m24.349166094s" podCreationTimestamp="2026-03-20 08:23:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:25:21.346795889 +0000 UTC m=+146.563696194" watchObservedRunningTime="2026-03-20 08:25:21.349166094 +0000 UTC m=+146.566066409" Mar 20 08:25:21 crc kubenswrapper[4903]: I0320 08:25:21.367823 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-687pt" event={"ID":"0568b2dd-f5ca-44ae-992d-1a5ed2e998fb","Type":"ContainerStarted","Data":"c7c0dd1dd942eb254bb947fc9a32eb75dfb3ff681163d549804a5ccc1cd416c2"} Mar 20 08:25:21 crc 
kubenswrapper[4903]: I0320 08:25:21.384272 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-t7qmg" event={"ID":"6a89ca6c-6d44-4fa5-b728-09aa58890f99","Type":"ContainerStarted","Data":"7d3476913e70b15fa874bf7c058755e8dfabe0f9e9e43bcb2e1d1babb950e477"} Mar 20 08:25:21 crc kubenswrapper[4903]: I0320 08:25:21.392459 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2q9m5\" (UID: \"5778224c-9b34-45c0-9812-122b95cef431\") " pod="openshift-image-registry/image-registry-697d97f7c8-2q9m5" Mar 20 08:25:21 crc kubenswrapper[4903]: E0320 08:25:21.396882 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 08:25:21.89685961 +0000 UTC m=+147.113759925 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2q9m5" (UID: "5778224c-9b34-45c0-9812-122b95cef431") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 08:25:21 crc kubenswrapper[4903]: I0320 08:25:21.402989 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-mmtf2" podStartSLOduration=85.402969341 podStartE2EDuration="1m25.402969341s" podCreationTimestamp="2026-03-20 08:23:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:25:21.399955827 +0000 UTC m=+146.616856142" watchObservedRunningTime="2026-03-20 08:25:21.402969341 +0000 UTC m=+146.619869656" Mar 20 08:25:21 crc kubenswrapper[4903]: I0320 08:25:21.413501 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-5dpx6" event={"ID":"82c317fb-5d87-47b3-849c-58b0bab4d3ef","Type":"ContainerStarted","Data":"2d41c5e73d234158ea80c0db27197196d24cefcb41790a3b594f76501debe905"} Mar 20 08:25:21 crc kubenswrapper[4903]: I0320 08:25:21.422300 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-5dpx6" Mar 20 08:25:21 crc kubenswrapper[4903]: I0320 08:25:21.416243 4903 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-5dpx6 container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.13:6443/healthz\": dial tcp 10.217.0.13:6443: connect: connection refused" start-of-body= Mar 20 08:25:21 crc kubenswrapper[4903]: I0320 08:25:21.422384 4903 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-5dpx6" podUID="82c317fb-5d87-47b3-849c-58b0bab4d3ef" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.13:6443/healthz\": dial tcp 10.217.0.13:6443: connect: connection refused" Mar 20 08:25:21 crc kubenswrapper[4903]: I0320 08:25:21.455614 4903 ???:1] "http: TLS 
handshake error from 192.168.126.11:53152: no serving certificate available for the kubelet" Mar 20 08:25:21 crc kubenswrapper[4903]: I0320 08:25:21.478567 4903 patch_prober.go:28] interesting pod/router-default-5444994796-cc4f6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 08:25:21 crc kubenswrapper[4903]: [-]has-synced failed: reason withheld Mar 20 08:25:21 crc kubenswrapper[4903]: [+]process-running ok Mar 20 08:25:21 crc kubenswrapper[4903]: healthz check failed Mar 20 08:25:21 crc kubenswrapper[4903]: I0320 08:25:21.478639 4903 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-cc4f6" podUID="0b0221f5-84f8-47b9-bab3-934c1890fb2f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 08:25:21 crc kubenswrapper[4903]: I0320 08:25:21.478767 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-x8c9s" event={"ID":"3211e594-bfe4-4eaf-ba13-48dbd7c8cf5e","Type":"ContainerStarted","Data":"41dfac53ed21fcf334f87777f83e748d247ed69497852baee4459932e08f1952"} Mar 20 08:25:21 crc kubenswrapper[4903]: I0320 08:25:21.479579 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-x8c9s" Mar 20 08:25:21 crc kubenswrapper[4903]: I0320 08:25:21.499338 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 08:25:21 crc kubenswrapper[4903]: E0320 08:25:21.501078 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 08:25:22.001054728 +0000 UTC m=+147.217955043 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 08:25:21 crc kubenswrapper[4903]: I0320 08:25:21.512675 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7gchs" podStartSLOduration=84.512656762 podStartE2EDuration="1m24.512656762s" podCreationTimestamp="2026-03-20 08:23:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:25:21.46507674 +0000 UTC m=+146.681977055" watchObservedRunningTime="2026-03-20 08:25:21.512656762 +0000 UTC m=+146.729557077" Mar 20 08:25:21 crc kubenswrapper[4903]: I0320 08:25:21.515956 4903 patch_prober.go:28] interesting pod/downloads-7954f5f757-n4g5v container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.29:8080/\": dial tcp 10.217.0.29:8080: connect: connection refused" start-of-body= Mar 20 08:25:21 crc kubenswrapper[4903]: I0320 08:25:21.516056 4903 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-n4g5v" podUID="7c273a7a-00f1-4e28-a9a6-69d240f2df29" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.29:8080/\": dial tcp 10.217.0.29:8080: connect: connection refused" Mar 20 08:25:21 crc kubenswrapper[4903]: I0320 08:25:21.524481 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-9wjwh" podStartSLOduration=84.524455923 podStartE2EDuration="1m24.524455923s" podCreationTimestamp="2026-03-20 08:23:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:25:21.512487358 +0000 UTC m=+146.729387673" watchObservedRunningTime="2026-03-20 08:25:21.524455923 +0000 UTC m=+146.741356238" Mar 20 08:25:21 crc kubenswrapper[4903]: I0320 08:25:21.527355 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-n4g5v" Mar 20 08:25:21 crc kubenswrapper[4903]: I0320 08:25:21.527412 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-n4g5v" event={"ID":"7c273a7a-00f1-4e28-a9a6-69d240f2df29","Type":"ContainerStarted","Data":"2dbf5fd1754fe98a4c1449b40ca9b0b78f1c2bbf9c5e3dad87679bf8a9b209d3"} Mar 20 08:25:21 crc kubenswrapper[4903]: I0320 08:25:21.544766 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-vc9fr" event={"ID":"d7385d29-60c8-459c-8d9f-f87f697b4dcb","Type":"ContainerStarted","Data":"d345113a3a403728922c3a495451e64747579db8c9fd5ca6de0ec43df5eb26d0"} Mar 20 08:25:21 crc kubenswrapper[4903]: I0320 08:25:21.556708 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-t7qmg" podStartSLOduration=84.556688326 podStartE2EDuration="1m24.556688326s" podCreationTimestamp="2026-03-20 08:23:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:25:21.555239135 +0000 UTC m=+146.772139450" watchObservedRunningTime="2026-03-20 08:25:21.556688326 +0000 UTC m=+146.773588641" Mar 20 08:25:21 crc kubenswrapper[4903]: I0320 08:25:21.573481 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-d5tc5" Mar 20 08:25:21 crc kubenswrapper[4903]: I0320 08:25:21.584762 4903 ???:1] "http: TLS handshake error from 192.168.126.11:53154: no serving certificate available for the kubelet" Mar 20 08:25:21 crc kubenswrapper[4903]: I0320 08:25:21.584766 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-687pt" podStartSLOduration=84.584741511 podStartE2EDuration="1m24.584741511s" podCreationTimestamp="2026-03-20 08:23:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:25:21.582396355 +0000 UTC m=+146.799296680" watchObservedRunningTime="2026-03-20 08:25:21.584741511 +0000 UTC m=+146.801641826" Mar 20 08:25:21 crc kubenswrapper[4903]: I0320 08:25:21.614389 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2q9m5\" (UID: \"5778224c-9b34-45c0-9812-122b95cef431\") " pod="openshift-image-registry/image-registry-697d97f7c8-2q9m5" Mar 20 08:25:21 crc kubenswrapper[4903]: E0320 08:25:21.616138 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 08:25:22.116119239 +0000 UTC m=+147.333019554 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2q9m5" (UID: "5778224c-9b34-45c0-9812-122b95cef431") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 08:25:21 crc kubenswrapper[4903]: I0320 08:25:21.657373 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-x8c9s" podStartSLOduration=84.657348643 podStartE2EDuration="1m24.657348643s" podCreationTimestamp="2026-03-20 08:23:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:25:21.611806099 +0000 UTC m=+146.828706414" watchObservedRunningTime="2026-03-20 08:25:21.657348643 +0000 UTC m=+146.874248968" Mar 20 08:25:21 crc kubenswrapper[4903]: I0320 08:25:21.704914 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-n4g5v" podStartSLOduration=85.704889415 podStartE2EDuration="1m25.704889415s" podCreationTimestamp="2026-03-20 08:23:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:25:21.704752251 +0000 UTC m=+146.921652566" watchObservedRunningTime="2026-03-20 08:25:21.704889415 +0000 UTC m=+146.921789750" Mar 20 08:25:21 crc kubenswrapper[4903]: I0320 08:25:21.709099 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-5dpx6" podStartSLOduration=85.709087582 podStartE2EDuration="1m25.709087582s" podCreationTimestamp="2026-03-20 08:23:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:25:21.659912755 +0000 UTC m=+146.876813080" watchObservedRunningTime="2026-03-20 08:25:21.709087582 +0000 UTC m=+146.925987917" Mar 20 08:25:21 crc kubenswrapper[4903]: I0320 08:25:21.723551 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 08:25:21 crc kubenswrapper[4903]: E0320 08:25:21.725502 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 08:25:22.225468471 +0000 UTC m=+147.442368856 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 08:25:21 crc kubenswrapper[4903]: I0320 08:25:21.791688 4903 patch_prober.go:28] interesting pod/apiserver-76f77b778f-fcqtt container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Mar 20 08:25:21 crc kubenswrapper[4903]: [+]log ok Mar 20 08:25:21 crc kubenswrapper[4903]: [+]etcd ok Mar 20 08:25:21 crc kubenswrapper[4903]: [+]poststarthook/start-apiserver-admission-initializer ok Mar 20 08:25:21 crc kubenswrapper[4903]: [+]poststarthook/generic-apiserver-start-informers ok Mar 20 08:25:21 crc kubenswrapper[4903]: [+]poststarthook/max-in-flight-filter ok Mar 20 08:25:21 crc kubenswrapper[4903]: [+]poststarthook/storage-object-count-tracker-hook ok Mar 20 08:25:21 crc kubenswrapper[4903]: [+]poststarthook/image.openshift.io-apiserver-caches ok Mar 20 08:25:21 crc kubenswrapper[4903]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Mar 20 08:25:21 crc kubenswrapper[4903]: [-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld Mar 20 08:25:21 crc kubenswrapper[4903]: [+]poststarthook/project.openshift.io-projectcache ok Mar 20 08:25:21 crc kubenswrapper[4903]: [-]poststarthook/project.openshift.io-projectauthorizationcache failed: reason withheld Mar 20 08:25:21 crc kubenswrapper[4903]: [-]poststarthook/openshift.io-startinformers failed: reason withheld Mar 20 08:25:21 crc kubenswrapper[4903]: [+]poststarthook/openshift.io-restmapperupdater ok Mar 20 08:25:21 crc kubenswrapper[4903]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Mar 20 08:25:21 crc kubenswrapper[4903]: livez check failed Mar 20 08:25:21 crc kubenswrapper[4903]: I0320 08:25:21.791761 4903 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-fcqtt" podUID="f20fdd99-f4e0-4435-98ef-7140cc1c9ca4" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 08:25:21 crc kubenswrapper[4903]: I0320 08:25:21.833152 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2q9m5\" (UID: \"5778224c-9b34-45c0-9812-122b95cef431\") " pod="openshift-image-registry/image-registry-697d97f7c8-2q9m5" Mar 20 08:25:21 crc kubenswrapper[4903]: E0320 08:25:21.833571 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 08:25:22.333557728 +0000 UTC m=+147.550458043 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2q9m5" (UID: "5778224c-9b34-45c0-9812-122b95cef431") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 08:25:21 crc kubenswrapper[4903]: I0320 08:25:21.876432 4903 ???:1] "http: TLS handshake error from 192.168.126.11:53164: no serving certificate available for the kubelet" Mar 20 08:25:21 crc kubenswrapper[4903]: I0320 08:25:21.935106 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 08:25:21 crc kubenswrapper[4903]: E0320 08:25:21.935640 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 08:25:22.435616635 +0000 UTC m=+147.652516950 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 08:25:22 crc kubenswrapper[4903]: I0320 08:25:22.040256 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2q9m5\" (UID: \"5778224c-9b34-45c0-9812-122b95cef431\") " pod="openshift-image-registry/image-registry-697d97f7c8-2q9m5" Mar 20 08:25:22 crc kubenswrapper[4903]: E0320 08:25:22.041048 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 08:25:22.541017826 +0000 UTC m=+147.757918141 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2q9m5" (UID: "5778224c-9b34-45c0-9812-122b95cef431") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 08:25:22 crc kubenswrapper[4903]: I0320 08:25:22.142066 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 08:25:22 crc kubenswrapper[4903]: E0320 08:25:22.142573 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 08:25:22.64255422 +0000 UTC m=+147.859454535 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 08:25:22 crc kubenswrapper[4903]: I0320 08:25:22.243493 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2q9m5\" (UID: \"5778224c-9b34-45c0-9812-122b95cef431\") " pod="openshift-image-registry/image-registry-697d97f7c8-2q9m5" Mar 20 08:25:22 crc kubenswrapper[4903]: E0320 08:25:22.243982 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 08:25:22.743960569 +0000 UTC m=+147.960860884 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2q9m5" (UID: "5778224c-9b34-45c0-9812-122b95cef431") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 08:25:22 crc kubenswrapper[4903]: I0320 08:25:22.244103 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 08:25:22 crc kubenswrapper[4903]: I0320 08:25:22.244149 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 08:25:22 crc kubenswrapper[4903]: I0320 08:25:22.247374 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 08:25:22 crc kubenswrapper[4903]: I0320 08:25:22.255717 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 08:25:22 crc kubenswrapper[4903]: I0320 08:25:22.345828 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 08:25:22 crc kubenswrapper[4903]: E0320 08:25:22.346133 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 08:25:22.846076759 +0000 UTC m=+148.062977074 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 08:25:22 crc kubenswrapper[4903]: I0320 08:25:22.346227 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2q9m5\" (UID: \"5778224c-9b34-45c0-9812-122b95cef431\") " pod="openshift-image-registry/image-registry-697d97f7c8-2q9m5" Mar 20 08:25:22 crc kubenswrapper[4903]: I0320 08:25:22.346415 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 08:25:22 crc kubenswrapper[4903]: E0320 08:25:22.346600 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 08:25:22.846583543 +0000 UTC m=+148.063483858 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2q9m5" (UID: "5778224c-9b34-45c0-9812-122b95cef431") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 08:25:22 crc kubenswrapper[4903]: I0320 08:25:22.346690 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 08:25:22 crc kubenswrapper[4903]: I0320 08:25:22.355180 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 08:25:22 crc kubenswrapper[4903]: I0320 08:25:22.356135 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 08:25:22 crc kubenswrapper[4903]: I0320 08:25:22.417292 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 08:25:22 crc kubenswrapper[4903]: I0320 08:25:22.433594 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 08:25:22 crc kubenswrapper[4903]: I0320 08:25:22.448573 4903 patch_prober.go:28] interesting pod/router-default-5444994796-cc4f6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 08:25:22 crc kubenswrapper[4903]: [-]has-synced failed: reason withheld Mar 20 08:25:22 crc kubenswrapper[4903]: [+]process-running ok Mar 20 08:25:22 crc kubenswrapper[4903]: healthz check failed Mar 20 08:25:22 crc kubenswrapper[4903]: I0320 08:25:22.448654 4903 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-cc4f6" podUID="0b0221f5-84f8-47b9-bab3-934c1890fb2f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 08:25:22 crc kubenswrapper[4903]: I0320 08:25:22.448924 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 08:25:22 crc kubenswrapper[4903]: I0320 08:25:22.449314 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 08:25:22 crc kubenswrapper[4903]: E0320 08:25:22.449623 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 08:25:22.949565216 +0000 UTC m=+148.166465701 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 08:25:22 crc kubenswrapper[4903]: I0320 08:25:22.550649 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2q9m5\" (UID: \"5778224c-9b34-45c0-9812-122b95cef431\") " pod="openshift-image-registry/image-registry-697d97f7c8-2q9m5" Mar 20 08:25:22 crc kubenswrapper[4903]: E0320 08:25:22.551204 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 08:25:23.051183592 +0000 UTC m=+148.268083907 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2q9m5" (UID: "5778224c-9b34-45c0-9812-122b95cef431") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 08:25:22 crc kubenswrapper[4903]: I0320 08:25:22.604627 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-5v8xb" event={"ID":"60e17360-c395-4a9b-b371-2aeeccf0b70f","Type":"ContainerStarted","Data":"1249654f95c22bce62276c4c6b40a8a87a81c149491f6c44c384e3d236ccd494"} Mar 20 08:25:22 crc kubenswrapper[4903]: I0320 08:25:22.653278 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 08:25:22 crc kubenswrapper[4903]: E0320 08:25:22.653682 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 08:25:23.153661911 +0000 UTC m=+148.370562226 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 08:25:22 crc kubenswrapper[4903]: I0320 08:25:22.665633 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-x8c9s" event={"ID":"3211e594-bfe4-4eaf-ba13-48dbd7c8cf5e","Type":"ContainerStarted","Data":"9cf015b5af216033face8df585b3d54acdded6e508822a502f73a45c489bb3be"} Mar 20 08:25:22 crc kubenswrapper[4903]: I0320 08:25:22.695144 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-f47pf"] Mar 20 08:25:22 crc kubenswrapper[4903]: I0320 08:25:22.695880 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-f47pf" podUID="a01da421-9bf1-459f-a419-c7cc271bf472" containerName="controller-manager" containerID="cri-o://eb54b02e2ed004f48b9cd02de61ec6fb4644d05cc1e7ea8a63e5e6245023787d" gracePeriod=30 Mar 20 08:25:22 crc kubenswrapper[4903]: I0320 08:25:22.713235 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-5pjw9" event={"ID":"ca2849ec-27bf-41e4-b520-0ffc80e33e99","Type":"ContainerStarted","Data":"f10b3fc90ee4741e43820a5526c95421fe16fc823de0fa4b3d6dfbf7079decd2"} Mar 20 08:25:22 crc kubenswrapper[4903]: I0320 08:25:22.713290 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-5pjw9" Mar 20 08:25:22 crc kubenswrapper[4903]: I0320 08:25:22.716713 
4903 patch_prober.go:28] interesting pod/console-operator-58897d9998-5pjw9 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.37:8443/readyz\": dial tcp 10.217.0.37:8443: connect: connection refused" start-of-body= Mar 20 08:25:22 crc kubenswrapper[4903]: I0320 08:25:22.716781 4903 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-5pjw9" podUID="ca2849ec-27bf-41e4-b520-0ffc80e33e99" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.37:8443/readyz\": dial tcp 10.217.0.37:8443: connect: connection refused" Mar 20 08:25:22 crc kubenswrapper[4903]: I0320 08:25:22.733948 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-sjg9x" event={"ID":"0ba96cb1-4795-45d3-a320-5ef3b87c644f","Type":"ContainerStarted","Data":"ec007fa062bb64bdf3d8add2ebd726880edf738ee8cbc09d9d1a53e429bfd5f8"} Mar 20 08:25:22 crc kubenswrapper[4903]: I0320 08:25:22.734033 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-sjg9x" event={"ID":"0ba96cb1-4795-45d3-a320-5ef3b87c644f","Type":"ContainerStarted","Data":"9ed17cb073492118a2941ffe92d3c8c42b71053ec460d850d681896f1fbb3671"} Mar 20 08:25:22 crc kubenswrapper[4903]: I0320 08:25:22.757965 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2q9m5\" (UID: \"5778224c-9b34-45c0-9812-122b95cef431\") " pod="openshift-image-registry/image-registry-697d97f7c8-2q9m5" Mar 20 08:25:22 crc kubenswrapper[4903]: E0320 08:25:22.759347 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 08:25:23.259330021 +0000 UTC m=+148.476230336 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2q9m5" (UID: "5778224c-9b34-45c0-9812-122b95cef431") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 08:25:22 crc kubenswrapper[4903]: I0320 08:25:22.766270 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-sq6kr" event={"ID":"fd2cbd5a-d35b-4911-9748-57911dbb5bbb","Type":"ContainerStarted","Data":"c595d07e67d66777c87b93889283cc4596d8baeb755ed14ced0861a50af20210"} Mar 20 08:25:22 crc kubenswrapper[4903]: I0320 08:25:22.766310 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-sq6kr" event={"ID":"fd2cbd5a-d35b-4911-9748-57911dbb5bbb","Type":"ContainerStarted","Data":"0d65c75632f03730b713a624fe952a7cb687f1c0995c64ad7d480cacd0ca2a1a"} Mar 20 08:25:22 crc kubenswrapper[4903]: I0320 08:25:22.790292 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-5pjw9" podStartSLOduration=86.790260437 podStartE2EDuration="1m26.790260437s" podCreationTimestamp="2026-03-20 08:23:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:25:22.77644683 +0000 UTC m=+147.993347155" watchObservedRunningTime="2026-03-20 08:25:22.790260437 +0000 UTC m=+148.007160752" Mar 20 08:25:22 crc kubenswrapper[4903]: I0320 08:25:22.817950 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-lw2n7" event={"ID":"c2c16cc7-f269-4f15-b84f-9b46b14cd853","Type":"ContainerStarted","Data":"4b1f217a8babb539ee4e62e476de27d7f4f578216873109a416321d7a63bd718"} Mar 20 08:25:22 crc kubenswrapper[4903]: I0320 08:25:22.818021 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-lw2n7" event={"ID":"c2c16cc7-f269-4f15-b84f-9b46b14cd853","Type":"ContainerStarted","Data":"f9467e55417e74de2a4d7f940345baae74b837d0b54334dbbe982316423bcc23"} Mar 20 08:25:22 crc kubenswrapper[4903]: I0320 08:25:22.838956 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-6bs4d"] Mar 20 08:25:22 crc kubenswrapper[4903]: I0320 08:25:22.842206 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-pjn9j" event={"ID":"5bfc26b7-6752-4984-90fa-ed0099d0627c","Type":"ContainerStarted","Data":"fc7c74b42b6fa3ebbff564645924d192ba3c772730959d8e5173445fbf51909f"} Mar 20 08:25:22 crc kubenswrapper[4903]: I0320 08:25:22.860109 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 08:25:22 crc kubenswrapper[4903]: E0320 08:25:22.862007 4903 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 08:25:23.361981215 +0000 UTC m=+148.578881530 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 08:25:22 crc kubenswrapper[4903]: I0320 08:25:22.879526 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-sq6kr" podStartSLOduration=85.879498765 podStartE2EDuration="1m25.879498765s" podCreationTimestamp="2026-03-20 08:23:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:25:22.837142839 +0000 UTC m=+148.054043154" watchObservedRunningTime="2026-03-20 08:25:22.879498765 +0000 UTC m=+148.096399080" Mar 20 08:25:22 crc kubenswrapper[4903]: I0320 08:25:22.908186 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-pnnxx" event={"ID":"e9e65e64-f0de-46b5-a383-ccb51e989ba1","Type":"ContainerStarted","Data":"8996e1a7da9905fcc98e8cd22a2841a9e4940bf930c7da25459a062ec70c3f67"} Mar 20 08:25:22 crc kubenswrapper[4903]: I0320 08:25:22.957020 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-skg8s" event={"ID":"2c53f142-6367-49fd-8577-6141d9f82f73","Type":"ContainerStarted","Data":"d1e49349611f2ecc9868293e4a3fbe657418e6c517d143d0504fc991057da93c"} Mar 20 08:25:22 crc kubenswrapper[4903]: I0320 08:25:22.957020 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-sjg9x" podStartSLOduration=85.956984335 podStartE2EDuration="1m25.956984335s" podCreationTimestamp="2026-03-20 08:23:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:25:22.93037736 +0000 UTC m=+148.147277675" watchObservedRunningTime="2026-03-20 08:25:22.956984335 +0000 UTC m=+148.173884650" Mar 20 08:25:22 crc kubenswrapper[4903]: I0320 08:25:22.957396 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-skg8s" Mar 20 08:25:22 crc kubenswrapper[4903]: I0320 08:25:22.982741 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2q9m5\" (UID: \"5778224c-9b34-45c0-9812-122b95cef431\") " pod="openshift-image-registry/image-registry-697d97f7c8-2q9m5" Mar 20 08:25:22 crc kubenswrapper[4903]: E0320 08:25:22.984448 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-20 08:25:23.484432483 +0000 UTC m=+148.701332798 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2q9m5" (UID: "5778224c-9b34-45c0-9812-122b95cef431") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 08:25:23 crc kubenswrapper[4903]: I0320 08:25:23.005091 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-pjn9j" podStartSLOduration=10.00502244 podStartE2EDuration="10.00502244s" podCreationTimestamp="2026-03-20 08:25:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:25:23.004494426 +0000 UTC m=+148.221394731" watchObservedRunningTime="2026-03-20 08:25:23.00502244 +0000 UTC m=+148.221922755" Mar 20 08:25:23 crc kubenswrapper[4903]: I0320 08:25:23.026840 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566575-t4mvb" event={"ID":"32d4db0f-b5bb-41d4-ae0d-0600e38892b1","Type":"ContainerStarted","Data":"886211f260ddc7606b6b1150cd5eb273e9a18c4f9250e304efeecdb3a9b83794"} Mar 20 08:25:23 crc kubenswrapper[4903]: I0320 08:25:23.030584 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-skg8s" Mar 20 08:25:23 crc kubenswrapper[4903]: I0320 08:25:23.063628 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-pnnxx" podStartSLOduration=86.06360098 podStartE2EDuration="1m26.06360098s" podCreationTimestamp="2026-03-20 08:23:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:25:23.043154817 +0000 UTC m=+148.260055132" watchObservedRunningTime="2026-03-20 08:25:23.06360098 +0000 UTC m=+148.280501295" Mar 20 08:25:23 crc kubenswrapper[4903]: I0320 08:25:23.072482 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-4zl69" event={"ID":"8d670e7f-bbda-4168-87a6-baa6ce35177b","Type":"ContainerStarted","Data":"732f9743cbcc868ff6a6e3b814374a71668d69de0fbd26c81d45a88aab7aada6"} Mar 20 08:25:23 crc kubenswrapper[4903]: I0320 08:25:23.085790 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 08:25:23 crc kubenswrapper[4903]: E0320 08:25:23.087884 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 08:25:23.587865089 +0000 UTC m=+148.804765404 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 08:25:23 crc kubenswrapper[4903]: I0320 08:25:23.118604 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-zbxgq" event={"ID":"4742c0a9-7786-4b7e-823e-e70630e72495","Type":"ContainerStarted","Data":"3d7ff983c32a5f57ea7b4740d32404bc2b4e3d6298ebfc659831ceec3f9edc73"} Mar 20 08:25:23 crc kubenswrapper[4903]: I0320 08:25:23.119731 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-zbxgq" Mar 20 08:25:23 crc kubenswrapper[4903]: I0320 08:25:23.133580 4903 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-zbxgq container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.31:8080/healthz\": dial tcp 10.217.0.31:8080: connect: connection refused" start-of-body= Mar 20 08:25:23 crc kubenswrapper[4903]: I0320 08:25:23.133637 4903 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-zbxgq" podUID="4742c0a9-7786-4b7e-823e-e70630e72495" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.31:8080/healthz\": dial tcp 10.217.0.31:8080: connect: connection refused" Mar 20 08:25:23 crc kubenswrapper[4903]: I0320 08:25:23.146006 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-k27wn" event={"ID":"ce1c215a-f629-4ec9-ac9b-100699560ce1","Type":"ContainerStarted","Data":"78686dacdf2ee411ff89b233114cfca5ff3c0f0131e21007f596f020aa98a7fe"} Mar 20 08:25:23 crc kubenswrapper[4903]: I0320 08:25:23.147153 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-k27wn" Mar 20 08:25:23 crc kubenswrapper[4903]: I0320 08:25:23.183351 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-lw2n7" podStartSLOduration=86.183332362 podStartE2EDuration="1m26.183332362s" podCreationTimestamp="2026-03-20 08:23:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:25:23.118191879 +0000 UTC m=+148.335092194" watchObservedRunningTime="2026-03-20 08:25:23.183332362 +0000 UTC m=+148.400232677" Mar 20 08:25:23 crc kubenswrapper[4903]: I0320 08:25:23.187491 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2q9m5\" (UID: \"5778224c-9b34-45c0-9812-122b95cef431\") " pod="openshift-image-registry/image-registry-697d97f7c8-2q9m5" Mar 20 08:25:23 crc kubenswrapper[4903]: E0320 08:25:23.187846 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: 
nodeName:}" failed. No retries permitted until 2026-03-20 08:25:23.687833259 +0000 UTC m=+148.904733564 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2q9m5" (UID: "5778224c-9b34-45c0-9812-122b95cef431") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 08:25:23 crc kubenswrapper[4903]: I0320 08:25:23.196301 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qqtv5" event={"ID":"80577d9f-8d2f-4cd9-8a6b-305ec6e2a612","Type":"ContainerStarted","Data":"23635a7f4434dbc9a9f3c6224ecfc5d8e9e93627b5357aa140029415add7fb5e"} Mar 20 08:25:23 crc kubenswrapper[4903]: I0320 08:25:23.219547 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-skg8s" podStartSLOduration=86.219529036 podStartE2EDuration="1m26.219529036s" podCreationTimestamp="2026-03-20 08:23:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:25:23.18642298 +0000 UTC m=+148.403323295" watchObservedRunningTime="2026-03-20 08:25:23.219529036 +0000 UTC m=+148.436429351" Mar 20 08:25:23 crc kubenswrapper[4903]: I0320 08:25:23.225485 4903 ???:1] "http: TLS handshake error from 192.168.126.11:53170: no serving certificate available for the kubelet" Mar 20 08:25:23 crc kubenswrapper[4903]: I0320 08:25:23.229613 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-vc9fr" event={"ID":"d7385d29-60c8-459c-8d9f-f87f697b4dcb","Type":"ContainerStarted","Data":"1c0e5d8cf46f068da8ee2b624cc8e443c71229a3a5dcff7f6229efe5f3015661"} Mar 20 08:25:23 crc kubenswrapper[4903]: I0320 08:25:23.229659 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-vc9fr" event={"ID":"d7385d29-60c8-459c-8d9f-f87f697b4dcb","Type":"ContainerStarted","Data":"281af752179dea69a6f71db4b488dce8e758e0fbca8fd421cc7863d44ee0003f"} Mar 20 08:25:23 crc kubenswrapper[4903]: I0320 08:25:23.230365 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-vc9fr" Mar 20 08:25:23 crc kubenswrapper[4903]: I0320 08:25:23.240295 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-k27wn" Mar 20 08:25:23 crc kubenswrapper[4903]: I0320 08:25:23.267911 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29566575-t4mvb" podStartSLOduration=87.267887741 podStartE2EDuration="1m27.267887741s" podCreationTimestamp="2026-03-20 08:23:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:25:23.229008222 +0000 UTC m=+148.445908537" watchObservedRunningTime="2026-03-20 08:25:23.267887741 +0000 UTC m=+148.484788056" Mar 20 08:25:23 crc kubenswrapper[4903]: I0320 08:25:23.268524 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-zbxgq" podStartSLOduration=86.268519578 
podStartE2EDuration="1m26.268519578s" podCreationTimestamp="2026-03-20 08:23:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:25:23.265643007 +0000 UTC m=+148.482543312" watchObservedRunningTime="2026-03-20 08:25:23.268519578 +0000 UTC m=+148.485419893" Mar 20 08:25:23 crc kubenswrapper[4903]: I0320 08:25:23.278221 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-multus/cni-sysctl-allowlist-ds-spjl5" podUID="433bd27c-a67a-4487-b09e-523fd9b34b8f" containerName="kube-multus-additional-cni-plugins" containerID="cri-o://1ba01392693cb27f963e41acb1cb1af6da61169459877cc07eb4972d80391867" gracePeriod=30 Mar 20 08:25:23 crc kubenswrapper[4903]: I0320 08:25:23.278606 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-9zwz2" event={"ID":"faa861a0-c46b-46d2-8f2f-6c8e70f403ec","Type":"ContainerStarted","Data":"f651af0b03e4e93107f2b7f6905e93191c4bc59f86614939ba5747f3c888c0c6"} Mar 20 08:25:23 crc kubenswrapper[4903]: I0320 08:25:23.281658 4903 patch_prober.go:28] interesting pod/downloads-7954f5f757-n4g5v container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.29:8080/\": dial tcp 10.217.0.29:8080: connect: connection refused" start-of-body= Mar 20 08:25:23 crc kubenswrapper[4903]: I0320 08:25:23.281701 4903 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-n4g5v" podUID="7c273a7a-00f1-4e28-a9a6-69d240f2df29" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.29:8080/\": dial tcp 10.217.0.29:8080: connect: connection refused" Mar 20 08:25:23 crc kubenswrapper[4903]: I0320 08:25:23.293601 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 08:25:23 crc kubenswrapper[4903]: E0320 08:25:23.295637 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 08:25:23.795606117 +0000 UTC m=+149.012506432 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 08:25:23 crc kubenswrapper[4903]: I0320 08:25:23.314728 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qqtv5" podStartSLOduration=86.314705401 podStartE2EDuration="1m26.314705401s" podCreationTimestamp="2026-03-20 08:23:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:25:23.299495685 +0000 UTC m=+148.516396000" watchObservedRunningTime="2026-03-20 08:25:23.314705401 +0000 UTC m=+148.531605706" Mar 20 08:25:23 crc kubenswrapper[4903]: I0320 08:25:23.315114 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-5dpx6" Mar 20 08:25:23 crc kubenswrapper[4903]: I0320 08:25:23.396950 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-k27wn" podStartSLOduration=86.396918494 podStartE2EDuration="1m26.396918494s" podCreationTimestamp="2026-03-20 08:23:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:25:23.352017886 +0000 UTC m=+148.568918201" watchObservedRunningTime="2026-03-20 08:25:23.396918494 +0000 UTC m=+148.613818809" Mar 20 08:25:23 crc kubenswrapper[4903]: I0320 08:25:23.397634 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2q9m5\" (UID: \"5778224c-9b34-45c0-9812-122b95cef431\") " pod="openshift-image-registry/image-registry-697d97f7c8-2q9m5" Mar 20 08:25:23 crc kubenswrapper[4903]: E0320 08:25:23.401877 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 08:25:23.901852881 +0000 UTC m=+149.118753396 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2q9m5" (UID: "5778224c-9b34-45c0-9812-122b95cef431") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 08:25:23 crc kubenswrapper[4903]: W0320 08:25:23.430350 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d751cbb_f2e2_430d_9754_c882a5e924a5.slice/crio-d11a69977b0fa1777c2224e486bee6982030c33919607c7980aad7c61c16ef63 WatchSource:0}: Error finding container d11a69977b0fa1777c2224e486bee6982030c33919607c7980aad7c61c16ef63: Status 404 returned error can't find the container with id d11a69977b0fa1777c2224e486bee6982030c33919607c7980aad7c61c16ef63 Mar 20 08:25:23 crc kubenswrapper[4903]: I0320 08:25:23.448261 4903 patch_prober.go:28] interesting pod/router-default-5444994796-cc4f6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 08:25:23 crc kubenswrapper[4903]: [-]has-synced failed: reason withheld Mar 20 08:25:23 crc kubenswrapper[4903]: [+]process-running ok Mar 20 08:25:23 crc kubenswrapper[4903]: healthz check failed Mar 20 08:25:23 crc kubenswrapper[4903]: I0320 08:25:23.448308 4903 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-cc4f6" podUID="0b0221f5-84f8-47b9-bab3-934c1890fb2f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 08:25:23 crc kubenswrapper[4903]: W0320 08:25:23.487734 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fe485a1_e14f_4c09_b5b9_f252bc42b7e8.slice/crio-763c305f41f9ad622cf6350cf7c3d498cfe29dfac6c4c790a9fb8a4385952114 WatchSource:0}: Error finding container 763c305f41f9ad622cf6350cf7c3d498cfe29dfac6c4c790a9fb8a4385952114: Status 404 returned error can't find the container with id 763c305f41f9ad622cf6350cf7c3d498cfe29dfac6c4c790a9fb8a4385952114 Mar 20 08:25:23 crc kubenswrapper[4903]: I0320 08:25:23.498419 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-vc9fr" podStartSLOduration=10.498398905 podStartE2EDuration="10.498398905s" podCreationTimestamp="2026-03-20 08:25:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:25:23.496509822 +0000 UTC m=+148.713410137" watchObservedRunningTime="2026-03-20 08:25:23.498398905 +0000 UTC m=+148.715299240" Mar 20 08:25:23 crc kubenswrapper[4903]: I0320 08:25:23.499447 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 08:25:23 crc kubenswrapper[4903]: E0320 08:25:23.499785 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 08:25:23.999760853 +0000 UTC m=+149.216661178 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 08:25:23 crc kubenswrapper[4903]: I0320 08:25:23.581629 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-f47pf" Mar 20 08:25:23 crc kubenswrapper[4903]: I0320 08:25:23.606697 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2q9m5\" (UID: \"5778224c-9b34-45c0-9812-122b95cef431\") " pod="openshift-image-registry/image-registry-697d97f7c8-2q9m5" Mar 20 08:25:23 crc kubenswrapper[4903]: E0320 08:25:23.607090 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 08:25:24.107076718 +0000 UTC m=+149.323977033 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2q9m5" (UID: "5778224c-9b34-45c0-9812-122b95cef431") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 08:25:23 crc kubenswrapper[4903]: I0320 08:25:23.607563 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-9zwz2" podStartSLOduration=86.60753485 podStartE2EDuration="1m26.60753485s" podCreationTimestamp="2026-03-20 08:23:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:25:23.548711073 +0000 UTC m=+148.765611388" watchObservedRunningTime="2026-03-20 08:25:23.60753485 +0000 UTC m=+148.824435165" Mar 20 08:25:23 crc kubenswrapper[4903]: I0320 08:25:23.707623 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a01da421-9bf1-459f-a419-c7cc271bf472-client-ca\") pod \"a01da421-9bf1-459f-a419-c7cc271bf472\" (UID: \"a01da421-9bf1-459f-a419-c7cc271bf472\") " Mar 20 08:25:23 crc kubenswrapper[4903]: I0320 08:25:23.708113 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a01da421-9bf1-459f-a419-c7cc271bf472-proxy-ca-bundles\") pod \"a01da421-9bf1-459f-a419-c7cc271bf472\" (UID: \"a01da421-9bf1-459f-a419-c7cc271bf472\") " Mar 20 08:25:23 crc kubenswrapper[4903]: I0320 08:25:23.708338 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 08:25:23 crc kubenswrapper[4903]: I0320 08:25:23.708381 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a01da421-9bf1-459f-a419-c7cc271bf472-serving-cert\") pod \"a01da421-9bf1-459f-a419-c7cc271bf472\" (UID: \"a01da421-9bf1-459f-a419-c7cc271bf472\") " Mar 20 08:25:23 crc kubenswrapper[4903]: I0320 08:25:23.708402 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2pm7f\" (UniqueName: \"kubernetes.io/projected/a01da421-9bf1-459f-a419-c7cc271bf472-kube-api-access-2pm7f\") pod \"a01da421-9bf1-459f-a419-c7cc271bf472\" (UID: \"a01da421-9bf1-459f-a419-c7cc271bf472\") " Mar 20 08:25:23 crc kubenswrapper[4903]: I0320 08:25:23.712514 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a01da421-9bf1-459f-a419-c7cc271bf472-config\") pod \"a01da421-9bf1-459f-a419-c7cc271bf472\" (UID: \"a01da421-9bf1-459f-a419-c7cc271bf472\") " Mar 20 08:25:23 crc kubenswrapper[4903]: E0320 08:25:23.715401 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 08:25:24.215335699 +0000 UTC m=+149.432236014 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 08:25:23 crc kubenswrapper[4903]: I0320 08:25:23.718463 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a01da421-9bf1-459f-a419-c7cc271bf472-config" (OuterVolumeSpecName: "config") pod "a01da421-9bf1-459f-a419-c7cc271bf472" (UID: "a01da421-9bf1-459f-a419-c7cc271bf472"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:25:23 crc kubenswrapper[4903]: I0320 08:25:23.718511 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a01da421-9bf1-459f-a419-c7cc271bf472-client-ca" (OuterVolumeSpecName: "client-ca") pod "a01da421-9bf1-459f-a419-c7cc271bf472" (UID: "a01da421-9bf1-459f-a419-c7cc271bf472"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:25:23 crc kubenswrapper[4903]: I0320 08:25:23.725321 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a01da421-9bf1-459f-a419-c7cc271bf472-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "a01da421-9bf1-459f-a419-c7cc271bf472" (UID: "a01da421-9bf1-459f-a419-c7cc271bf472"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:25:23 crc kubenswrapper[4903]: I0320 08:25:23.725902 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a01da421-9bf1-459f-a419-c7cc271bf472-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "a01da421-9bf1-459f-a419-c7cc271bf472" (UID: "a01da421-9bf1-459f-a419-c7cc271bf472"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:25:23 crc kubenswrapper[4903]: I0320 08:25:23.780289 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a01da421-9bf1-459f-a419-c7cc271bf472-kube-api-access-2pm7f" (OuterVolumeSpecName: "kube-api-access-2pm7f") pod "a01da421-9bf1-459f-a419-c7cc271bf472" (UID: "a01da421-9bf1-459f-a419-c7cc271bf472"). InnerVolumeSpecName "kube-api-access-2pm7f". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:25:23 crc kubenswrapper[4903]: I0320 08:25:23.817253 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2q9m5\" (UID: \"5778224c-9b34-45c0-9812-122b95cef431\") " pod="openshift-image-registry/image-registry-697d97f7c8-2q9m5" Mar 20 08:25:23 crc kubenswrapper[4903]: I0320 08:25:23.817364 4903 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a01da421-9bf1-459f-a419-c7cc271bf472-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 08:25:23 crc kubenswrapper[4903]: I0320 08:25:23.817381 4903 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a01da421-9bf1-459f-a419-c7cc271bf472-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 20 08:25:23 crc kubenswrapper[4903]: I0320 08:25:23.817392 4903 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a01da421-9bf1-459f-a419-c7cc271bf472-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 08:25:23 crc kubenswrapper[4903]: I0320 08:25:23.817401 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2pm7f\" (UniqueName: \"kubernetes.io/projected/a01da421-9bf1-459f-a419-c7cc271bf472-kube-api-access-2pm7f\") on node \"crc\" DevicePath \"\"" Mar 20 08:25:23 crc kubenswrapper[4903]: I0320 08:25:23.817410 4903 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a01da421-9bf1-459f-a419-c7cc271bf472-config\") on node \"crc\" DevicePath \"\"" Mar 20 08:25:23 crc kubenswrapper[4903]: E0320 08:25:23.817746 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 08:25:24.317732586 +0000 UTC m=+149.534632901 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2q9m5" (UID: "5778224c-9b34-45c0-9812-122b95cef431") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 08:25:23 crc kubenswrapper[4903]: I0320 08:25:23.918778 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 08:25:23 crc kubenswrapper[4903]: E0320 08:25:23.919914 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 08:25:24.419892337 +0000 UTC m=+149.636792642 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 08:25:24 crc kubenswrapper[4903]: I0320 08:25:24.020956 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2q9m5\" (UID: \"5778224c-9b34-45c0-9812-122b95cef431\") " pod="openshift-image-registry/image-registry-697d97f7c8-2q9m5" Mar 20 08:25:24 crc kubenswrapper[4903]: E0320 08:25:24.021448 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 08:25:24.52143337 +0000 UTC m=+149.738333685 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2q9m5" (UID: "5778224c-9b34-45c0-9812-122b95cef431") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 08:25:24 crc kubenswrapper[4903]: I0320 08:25:24.122308 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 08:25:24 crc kubenswrapper[4903]: E0320 08:25:24.122577 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 08:25:24.622545841 +0000 UTC m=+149.839446156 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 08:25:24 crc kubenswrapper[4903]: I0320 08:25:24.123050 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2q9m5\" (UID: \"5778224c-9b34-45c0-9812-122b95cef431\") " pod="openshift-image-registry/image-registry-697d97f7c8-2q9m5" Mar 20 08:25:24 crc kubenswrapper[4903]: E0320 08:25:24.123433 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 08:25:24.623417286 +0000 UTC m=+149.840317601 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2q9m5" (UID: "5778224c-9b34-45c0-9812-122b95cef431") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 08:25:24 crc kubenswrapper[4903]: I0320 08:25:24.224230 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 08:25:24 crc kubenswrapper[4903]: E0320 08:25:24.224691 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 08:25:24.724656051 +0000 UTC m=+149.941556366 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 08:25:24 crc kubenswrapper[4903]: I0320 08:25:24.285502 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"8c616da1ae1900cb421929031946f4163872e78979de1c2a3a576dcd59d37d30"} Mar 20 08:25:24 crc kubenswrapper[4903]: I0320 08:25:24.285556 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"15ed2385f650c4159c00d192b2452d0190e0c12f64cc8cbd34a006246d08c937"} Mar 20 08:25:24 crc kubenswrapper[4903]: I0320 08:25:24.285806 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 08:25:24 crc kubenswrapper[4903]: I0320 08:25:24.288959 4903 generic.go:334] "Generic (PLEG): container finished" podID="32d4db0f-b5bb-41d4-ae0d-0600e38892b1" containerID="886211f260ddc7606b6b1150cd5eb273e9a18c4f9250e304efeecdb3a9b83794" exitCode=0 Mar 20 08:25:24 crc kubenswrapper[4903]: I0320 08:25:24.289239 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566575-t4mvb" event={"ID":"32d4db0f-b5bb-41d4-ae0d-0600e38892b1","Type":"ContainerDied","Data":"886211f260ddc7606b6b1150cd5eb273e9a18c4f9250e304efeecdb3a9b83794"} Mar 20 08:25:24 crc kubenswrapper[4903]: I0320 08:25:24.291194 4903 generic.go:334] "Generic (PLEG): container finished" podID="a01da421-9bf1-459f-a419-c7cc271bf472" containerID="eb54b02e2ed004f48b9cd02de61ec6fb4644d05cc1e7ea8a63e5e6245023787d" exitCode=0 Mar 20 08:25:24 crc kubenswrapper[4903]: I0320 08:25:24.291252 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-f47pf" Mar 20 08:25:24 crc kubenswrapper[4903]: I0320 08:25:24.291280 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-f47pf" event={"ID":"a01da421-9bf1-459f-a419-c7cc271bf472","Type":"ContainerDied","Data":"eb54b02e2ed004f48b9cd02de61ec6fb4644d05cc1e7ea8a63e5e6245023787d"} Mar 20 08:25:24 crc kubenswrapper[4903]: I0320 08:25:24.291316 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-f47pf" event={"ID":"a01da421-9bf1-459f-a419-c7cc271bf472","Type":"ContainerDied","Data":"0e76442bc0ec992a73f79f3e063551920fe36cfffd71e68382d99133af69f47a"} Mar 20 08:25:24 crc kubenswrapper[4903]: I0320 08:25:24.291337 4903 scope.go:117] "RemoveContainer" containerID="eb54b02e2ed004f48b9cd02de61ec6fb4644d05cc1e7ea8a63e5e6245023787d" Mar 20 08:25:24 crc kubenswrapper[4903]: I0320 08:25:24.298580 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"2616ff1d7244f4c0c6e1f77813b49326713852b6e657978de0c0dfd43ed1eaf0"} Mar 20 08:25:24 crc kubenswrapper[4903]: I0320 08:25:24.298639 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"763c305f41f9ad622cf6350cf7c3d498cfe29dfac6c4c790a9fb8a4385952114"} Mar 20 08:25:24 crc kubenswrapper[4903]: I0320 08:25:24.321441 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-4zl69" event={"ID":"8d670e7f-bbda-4168-87a6-baa6ce35177b","Type":"ContainerStarted","Data":"0c64599a9b5a661ca2de4408f676184ffbbfe1a1cc1366dfc49cd15102b6a0b2"} Mar 20 08:25:24 crc kubenswrapper[4903]: I0320 08:25:24.321491 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-4zl69" event={"ID":"8d670e7f-bbda-4168-87a6-baa6ce35177b","Type":"ContainerStarted","Data":"b209abe9db4ce6d41d4a017574773fc5596d9260fc9be24583538b3eb83e3917"} Mar 20 08:25:24 crc kubenswrapper[4903]: I0320 08:25:24.324840 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"d443bdb399fb6d853b98ade76fb6f4fe9d56c6d493eac267d679917233dd8b9a"} Mar 20 08:25:24 crc kubenswrapper[4903]: I0320 08:25:24.324906 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"d11a69977b0fa1777c2224e486bee6982030c33919607c7980aad7c61c16ef63"} Mar 20 08:25:24 crc kubenswrapper[4903]: I0320 08:25:24.327732 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6bs4d" podUID="a21d7057-af5a-499a-b71f-3cae88b6883e" containerName="route-controller-manager" containerID="cri-o://bcca73e2164d5fb033e0c7e0fd42f509f1de1aa3bffad8d68a6f2d89b3abfab2" gracePeriod=30 Mar 20 08:25:24 crc kubenswrapper[4903]: I0320 08:25:24.327864 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2q9m5\" (UID: \"5778224c-9b34-45c0-9812-122b95cef431\") " pod="openshift-image-registry/image-registry-697d97f7c8-2q9m5" Mar 20 08:25:24 crc kubenswrapper[4903]: E0320 08:25:24.328313 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 08:25:24.828298703 +0000 UTC m=+150.045199018 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2q9m5" (UID: "5778224c-9b34-45c0-9812-122b95cef431") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 08:25:24 crc kubenswrapper[4903]: I0320 08:25:24.339854 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-5pjw9" Mar 20 08:25:24 crc kubenswrapper[4903]: I0320 08:25:24.341632 4903 scope.go:117] "RemoveContainer" containerID="eb54b02e2ed004f48b9cd02de61ec6fb4644d05cc1e7ea8a63e5e6245023787d" Mar 20 08:25:24 crc kubenswrapper[4903]: E0320 08:25:24.341960 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eb54b02e2ed004f48b9cd02de61ec6fb4644d05cc1e7ea8a63e5e6245023787d\": container with ID starting with eb54b02e2ed004f48b9cd02de61ec6fb4644d05cc1e7ea8a63e5e6245023787d not found: ID does not exist" containerID="eb54b02e2ed004f48b9cd02de61ec6fb4644d05cc1e7ea8a63e5e6245023787d" Mar 20 08:25:24 crc kubenswrapper[4903]: I0320 08:25:24.341995 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb54b02e2ed004f48b9cd02de61ec6fb4644d05cc1e7ea8a63e5e6245023787d"} err="failed to get container status \"eb54b02e2ed004f48b9cd02de61ec6fb4644d05cc1e7ea8a63e5e6245023787d\": rpc error: code = NotFound desc = could not find container \"eb54b02e2ed004f48b9cd02de61ec6fb4644d05cc1e7ea8a63e5e6245023787d\": container with ID starting with eb54b02e2ed004f48b9cd02de61ec6fb4644d05cc1e7ea8a63e5e6245023787d not found: ID does not exist" Mar 20 08:25:24 crc kubenswrapper[4903]: I0320 08:25:24.354133 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-zbxgq" Mar 20 08:25:24 crc kubenswrapper[4903]: I0320 08:25:24.361563 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6d874b96b-vslql"] Mar 20 08:25:24 crc kubenswrapper[4903]: E0320 08:25:24.361861 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a01da421-9bf1-459f-a419-c7cc271bf472" containerName="controller-manager" Mar 20 08:25:24 crc kubenswrapper[4903]: I0320 08:25:24.361880 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="a01da421-9bf1-459f-a419-c7cc271bf472" containerName="controller-manager" Mar 20 08:25:24 crc kubenswrapper[4903]: I0320 08:25:24.362052 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="a01da421-9bf1-459f-a419-c7cc271bf472" 
containerName="controller-manager" Mar 20 08:25:24 crc kubenswrapper[4903]: I0320 08:25:24.362595 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6d874b96b-vslql" Mar 20 08:25:24 crc kubenswrapper[4903]: I0320 08:25:24.365528 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 20 08:25:24 crc kubenswrapper[4903]: I0320 08:25:24.365799 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 20 08:25:24 crc kubenswrapper[4903]: I0320 08:25:24.365964 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 20 08:25:24 crc kubenswrapper[4903]: I0320 08:25:24.368690 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 20 08:25:24 crc kubenswrapper[4903]: I0320 08:25:24.371880 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 20 08:25:24 crc kubenswrapper[4903]: I0320 08:25:24.372014 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 20 08:25:24 crc kubenswrapper[4903]: I0320 08:25:24.379641 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6d874b96b-vslql"] Mar 20 08:25:24 crc kubenswrapper[4903]: I0320 08:25:24.406368 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 20 08:25:24 crc kubenswrapper[4903]: I0320 08:25:24.420024 4903 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Mar 20 08:25:24 crc kubenswrapper[4903]: I0320 08:25:24.431715 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 08:25:24 crc kubenswrapper[4903]: I0320 08:25:24.431940 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/578afe8a-2374-4893-bd24-5048cd759a3a-client-ca\") pod \"controller-manager-6d874b96b-vslql\" (UID: \"578afe8a-2374-4893-bd24-5048cd759a3a\") " pod="openshift-controller-manager/controller-manager-6d874b96b-vslql" Mar 20 08:25:24 crc kubenswrapper[4903]: I0320 08:25:24.432020 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/578afe8a-2374-4893-bd24-5048cd759a3a-config\") pod \"controller-manager-6d874b96b-vslql\" (UID: \"578afe8a-2374-4893-bd24-5048cd759a3a\") " pod="openshift-controller-manager/controller-manager-6d874b96b-vslql" Mar 20 08:25:24 crc kubenswrapper[4903]: I0320 08:25:24.432380 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/578afe8a-2374-4893-bd24-5048cd759a3a-serving-cert\") pod \"controller-manager-6d874b96b-vslql\" (UID: 
\"578afe8a-2374-4893-bd24-5048cd759a3a\") " pod="openshift-controller-manager/controller-manager-6d874b96b-vslql" Mar 20 08:25:24 crc kubenswrapper[4903]: I0320 08:25:24.432561 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/578afe8a-2374-4893-bd24-5048cd759a3a-proxy-ca-bundles\") pod \"controller-manager-6d874b96b-vslql\" (UID: \"578afe8a-2374-4893-bd24-5048cd759a3a\") " pod="openshift-controller-manager/controller-manager-6d874b96b-vslql" Mar 20 08:25:24 crc kubenswrapper[4903]: I0320 08:25:24.432619 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9j9lj\" (UniqueName: \"kubernetes.io/projected/578afe8a-2374-4893-bd24-5048cd759a3a-kube-api-access-9j9lj\") pod \"controller-manager-6d874b96b-vslql\" (UID: \"578afe8a-2374-4893-bd24-5048cd759a3a\") " pod="openshift-controller-manager/controller-manager-6d874b96b-vslql" Mar 20 08:25:24 crc kubenswrapper[4903]: E0320 08:25:24.432743 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 08:25:24.932720967 +0000 UTC m=+150.149621282 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 08:25:24 crc kubenswrapper[4903]: I0320 08:25:24.443476 4903 patch_prober.go:28] interesting pod/router-default-5444994796-cc4f6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 08:25:24 crc kubenswrapper[4903]: [-]has-synced failed: reason withheld Mar 20 08:25:24 crc kubenswrapper[4903]: [+]process-running ok Mar 20 08:25:24 crc kubenswrapper[4903]: healthz check failed Mar 20 08:25:24 crc kubenswrapper[4903]: I0320 08:25:24.443959 4903 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-cc4f6" podUID="0b0221f5-84f8-47b9-bab3-934c1890fb2f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 08:25:24 crc kubenswrapper[4903]: I0320 08:25:24.534926 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/578afe8a-2374-4893-bd24-5048cd759a3a-proxy-ca-bundles\") pod \"controller-manager-6d874b96b-vslql\" (UID: \"578afe8a-2374-4893-bd24-5048cd759a3a\") " pod="openshift-controller-manager/controller-manager-6d874b96b-vslql" Mar 20 08:25:24 crc kubenswrapper[4903]: I0320 08:25:24.534965 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9j9lj\" (UniqueName: \"kubernetes.io/projected/578afe8a-2374-4893-bd24-5048cd759a3a-kube-api-access-9j9lj\") pod \"controller-manager-6d874b96b-vslql\" (UID: \"578afe8a-2374-4893-bd24-5048cd759a3a\") " pod="openshift-controller-manager/controller-manager-6d874b96b-vslql" Mar 20 
08:25:24 crc kubenswrapper[4903]: I0320 08:25:24.534992 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/578afe8a-2374-4893-bd24-5048cd759a3a-client-ca\") pod \"controller-manager-6d874b96b-vslql\" (UID: \"578afe8a-2374-4893-bd24-5048cd759a3a\") " pod="openshift-controller-manager/controller-manager-6d874b96b-vslql" Mar 20 08:25:24 crc kubenswrapper[4903]: I0320 08:25:24.535012 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/578afe8a-2374-4893-bd24-5048cd759a3a-config\") pod \"controller-manager-6d874b96b-vslql\" (UID: \"578afe8a-2374-4893-bd24-5048cd759a3a\") " pod="openshift-controller-manager/controller-manager-6d874b96b-vslql" Mar 20 08:25:24 crc kubenswrapper[4903]: I0320 08:25:24.535148 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2q9m5\" (UID: \"5778224c-9b34-45c0-9812-122b95cef431\") " pod="openshift-image-registry/image-registry-697d97f7c8-2q9m5" Mar 20 08:25:24 crc kubenswrapper[4903]: I0320 08:25:24.535170 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/578afe8a-2374-4893-bd24-5048cd759a3a-serving-cert\") pod \"controller-manager-6d874b96b-vslql\" (UID: \"578afe8a-2374-4893-bd24-5048cd759a3a\") " pod="openshift-controller-manager/controller-manager-6d874b96b-vslql" Mar 20 08:25:24 crc kubenswrapper[4903]: I0320 08:25:24.536390 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/578afe8a-2374-4893-bd24-5048cd759a3a-client-ca\") pod \"controller-manager-6d874b96b-vslql\" (UID: \"578afe8a-2374-4893-bd24-5048cd759a3a\") " pod="openshift-controller-manager/controller-manager-6d874b96b-vslql" Mar 20 08:25:24 crc kubenswrapper[4903]: I0320 08:25:24.537450 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/578afe8a-2374-4893-bd24-5048cd759a3a-config\") pod \"controller-manager-6d874b96b-vslql\" (UID: \"578afe8a-2374-4893-bd24-5048cd759a3a\") " pod="openshift-controller-manager/controller-manager-6d874b96b-vslql" Mar 20 08:25:24 crc kubenswrapper[4903]: E0320 08:25:24.537761 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 08:25:25.037744448 +0000 UTC m=+150.254644753 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-2q9m5" (UID: "5778224c-9b34-45c0-9812-122b95cef431") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 08:25:24 crc kubenswrapper[4903]: I0320 08:25:24.545389 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/578afe8a-2374-4893-bd24-5048cd759a3a-proxy-ca-bundles\") pod \"controller-manager-6d874b96b-vslql\" (UID: \"578afe8a-2374-4893-bd24-5048cd759a3a\") " pod="openshift-controller-manager/controller-manager-6d874b96b-vslql" Mar 20 08:25:24 crc kubenswrapper[4903]: I0320 08:25:24.564260 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/578afe8a-2374-4893-bd24-5048cd759a3a-serving-cert\") pod \"controller-manager-6d874b96b-vslql\" (UID: \"578afe8a-2374-4893-bd24-5048cd759a3a\") " pod="openshift-controller-manager/controller-manager-6d874b96b-vslql" Mar 20 08:25:24 crc kubenswrapper[4903]: I0320 08:25:24.572711 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-f47pf"] Mar 20 08:25:24 crc kubenswrapper[4903]: I0320 08:25:24.573101 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9j9lj\" (UniqueName: \"kubernetes.io/projected/578afe8a-2374-4893-bd24-5048cd759a3a-kube-api-access-9j9lj\") pod \"controller-manager-6d874b96b-vslql\" (UID: \"578afe8a-2374-4893-bd24-5048cd759a3a\") " pod="openshift-controller-manager/controller-manager-6d874b96b-vslql" Mar 20 08:25:24 crc kubenswrapper[4903]: I0320 08:25:24.596724 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-f47pf"] Mar 20 08:25:24 crc kubenswrapper[4903]: I0320 08:25:24.636733 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 08:25:24 crc kubenswrapper[4903]: E0320 08:25:24.637169 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 08:25:25.137148961 +0000 UTC m=+150.354049276 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 08:25:24 crc kubenswrapper[4903]: I0320 08:25:24.679271 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-nwkzx"] Mar 20 08:25:24 crc kubenswrapper[4903]: I0320 08:25:24.681128 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-nwkzx" Mar 20 08:25:24 crc kubenswrapper[4903]: I0320 08:25:24.685938 4903 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-03-20T08:25:24.420062913Z","Handler":null,"Name":""} Mar 20 08:25:24 crc kubenswrapper[4903]: I0320 08:25:24.687797 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6d874b96b-vslql" Mar 20 08:25:24 crc kubenswrapper[4903]: I0320 08:25:24.702860 4903 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Mar 20 08:25:24 crc kubenswrapper[4903]: I0320 08:25:24.702922 4903 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Mar 20 08:25:24 crc kubenswrapper[4903]: I0320 08:25:24.703188 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 20 08:25:24 crc kubenswrapper[4903]: I0320 08:25:24.721191 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-nwkzx"] Mar 20 08:25:24 crc kubenswrapper[4903]: I0320 08:25:24.739716 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e74b75cf-cad8-4b67-91b7-3926096e09f8-catalog-content\") pod \"certified-operators-nwkzx\" (UID: \"e74b75cf-cad8-4b67-91b7-3926096e09f8\") " pod="openshift-marketplace/certified-operators-nwkzx" Mar 20 08:25:24 crc kubenswrapper[4903]: I0320 08:25:24.739924 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g98zj\" (UniqueName: \"kubernetes.io/projected/e74b75cf-cad8-4b67-91b7-3926096e09f8-kube-api-access-g98zj\") pod \"certified-operators-nwkzx\" (UID: \"e74b75cf-cad8-4b67-91b7-3926096e09f8\") " pod="openshift-marketplace/certified-operators-nwkzx" Mar 20 08:25:24 crc kubenswrapper[4903]: I0320 08:25:24.740058 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e74b75cf-cad8-4b67-91b7-3926096e09f8-utilities\") pod \"certified-operators-nwkzx\" (UID: \"e74b75cf-cad8-4b67-91b7-3926096e09f8\") " pod="openshift-marketplace/certified-operators-nwkzx" Mar 20 08:25:24 crc kubenswrapper[4903]: I0320 08:25:24.740157 4903 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2q9m5\" (UID: \"5778224c-9b34-45c0-9812-122b95cef431\") " pod="openshift-image-registry/image-registry-697d97f7c8-2q9m5" Mar 20 08:25:24 crc kubenswrapper[4903]: I0320 08:25:24.751542 4903 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 20 08:25:24 crc kubenswrapper[4903]: I0320 08:25:24.753260 4903 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2q9m5\" (UID: \"5778224c-9b34-45c0-9812-122b95cef431\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-2q9m5" Mar 20 08:25:24 crc kubenswrapper[4903]: I0320 08:25:24.763503 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 20 08:25:24 crc kubenswrapper[4903]: I0320 08:25:24.765843 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 20 08:25:24 crc kubenswrapper[4903]: I0320 08:25:24.773741 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Mar 20 08:25:24 crc kubenswrapper[4903]: I0320 08:25:24.774223 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Mar 20 08:25:24 crc kubenswrapper[4903]: I0320 08:25:24.774682 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 20 08:25:24 crc kubenswrapper[4903]: I0320 08:25:24.823197 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-2q9m5\" (UID: \"5778224c-9b34-45c0-9812-122b95cef431\") " pod="openshift-image-registry/image-registry-697d97f7c8-2q9m5" Mar 20 08:25:24 crc kubenswrapper[4903]: I0320 08:25:24.842770 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 08:25:24 crc kubenswrapper[4903]: I0320 08:25:24.843122 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e74b75cf-cad8-4b67-91b7-3926096e09f8-catalog-content\") pod \"certified-operators-nwkzx\" (UID: \"e74b75cf-cad8-4b67-91b7-3926096e09f8\") " pod="openshift-marketplace/certified-operators-nwkzx" Mar 20 08:25:24 crc kubenswrapper[4903]: I0320 08:25:24.843163 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g98zj\" (UniqueName: 
\"kubernetes.io/projected/e74b75cf-cad8-4b67-91b7-3926096e09f8-kube-api-access-g98zj\") pod \"certified-operators-nwkzx\" (UID: \"e74b75cf-cad8-4b67-91b7-3926096e09f8\") " pod="openshift-marketplace/certified-operators-nwkzx" Mar 20 08:25:24 crc kubenswrapper[4903]: I0320 08:25:24.843186 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fa79c2ee-4638-4400-8644-3b28f91e8497-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"fa79c2ee-4638-4400-8644-3b28f91e8497\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 20 08:25:24 crc kubenswrapper[4903]: I0320 08:25:24.843209 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e74b75cf-cad8-4b67-91b7-3926096e09f8-utilities\") pod \"certified-operators-nwkzx\" (UID: \"e74b75cf-cad8-4b67-91b7-3926096e09f8\") " pod="openshift-marketplace/certified-operators-nwkzx" Mar 20 08:25:24 crc kubenswrapper[4903]: I0320 08:25:24.843233 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/fa79c2ee-4638-4400-8644-3b28f91e8497-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"fa79c2ee-4638-4400-8644-3b28f91e8497\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 20 08:25:24 crc kubenswrapper[4903]: I0320 08:25:24.843996 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e74b75cf-cad8-4b67-91b7-3926096e09f8-catalog-content\") pod \"certified-operators-nwkzx\" (UID: \"e74b75cf-cad8-4b67-91b7-3926096e09f8\") " pod="openshift-marketplace/certified-operators-nwkzx" Mar 20 08:25:24 crc kubenswrapper[4903]: I0320 08:25:24.844574 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e74b75cf-cad8-4b67-91b7-3926096e09f8-utilities\") pod \"certified-operators-nwkzx\" (UID: \"e74b75cf-cad8-4b67-91b7-3926096e09f8\") " pod="openshift-marketplace/certified-operators-nwkzx" Mar 20 08:25:24 crc kubenswrapper[4903]: I0320 08:25:24.862819 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-fzcbm"] Mar 20 08:25:24 crc kubenswrapper[4903]: I0320 08:25:24.864026 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fzcbm" Mar 20 08:25:24 crc kubenswrapper[4903]: I0320 08:25:24.872645 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g98zj\" (UniqueName: \"kubernetes.io/projected/e74b75cf-cad8-4b67-91b7-3926096e09f8-kube-api-access-g98zj\") pod \"certified-operators-nwkzx\" (UID: \"e74b75cf-cad8-4b67-91b7-3926096e09f8\") " pod="openshift-marketplace/certified-operators-nwkzx" Mar 20 08:25:24 crc kubenswrapper[4903]: I0320 08:25:24.877110 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 20 08:25:24 crc kubenswrapper[4903]: I0320 08:25:24.877457 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). 
InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 20 08:25:24 crc kubenswrapper[4903]: I0320 08:25:24.877923 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fzcbm"] Mar 20 08:25:24 crc kubenswrapper[4903]: I0320 08:25:24.945097 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6svbl\" (UniqueName: \"kubernetes.io/projected/738071fe-1a7b-403b-ab94-8e88d5d79ab4-kube-api-access-6svbl\") pod \"community-operators-fzcbm\" (UID: \"738071fe-1a7b-403b-ab94-8e88d5d79ab4\") " pod="openshift-marketplace/community-operators-fzcbm" Mar 20 08:25:24 crc kubenswrapper[4903]: I0320 08:25:24.945570 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/738071fe-1a7b-403b-ab94-8e88d5d79ab4-catalog-content\") pod \"community-operators-fzcbm\" (UID: \"738071fe-1a7b-403b-ab94-8e88d5d79ab4\") " pod="openshift-marketplace/community-operators-fzcbm" Mar 20 08:25:24 crc kubenswrapper[4903]: I0320 08:25:24.945649 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/738071fe-1a7b-403b-ab94-8e88d5d79ab4-utilities\") pod \"community-operators-fzcbm\" (UID: \"738071fe-1a7b-403b-ab94-8e88d5d79ab4\") " pod="openshift-marketplace/community-operators-fzcbm" Mar 20 08:25:24 crc kubenswrapper[4903]: I0320 08:25:24.945685 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fa79c2ee-4638-4400-8644-3b28f91e8497-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"fa79c2ee-4638-4400-8644-3b28f91e8497\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 20 08:25:24 crc kubenswrapper[4903]: I0320 08:25:24.945706 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/fa79c2ee-4638-4400-8644-3b28f91e8497-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"fa79c2ee-4638-4400-8644-3b28f91e8497\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 20 08:25:24 crc kubenswrapper[4903]: I0320 08:25:24.945780 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/fa79c2ee-4638-4400-8644-3b28f91e8497-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"fa79c2ee-4638-4400-8644-3b28f91e8497\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 20 08:25:24 crc kubenswrapper[4903]: I0320 08:25:24.963623 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fa79c2ee-4638-4400-8644-3b28f91e8497-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"fa79c2ee-4638-4400-8644-3b28f91e8497\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 20 08:25:25 crc kubenswrapper[4903]: I0320 08:25:25.031959 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-nwkzx" Mar 20 08:25:25 crc kubenswrapper[4903]: I0320 08:25:25.055011 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6d874b96b-vslql"] Mar 20 08:25:25 crc kubenswrapper[4903]: I0320 08:25:25.057566 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/738071fe-1a7b-403b-ab94-8e88d5d79ab4-utilities\") pod \"community-operators-fzcbm\" (UID: \"738071fe-1a7b-403b-ab94-8e88d5d79ab4\") " pod="openshift-marketplace/community-operators-fzcbm" Mar 20 08:25:25 crc kubenswrapper[4903]: I0320 08:25:25.057643 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6svbl\" (UniqueName: \"kubernetes.io/projected/738071fe-1a7b-403b-ab94-8e88d5d79ab4-kube-api-access-6svbl\") pod \"community-operators-fzcbm\" (UID: \"738071fe-1a7b-403b-ab94-8e88d5d79ab4\") " pod="openshift-marketplace/community-operators-fzcbm" Mar 20 08:25:25 crc kubenswrapper[4903]: I0320 08:25:25.057677 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/738071fe-1a7b-403b-ab94-8e88d5d79ab4-catalog-content\") pod \"community-operators-fzcbm\" (UID: \"738071fe-1a7b-403b-ab94-8e88d5d79ab4\") " pod="openshift-marketplace/community-operators-fzcbm" Mar 20 08:25:25 crc kubenswrapper[4903]: I0320 08:25:25.058273 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-bmcws"] Mar 20 08:25:25 crc kubenswrapper[4903]: I0320 08:25:25.060240 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/738071fe-1a7b-403b-ab94-8e88d5d79ab4-utilities\") pod \"community-operators-fzcbm\" (UID: \"738071fe-1a7b-403b-ab94-8e88d5d79ab4\") " pod="openshift-marketplace/community-operators-fzcbm" Mar 20 08:25:25 crc kubenswrapper[4903]: I0320 08:25:25.060634 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bmcws" Mar 20 08:25:25 crc kubenswrapper[4903]: I0320 08:25:25.062331 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/738071fe-1a7b-403b-ab94-8e88d5d79ab4-catalog-content\") pod \"community-operators-fzcbm\" (UID: \"738071fe-1a7b-403b-ab94-8e88d5d79ab4\") " pod="openshift-marketplace/community-operators-fzcbm" Mar 20 08:25:25 crc kubenswrapper[4903]: I0320 08:25:25.074246 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bmcws"] Mar 20 08:25:25 crc kubenswrapper[4903]: I0320 08:25:25.074624 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-2q9m5" Mar 20 08:25:25 crc kubenswrapper[4903]: I0320 08:25:25.092638 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6svbl\" (UniqueName: \"kubernetes.io/projected/738071fe-1a7b-403b-ab94-8e88d5d79ab4-kube-api-access-6svbl\") pod \"community-operators-fzcbm\" (UID: \"738071fe-1a7b-403b-ab94-8e88d5d79ab4\") " pod="openshift-marketplace/community-operators-fzcbm" Mar 20 08:25:25 crc kubenswrapper[4903]: I0320 08:25:25.110519 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6bs4d" Mar 20 08:25:25 crc kubenswrapper[4903]: I0320 08:25:25.111051 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 20 08:25:25 crc kubenswrapper[4903]: I0320 08:25:25.161323 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zsb9h\" (UniqueName: \"kubernetes.io/projected/d209b7b2-53ad-4780-a13e-65d2b0cb5189-kube-api-access-zsb9h\") pod \"certified-operators-bmcws\" (UID: \"d209b7b2-53ad-4780-a13e-65d2b0cb5189\") " pod="openshift-marketplace/certified-operators-bmcws" Mar 20 08:25:25 crc kubenswrapper[4903]: I0320 08:25:25.161799 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d209b7b2-53ad-4780-a13e-65d2b0cb5189-utilities\") pod \"certified-operators-bmcws\" (UID: \"d209b7b2-53ad-4780-a13e-65d2b0cb5189\") " pod="openshift-marketplace/certified-operators-bmcws" Mar 20 08:25:25 crc kubenswrapper[4903]: I0320 08:25:25.161863 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d209b7b2-53ad-4780-a13e-65d2b0cb5189-catalog-content\") pod \"certified-operators-bmcws\" (UID: \"d209b7b2-53ad-4780-a13e-65d2b0cb5189\") " pod="openshift-marketplace/certified-operators-bmcws" Mar 20 08:25:25 crc kubenswrapper[4903]: I0320 08:25:25.225238 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fzcbm" Mar 20 08:25:25 crc kubenswrapper[4903]: I0320 08:25:25.257076 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-l45gc"] Mar 20 08:25:25 crc kubenswrapper[4903]: E0320 08:25:25.257338 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a21d7057-af5a-499a-b71f-3cae88b6883e" containerName="route-controller-manager" Mar 20 08:25:25 crc kubenswrapper[4903]: I0320 08:25:25.257353 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="a21d7057-af5a-499a-b71f-3cae88b6883e" containerName="route-controller-manager" Mar 20 08:25:25 crc kubenswrapper[4903]: I0320 08:25:25.257439 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="a21d7057-af5a-499a-b71f-3cae88b6883e" containerName="route-controller-manager" Mar 20 08:25:25 crc kubenswrapper[4903]: I0320 08:25:25.259693 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-l45gc" Mar 20 08:25:25 crc kubenswrapper[4903]: I0320 08:25:25.263718 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a21d7057-af5a-499a-b71f-3cae88b6883e-serving-cert\") pod \"a21d7057-af5a-499a-b71f-3cae88b6883e\" (UID: \"a21d7057-af5a-499a-b71f-3cae88b6883e\") " Mar 20 08:25:25 crc kubenswrapper[4903]: I0320 08:25:25.263802 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fsrjz\" (UniqueName: \"kubernetes.io/projected/a21d7057-af5a-499a-b71f-3cae88b6883e-kube-api-access-fsrjz\") pod \"a21d7057-af5a-499a-b71f-3cae88b6883e\" (UID: \"a21d7057-af5a-499a-b71f-3cae88b6883e\") " Mar 20 08:25:25 crc kubenswrapper[4903]: I0320 08:25:25.263865 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a21d7057-af5a-499a-b71f-3cae88b6883e-client-ca\") pod \"a21d7057-af5a-499a-b71f-3cae88b6883e\" (UID: \"a21d7057-af5a-499a-b71f-3cae88b6883e\") " Mar 20 08:25:25 crc kubenswrapper[4903]: I0320 08:25:25.263905 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a21d7057-af5a-499a-b71f-3cae88b6883e-config\") pod \"a21d7057-af5a-499a-b71f-3cae88b6883e\" (UID: \"a21d7057-af5a-499a-b71f-3cae88b6883e\") " Mar 20 08:25:25 crc kubenswrapper[4903]: I0320 08:25:25.264321 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zsb9h\" (UniqueName: \"kubernetes.io/projected/d209b7b2-53ad-4780-a13e-65d2b0cb5189-kube-api-access-zsb9h\") pod \"certified-operators-bmcws\" (UID: \"d209b7b2-53ad-4780-a13e-65d2b0cb5189\") " pod="openshift-marketplace/certified-operators-bmcws" Mar 20 08:25:25 crc kubenswrapper[4903]: I0320 08:25:25.264385 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d209b7b2-53ad-4780-a13e-65d2b0cb5189-utilities\") pod \"certified-operators-bmcws\" (UID: \"d209b7b2-53ad-4780-a13e-65d2b0cb5189\") " pod="openshift-marketplace/certified-operators-bmcws" Mar 20 08:25:25 crc kubenswrapper[4903]: I0320 08:25:25.264440 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d209b7b2-53ad-4780-a13e-65d2b0cb5189-catalog-content\") pod \"certified-operators-bmcws\" (UID: \"d209b7b2-53ad-4780-a13e-65d2b0cb5189\") " pod="openshift-marketplace/certified-operators-bmcws" Mar 20 08:25:25 crc kubenswrapper[4903]: I0320 08:25:25.265099 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d209b7b2-53ad-4780-a13e-65d2b0cb5189-catalog-content\") pod \"certified-operators-bmcws\" (UID: \"d209b7b2-53ad-4780-a13e-65d2b0cb5189\") " pod="openshift-marketplace/certified-operators-bmcws" Mar 20 08:25:25 crc kubenswrapper[4903]: I0320 08:25:25.265886 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d209b7b2-53ad-4780-a13e-65d2b0cb5189-utilities\") pod \"certified-operators-bmcws\" (UID: \"d209b7b2-53ad-4780-a13e-65d2b0cb5189\") " pod="openshift-marketplace/certified-operators-bmcws" Mar 20 08:25:25 crc kubenswrapper[4903]: I0320 08:25:25.267540 4903 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/configmap/a21d7057-af5a-499a-b71f-3cae88b6883e-config" (OuterVolumeSpecName: "config") pod "a21d7057-af5a-499a-b71f-3cae88b6883e" (UID: "a21d7057-af5a-499a-b71f-3cae88b6883e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:25:25 crc kubenswrapper[4903]: I0320 08:25:25.267965 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a21d7057-af5a-499a-b71f-3cae88b6883e-client-ca" (OuterVolumeSpecName: "client-ca") pod "a21d7057-af5a-499a-b71f-3cae88b6883e" (UID: "a21d7057-af5a-499a-b71f-3cae88b6883e"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:25:25 crc kubenswrapper[4903]: I0320 08:25:25.277602 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-l45gc"] Mar 20 08:25:25 crc kubenswrapper[4903]: I0320 08:25:25.279170 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a21d7057-af5a-499a-b71f-3cae88b6883e-kube-api-access-fsrjz" (OuterVolumeSpecName: "kube-api-access-fsrjz") pod "a21d7057-af5a-499a-b71f-3cae88b6883e" (UID: "a21d7057-af5a-499a-b71f-3cae88b6883e"). InnerVolumeSpecName "kube-api-access-fsrjz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:25:25 crc kubenswrapper[4903]: I0320 08:25:25.280681 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a21d7057-af5a-499a-b71f-3cae88b6883e-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "a21d7057-af5a-499a-b71f-3cae88b6883e" (UID: "a21d7057-af5a-499a-b71f-3cae88b6883e"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:25:25 crc kubenswrapper[4903]: I0320 08:25:25.299872 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zsb9h\" (UniqueName: \"kubernetes.io/projected/d209b7b2-53ad-4780-a13e-65d2b0cb5189-kube-api-access-zsb9h\") pod \"certified-operators-bmcws\" (UID: \"d209b7b2-53ad-4780-a13e-65d2b0cb5189\") " pod="openshift-marketplace/certified-operators-bmcws" Mar 20 08:25:25 crc kubenswrapper[4903]: I0320 08:25:25.344497 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-4zl69" event={"ID":"8d670e7f-bbda-4168-87a6-baa6ce35177b","Type":"ContainerStarted","Data":"850c6fc8ea1854c946df6a4ac007dcd915d3e10ad2b915f269c060ea7a62546f"} Mar 20 08:25:25 crc kubenswrapper[4903]: I0320 08:25:25.370336 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8sdsc\" (UniqueName: \"kubernetes.io/projected/efb0ecbf-eb11-4834-8e12-668b3b9f64c8-kube-api-access-8sdsc\") pod \"community-operators-l45gc\" (UID: \"efb0ecbf-eb11-4834-8e12-668b3b9f64c8\") " pod="openshift-marketplace/community-operators-l45gc" Mar 20 08:25:25 crc kubenswrapper[4903]: I0320 08:25:25.370955 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/efb0ecbf-eb11-4834-8e12-668b3b9f64c8-utilities\") pod \"community-operators-l45gc\" (UID: \"efb0ecbf-eb11-4834-8e12-668b3b9f64c8\") " pod="openshift-marketplace/community-operators-l45gc" Mar 20 08:25:25 crc kubenswrapper[4903]: I0320 08:25:25.371006 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/efb0ecbf-eb11-4834-8e12-668b3b9f64c8-catalog-content\") pod \"community-operators-l45gc\" (UID: \"efb0ecbf-eb11-4834-8e12-668b3b9f64c8\") " pod="openshift-marketplace/community-operators-l45gc" Mar 20 08:25:25 crc kubenswrapper[4903]: I0320 08:25:25.371116 4903 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a21d7057-af5a-499a-b71f-3cae88b6883e-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 08:25:25 crc kubenswrapper[4903]: I0320 08:25:25.371134 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fsrjz\" (UniqueName: \"kubernetes.io/projected/a21d7057-af5a-499a-b71f-3cae88b6883e-kube-api-access-fsrjz\") on node \"crc\" DevicePath \"\"" Mar 20 08:25:25 crc kubenswrapper[4903]: I0320 08:25:25.371147 4903 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a21d7057-af5a-499a-b71f-3cae88b6883e-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 08:25:25 crc kubenswrapper[4903]: I0320 08:25:25.371158 4903 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a21d7057-af5a-499a-b71f-3cae88b6883e-config\") on node \"crc\" DevicePath \"\"" Mar 20 08:25:25 crc kubenswrapper[4903]: I0320 08:25:25.378260 4903 generic.go:334] "Generic (PLEG): container finished" podID="a21d7057-af5a-499a-b71f-3cae88b6883e" containerID="bcca73e2164d5fb033e0c7e0fd42f509f1de1aa3bffad8d68a6f2d89b3abfab2" exitCode=0 Mar 20 08:25:25 crc kubenswrapper[4903]: I0320 08:25:25.378392 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6bs4d" event={"ID":"a21d7057-af5a-499a-b71f-3cae88b6883e","Type":"ContainerDied","Data":"bcca73e2164d5fb033e0c7e0fd42f509f1de1aa3bffad8d68a6f2d89b3abfab2"} Mar 20 08:25:25 crc kubenswrapper[4903]: I0320 08:25:25.378429 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6bs4d" event={"ID":"a21d7057-af5a-499a-b71f-3cae88b6883e","Type":"ContainerDied","Data":"f3359995e67a9748b6ec381b04f20f3ab3d6e184d8a99cd1d169a4187f66bd79"} Mar 20 08:25:25 crc kubenswrapper[4903]: I0320 08:25:25.378448 4903 scope.go:117] "RemoveContainer" containerID="bcca73e2164d5fb033e0c7e0fd42f509f1de1aa3bffad8d68a6f2d89b3abfab2" Mar 20 08:25:25 crc kubenswrapper[4903]: I0320 08:25:25.378500 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6bs4d" Mar 20 08:25:25 crc kubenswrapper[4903]: I0320 08:25:25.399999 4903 scope.go:117] "RemoveContainer" containerID="bcca73e2164d5fb033e0c7e0fd42f509f1de1aa3bffad8d68a6f2d89b3abfab2" Mar 20 08:25:25 crc kubenswrapper[4903]: I0320 08:25:25.400147 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6d874b96b-vslql" event={"ID":"578afe8a-2374-4893-bd24-5048cd759a3a","Type":"ContainerStarted","Data":"d1c598900da562b70b12508fbcfb161d0c9061ed5f14ce8fa9556d0d8c6126eb"} Mar 20 08:25:25 crc kubenswrapper[4903]: I0320 08:25:25.401447 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-4zl69" podStartSLOduration=12.401417792 podStartE2EDuration="12.401417792s" podCreationTimestamp="2026-03-20 08:25:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:25:25.378642443 +0000 UTC m=+150.595542758" watchObservedRunningTime="2026-03-20 08:25:25.401417792 +0000 UTC m=+150.618318107" Mar 20 08:25:25 crc kubenswrapper[4903]: E0320 08:25:25.403871 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bcca73e2164d5fb033e0c7e0fd42f509f1de1aa3bffad8d68a6f2d89b3abfab2\": container with ID starting with bcca73e2164d5fb033e0c7e0fd42f509f1de1aa3bffad8d68a6f2d89b3abfab2 not found: ID does not exist" containerID="bcca73e2164d5fb033e0c7e0fd42f509f1de1aa3bffad8d68a6f2d89b3abfab2" Mar 20 08:25:25 crc kubenswrapper[4903]: I0320 08:25:25.403989 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bcca73e2164d5fb033e0c7e0fd42f509f1de1aa3bffad8d68a6f2d89b3abfab2"} err="failed to get container status \"bcca73e2164d5fb033e0c7e0fd42f509f1de1aa3bffad8d68a6f2d89b3abfab2\": rpc error: code = NotFound desc = could not find container \"bcca73e2164d5fb033e0c7e0fd42f509f1de1aa3bffad8d68a6f2d89b3abfab2\": container with ID starting with bcca73e2164d5fb033e0c7e0fd42f509f1de1aa3bffad8d68a6f2d89b3abfab2 not found: ID does not exist" Mar 20 08:25:25 crc kubenswrapper[4903]: I0320 08:25:25.407672 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-2q9m5"] Mar 20 08:25:25 crc kubenswrapper[4903]: I0320 08:25:25.422893 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-6bs4d"] Mar 20 08:25:25 crc kubenswrapper[4903]: I0320 08:25:25.425614 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-6bs4d"] Mar 20 08:25:25 crc kubenswrapper[4903]: I0320 08:25:25.443968 4903 patch_prober.go:28] interesting pod/router-default-5444994796-cc4f6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 08:25:25 crc kubenswrapper[4903]: [-]has-synced failed: reason withheld Mar 20 08:25:25 crc kubenswrapper[4903]: [+]process-running ok Mar 20 08:25:25 crc kubenswrapper[4903]: healthz check failed Mar 20 08:25:25 crc kubenswrapper[4903]: I0320 08:25:25.444163 4903 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-cc4f6" 
podUID="0b0221f5-84f8-47b9-bab3-934c1890fb2f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 08:25:25 crc kubenswrapper[4903]: I0320 08:25:25.468638 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bmcws" Mar 20 08:25:25 crc kubenswrapper[4903]: I0320 08:25:25.476782 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8sdsc\" (UniqueName: \"kubernetes.io/projected/efb0ecbf-eb11-4834-8e12-668b3b9f64c8-kube-api-access-8sdsc\") pod \"community-operators-l45gc\" (UID: \"efb0ecbf-eb11-4834-8e12-668b3b9f64c8\") " pod="openshift-marketplace/community-operators-l45gc" Mar 20 08:25:25 crc kubenswrapper[4903]: I0320 08:25:25.477122 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/efb0ecbf-eb11-4834-8e12-668b3b9f64c8-utilities\") pod \"community-operators-l45gc\" (UID: \"efb0ecbf-eb11-4834-8e12-668b3b9f64c8\") " pod="openshift-marketplace/community-operators-l45gc" Mar 20 08:25:25 crc kubenswrapper[4903]: I0320 08:25:25.477220 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/efb0ecbf-eb11-4834-8e12-668b3b9f64c8-catalog-content\") pod \"community-operators-l45gc\" (UID: \"efb0ecbf-eb11-4834-8e12-668b3b9f64c8\") " pod="openshift-marketplace/community-operators-l45gc" Mar 20 08:25:25 crc kubenswrapper[4903]: I0320 08:25:25.489538 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/efb0ecbf-eb11-4834-8e12-668b3b9f64c8-utilities\") pod \"community-operators-l45gc\" (UID: \"efb0ecbf-eb11-4834-8e12-668b3b9f64c8\") " pod="openshift-marketplace/community-operators-l45gc" Mar 20 08:25:25 crc kubenswrapper[4903]: I0320 08:25:25.490658 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 20 08:25:25 crc kubenswrapper[4903]: I0320 08:25:25.492650 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/efb0ecbf-eb11-4834-8e12-668b3b9f64c8-catalog-content\") pod \"community-operators-l45gc\" (UID: \"efb0ecbf-eb11-4834-8e12-668b3b9f64c8\") " pod="openshift-marketplace/community-operators-l45gc" Mar 20 08:25:25 crc kubenswrapper[4903]: I0320 08:25:25.520407 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Mar 20 08:25:25 crc kubenswrapper[4903]: I0320 08:25:25.521261 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a01da421-9bf1-459f-a419-c7cc271bf472" path="/var/lib/kubelet/pods/a01da421-9bf1-459f-a419-c7cc271bf472/volumes" Mar 20 08:25:25 crc kubenswrapper[4903]: I0320 08:25:25.521843 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a21d7057-af5a-499a-b71f-3cae88b6883e" path="/var/lib/kubelet/pods/a21d7057-af5a-499a-b71f-3cae88b6883e/volumes" Mar 20 08:25:25 crc kubenswrapper[4903]: I0320 08:25:25.523347 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8sdsc\" (UniqueName: \"kubernetes.io/projected/efb0ecbf-eb11-4834-8e12-668b3b9f64c8-kube-api-access-8sdsc\") pod \"community-operators-l45gc\" (UID: 
\"efb0ecbf-eb11-4834-8e12-668b3b9f64c8\") " pod="openshift-marketplace/community-operators-l45gc" Mar 20 08:25:25 crc kubenswrapper[4903]: I0320 08:25:25.551027 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fzcbm"] Mar 20 08:25:25 crc kubenswrapper[4903]: I0320 08:25:25.586129 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-nwkzx"] Mar 20 08:25:25 crc kubenswrapper[4903]: I0320 08:25:25.601415 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-l45gc" Mar 20 08:25:25 crc kubenswrapper[4903]: W0320 08:25:25.622029 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod738071fe_1a7b_403b_ab94_8e88d5d79ab4.slice/crio-37b0e18be858109c70513f4906f3c29dd57c8fdce0e75cec9785f4a9a5cf31b5 WatchSource:0}: Error finding container 37b0e18be858109c70513f4906f3c29dd57c8fdce0e75cec9785f4a9a5cf31b5: Status 404 returned error can't find the container with id 37b0e18be858109c70513f4906f3c29dd57c8fdce0e75cec9785f4a9a5cf31b5 Mar 20 08:25:25 crc kubenswrapper[4903]: I0320 08:25:25.826263 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566575-t4mvb" Mar 20 08:25:25 crc kubenswrapper[4903]: I0320 08:25:25.850702 4903 ???:1] "http: TLS handshake error from 192.168.126.11:39412: no serving certificate available for the kubelet" Mar 20 08:25:25 crc kubenswrapper[4903]: I0320 08:25:25.858915 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bmcws"] Mar 20 08:25:25 crc kubenswrapper[4903]: I0320 08:25:25.886268 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-fcqtt" Mar 20 08:25:25 crc kubenswrapper[4903]: I0320 08:25:25.895419 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-fcqtt" Mar 20 08:25:25 crc kubenswrapper[4903]: W0320 08:25:25.916355 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd209b7b2_53ad_4780_a13e_65d2b0cb5189.slice/crio-5ac0e9bd1183ec46a5a16f923a7fe2d228f6830013ac90eaff5dbfdaa870bcf9 WatchSource:0}: Error finding container 5ac0e9bd1183ec46a5a16f923a7fe2d228f6830013ac90eaff5dbfdaa870bcf9: Status 404 returned error can't find the container with id 5ac0e9bd1183ec46a5a16f923a7fe2d228f6830013ac90eaff5dbfdaa870bcf9 Mar 20 08:25:25 crc kubenswrapper[4903]: I0320 08:25:25.995888 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/32d4db0f-b5bb-41d4-ae0d-0600e38892b1-config-volume\") pod \"32d4db0f-b5bb-41d4-ae0d-0600e38892b1\" (UID: \"32d4db0f-b5bb-41d4-ae0d-0600e38892b1\") " Mar 20 08:25:25 crc kubenswrapper[4903]: I0320 08:25:25.996007 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-snlc5\" (UniqueName: \"kubernetes.io/projected/32d4db0f-b5bb-41d4-ae0d-0600e38892b1-kube-api-access-snlc5\") pod \"32d4db0f-b5bb-41d4-ae0d-0600e38892b1\" (UID: \"32d4db0f-b5bb-41d4-ae0d-0600e38892b1\") " Mar 20 08:25:25 crc kubenswrapper[4903]: I0320 08:25:25.996263 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" 
(UniqueName: \"kubernetes.io/secret/32d4db0f-b5bb-41d4-ae0d-0600e38892b1-secret-volume\") pod \"32d4db0f-b5bb-41d4-ae0d-0600e38892b1\" (UID: \"32d4db0f-b5bb-41d4-ae0d-0600e38892b1\") " Mar 20 08:25:25 crc kubenswrapper[4903]: I0320 08:25:25.997575 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/32d4db0f-b5bb-41d4-ae0d-0600e38892b1-config-volume" (OuterVolumeSpecName: "config-volume") pod "32d4db0f-b5bb-41d4-ae0d-0600e38892b1" (UID: "32d4db0f-b5bb-41d4-ae0d-0600e38892b1"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:25:26 crc kubenswrapper[4903]: I0320 08:25:26.013977 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32d4db0f-b5bb-41d4-ae0d-0600e38892b1-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "32d4db0f-b5bb-41d4-ae0d-0600e38892b1" (UID: "32d4db0f-b5bb-41d4-ae0d-0600e38892b1"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:25:26 crc kubenswrapper[4903]: I0320 08:25:26.026451 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32d4db0f-b5bb-41d4-ae0d-0600e38892b1-kube-api-access-snlc5" (OuterVolumeSpecName: "kube-api-access-snlc5") pod "32d4db0f-b5bb-41d4-ae0d-0600e38892b1" (UID: "32d4db0f-b5bb-41d4-ae0d-0600e38892b1"). InnerVolumeSpecName "kube-api-access-snlc5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:25:26 crc kubenswrapper[4903]: I0320 08:25:26.035634 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-h4m4s" Mar 20 08:25:26 crc kubenswrapper[4903]: I0320 08:25:26.035691 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-h4m4s" Mar 20 08:25:26 crc kubenswrapper[4903]: I0320 08:25:26.050235 4903 patch_prober.go:28] interesting pod/console-f9d7485db-h4m4s container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.23:8443/health\": dial tcp 10.217.0.23:8443: connect: connection refused" start-of-body= Mar 20 08:25:26 crc kubenswrapper[4903]: I0320 08:25:26.050325 4903 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-h4m4s" podUID="f5e9a2fd-a5ab-4db3-9363-e4e94745c6ea" containerName="console" probeResult="failure" output="Get \"https://10.217.0.23:8443/health\": dial tcp 10.217.0.23:8443: connect: connection refused" Mar 20 08:25:26 crc kubenswrapper[4903]: I0320 08:25:26.100485 4903 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/32d4db0f-b5bb-41d4-ae0d-0600e38892b1-config-volume\") on node \"crc\" DevicePath \"\"" Mar 20 08:25:26 crc kubenswrapper[4903]: I0320 08:25:26.100562 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-snlc5\" (UniqueName: \"kubernetes.io/projected/32d4db0f-b5bb-41d4-ae0d-0600e38892b1-kube-api-access-snlc5\") on node \"crc\" DevicePath \"\"" Mar 20 08:25:26 crc kubenswrapper[4903]: I0320 08:25:26.100596 4903 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/32d4db0f-b5bb-41d4-ae0d-0600e38892b1-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 20 08:25:26 crc kubenswrapper[4903]: E0320 08:25:26.155327 4903 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial 
failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode74b75cf_cad8_4b67_91b7_3926096e09f8.slice/crio-b68515b470ead18654c7af5896534fded184b7b1ce059931728b8ae5e2b91baf.scope\": RecentStats: unable to find data in memory cache]" Mar 20 08:25:26 crc kubenswrapper[4903]: I0320 08:25:26.172202 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-l45gc"] Mar 20 08:25:26 crc kubenswrapper[4903]: W0320 08:25:26.187541 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podefb0ecbf_eb11_4834_8e12_668b3b9f64c8.slice/crio-de3aa84a566c17025d39801f16bec43d719352f331a29e75419a16913103438e WatchSource:0}: Error finding container de3aa84a566c17025d39801f16bec43d719352f331a29e75419a16913103438e: Status 404 returned error can't find the container with id de3aa84a566c17025d39801f16bec43d719352f331a29e75419a16913103438e Mar 20 08:25:26 crc kubenswrapper[4903]: I0320 08:25:26.345179 4903 patch_prober.go:28] interesting pod/downloads-7954f5f757-n4g5v container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.29:8080/\": dial tcp 10.217.0.29:8080: connect: connection refused" start-of-body= Mar 20 08:25:26 crc kubenswrapper[4903]: I0320 08:25:26.345596 4903 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-n4g5v" podUID="7c273a7a-00f1-4e28-a9a6-69d240f2df29" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.29:8080/\": dial tcp 10.217.0.29:8080: connect: connection refused" Mar 20 08:25:26 crc kubenswrapper[4903]: I0320 08:25:26.346403 4903 patch_prober.go:28] interesting pod/downloads-7954f5f757-n4g5v container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.29:8080/\": dial tcp 10.217.0.29:8080: connect: connection refused" start-of-body= Mar 20 08:25:26 crc kubenswrapper[4903]: I0320 08:25:26.346484 4903 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-n4g5v" podUID="7c273a7a-00f1-4e28-a9a6-69d240f2df29" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.29:8080/\": dial tcp 10.217.0.29:8080: connect: connection refused" Mar 20 08:25:26 crc kubenswrapper[4903]: I0320 08:25:26.424548 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fzcbm" event={"ID":"738071fe-1a7b-403b-ab94-8e88d5d79ab4","Type":"ContainerDied","Data":"73d2ec2ef02361b99bcc7f04f273b6fdf2c414e13f6ac43936322c1a1dfd17d3"} Mar 20 08:25:26 crc kubenswrapper[4903]: I0320 08:25:26.424495 4903 generic.go:334] "Generic (PLEG): container finished" podID="738071fe-1a7b-403b-ab94-8e88d5d79ab4" containerID="73d2ec2ef02361b99bcc7f04f273b6fdf2c414e13f6ac43936322c1a1dfd17d3" exitCode=0 Mar 20 08:25:26 crc kubenswrapper[4903]: I0320 08:25:26.425102 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fzcbm" event={"ID":"738071fe-1a7b-403b-ab94-8e88d5d79ab4","Type":"ContainerStarted","Data":"37b0e18be858109c70513f4906f3c29dd57c8fdce0e75cec9785f4a9a5cf31b5"} Mar 20 08:25:26 crc kubenswrapper[4903]: I0320 08:25:26.427319 4903 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 08:25:26 crc kubenswrapper[4903]: I0320 08:25:26.436586 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29566575-t4mvb" event={"ID":"32d4db0f-b5bb-41d4-ae0d-0600e38892b1","Type":"ContainerDied","Data":"86c5da429e3a63be1fb74c86e28e65220803d49d3c527722cb15b379e05853ac"} Mar 20 08:25:26 crc kubenswrapper[4903]: I0320 08:25:26.436641 4903 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="86c5da429e3a63be1fb74c86e28e65220803d49d3c527722cb15b379e05853ac" Mar 20 08:25:26 crc kubenswrapper[4903]: I0320 08:25:26.436659 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566575-t4mvb" Mar 20 08:25:26 crc kubenswrapper[4903]: I0320 08:25:26.438020 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-cc4f6" Mar 20 08:25:26 crc kubenswrapper[4903]: I0320 08:25:26.449562 4903 generic.go:334] "Generic (PLEG): container finished" podID="d209b7b2-53ad-4780-a13e-65d2b0cb5189" containerID="d78f2d3a79a13018ed4c3f7d273be342a36223b8970cbe7d118c4f455a83d264" exitCode=0 Mar 20 08:25:26 crc kubenswrapper[4903]: I0320 08:25:26.449701 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bmcws" event={"ID":"d209b7b2-53ad-4780-a13e-65d2b0cb5189","Type":"ContainerDied","Data":"d78f2d3a79a13018ed4c3f7d273be342a36223b8970cbe7d118c4f455a83d264"} Mar 20 08:25:26 crc kubenswrapper[4903]: I0320 08:25:26.449741 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bmcws" event={"ID":"d209b7b2-53ad-4780-a13e-65d2b0cb5189","Type":"ContainerStarted","Data":"5ac0e9bd1183ec46a5a16f923a7fe2d228f6830013ac90eaff5dbfdaa870bcf9"} Mar 20 08:25:26 crc kubenswrapper[4903]: I0320 08:25:26.454611 4903 patch_prober.go:28] interesting pod/router-default-5444994796-cc4f6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 08:25:26 crc kubenswrapper[4903]: [-]has-synced failed: reason withheld Mar 20 08:25:26 crc kubenswrapper[4903]: [+]process-running ok Mar 20 08:25:26 crc kubenswrapper[4903]: healthz check failed Mar 20 08:25:26 crc kubenswrapper[4903]: I0320 08:25:26.454668 4903 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-cc4f6" podUID="0b0221f5-84f8-47b9-bab3-934c1890fb2f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 08:25:26 crc kubenswrapper[4903]: I0320 08:25:26.456262 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-2q9m5" event={"ID":"5778224c-9b34-45c0-9812-122b95cef431","Type":"ContainerStarted","Data":"038238e01b2ec209b308c0ac95bdcbb36ad6f721503a7e8b7c5cb2f86d139703"} Mar 20 08:25:26 crc kubenswrapper[4903]: I0320 08:25:26.456325 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-2q9m5" event={"ID":"5778224c-9b34-45c0-9812-122b95cef431","Type":"ContainerStarted","Data":"cba04f9c202f29f8045f83f8d6083d98104dfccd87e37c8517e2277fbbc59e2b"} Mar 20 08:25:26 crc kubenswrapper[4903]: I0320 08:25:26.457192 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-2q9m5" Mar 20 08:25:26 crc kubenswrapper[4903]: I0320 08:25:26.461116 4903 generic.go:334] "Generic (PLEG): container 
finished" podID="e74b75cf-cad8-4b67-91b7-3926096e09f8" containerID="b68515b470ead18654c7af5896534fded184b7b1ce059931728b8ae5e2b91baf" exitCode=0 Mar 20 08:25:26 crc kubenswrapper[4903]: I0320 08:25:26.461182 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nwkzx" event={"ID":"e74b75cf-cad8-4b67-91b7-3926096e09f8","Type":"ContainerDied","Data":"b68515b470ead18654c7af5896534fded184b7b1ce059931728b8ae5e2b91baf"} Mar 20 08:25:26 crc kubenswrapper[4903]: I0320 08:25:26.461207 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nwkzx" event={"ID":"e74b75cf-cad8-4b67-91b7-3926096e09f8","Type":"ContainerStarted","Data":"68332149b84360bcd17d633c1dff27018b738465a9211ebce7d6c3a39420e93f"} Mar 20 08:25:26 crc kubenswrapper[4903]: I0320 08:25:26.471711 4903 generic.go:334] "Generic (PLEG): container finished" podID="efb0ecbf-eb11-4834-8e12-668b3b9f64c8" containerID="1748ccefdb91a18ac39fccc2cc07f9f6d40fb5adb63bfabd438362b01b8d6a9e" exitCode=0 Mar 20 08:25:26 crc kubenswrapper[4903]: I0320 08:25:26.472061 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l45gc" event={"ID":"efb0ecbf-eb11-4834-8e12-668b3b9f64c8","Type":"ContainerDied","Data":"1748ccefdb91a18ac39fccc2cc07f9f6d40fb5adb63bfabd438362b01b8d6a9e"} Mar 20 08:25:26 crc kubenswrapper[4903]: I0320 08:25:26.472118 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l45gc" event={"ID":"efb0ecbf-eb11-4834-8e12-668b3b9f64c8","Type":"ContainerStarted","Data":"de3aa84a566c17025d39801f16bec43d719352f331a29e75419a16913103438e"} Mar 20 08:25:26 crc kubenswrapper[4903]: I0320 08:25:26.474449 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6d874b96b-vslql" event={"ID":"578afe8a-2374-4893-bd24-5048cd759a3a","Type":"ContainerStarted","Data":"fd8ad6243d7f25570b08e5fd90182ed4334965819f599d35071384d3269ac63d"} Mar 20 08:25:26 crc kubenswrapper[4903]: I0320 08:25:26.475970 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-6d874b96b-vslql" Mar 20 08:25:26 crc kubenswrapper[4903]: I0320 08:25:26.482245 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"fa79c2ee-4638-4400-8644-3b28f91e8497","Type":"ContainerStarted","Data":"e2a912aa0dc4a0a669e8073b86b070886cbdebda26dbf2770fc037e07e9b1804"} Mar 20 08:25:26 crc kubenswrapper[4903]: I0320 08:25:26.482286 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"fa79c2ee-4638-4400-8644-3b28f91e8497","Type":"ContainerStarted","Data":"62623776deec6c8c693d4fe43e826f9ab541b630ed9986b782d6c7d8122b2aab"} Mar 20 08:25:26 crc kubenswrapper[4903]: I0320 08:25:26.487474 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-6d874b96b-vslql" Mar 20 08:25:26 crc kubenswrapper[4903]: I0320 08:25:26.491526 4903 scope.go:117] "RemoveContainer" containerID="f756aa5d647538ea78b646d483b3c6e7943baac8d492132d04743f5f99f4ddf4" Mar 20 08:25:26 crc kubenswrapper[4903]: E0320 08:25:26.491792 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 1m20s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 08:25:26 crc kubenswrapper[4903]: I0320 08:25:26.552455 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-6d874b96b-vslql" podStartSLOduration=3.552433531 podStartE2EDuration="3.552433531s" podCreationTimestamp="2026-03-20 08:25:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:25:26.549822328 +0000 UTC m=+151.766722643" watchObservedRunningTime="2026-03-20 08:25:26.552433531 +0000 UTC m=+151.769333846" Mar 20 08:25:26 crc kubenswrapper[4903]: I0320 08:25:26.610207 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=2.610186848 podStartE2EDuration="2.610186848s" podCreationTimestamp="2026-03-20 08:25:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:25:26.585781494 +0000 UTC m=+151.802681809" watchObservedRunningTime="2026-03-20 08:25:26.610186848 +0000 UTC m=+151.827087173" Mar 20 08:25:26 crc kubenswrapper[4903]: I0320 08:25:26.610909 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-2q9m5" podStartSLOduration=89.610903657 podStartE2EDuration="1m29.610903657s" podCreationTimestamp="2026-03-20 08:23:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:25:26.607749029 +0000 UTC m=+151.824649344" watchObservedRunningTime="2026-03-20 08:25:26.610903657 +0000 UTC m=+151.827803972" Mar 20 08:25:26 crc kubenswrapper[4903]: I0320 08:25:26.857079 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-cxn5d"] Mar 20 08:25:26 crc kubenswrapper[4903]: E0320 08:25:26.857526 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32d4db0f-b5bb-41d4-ae0d-0600e38892b1" containerName="collect-profiles" Mar 20 08:25:26 crc kubenswrapper[4903]: I0320 08:25:26.857646 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="32d4db0f-b5bb-41d4-ae0d-0600e38892b1" containerName="collect-profiles" Mar 20 08:25:26 crc kubenswrapper[4903]: I0320 08:25:26.857822 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="32d4db0f-b5bb-41d4-ae0d-0600e38892b1" containerName="collect-profiles" Mar 20 08:25:26 crc kubenswrapper[4903]: I0320 08:25:26.858627 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cxn5d" Mar 20 08:25:26 crc kubenswrapper[4903]: I0320 08:25:26.861229 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 20 08:25:26 crc kubenswrapper[4903]: I0320 08:25:26.876275 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-cxn5d"] Mar 20 08:25:26 crc kubenswrapper[4903]: E0320 08:25:26.876573 4903 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1ba01392693cb27f963e41acb1cb1af6da61169459877cc07eb4972d80391867" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 20 08:25:26 crc kubenswrapper[4903]: E0320 08:25:26.884488 4903 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1ba01392693cb27f963e41acb1cb1af6da61169459877cc07eb4972d80391867" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 20 08:25:26 crc kubenswrapper[4903]: E0320 08:25:26.886321 4903 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1ba01392693cb27f963e41acb1cb1af6da61169459877cc07eb4972d80391867" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 20 08:25:26 crc kubenswrapper[4903]: E0320 08:25:26.886540 4903 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-spjl5" podUID="433bd27c-a67a-4487-b09e-523fd9b34b8f" containerName="kube-multus-additional-cni-plugins" Mar 20 08:25:26 crc kubenswrapper[4903]: I0320 08:25:26.907391 4903 ???:1] "http: TLS handshake error from 192.168.126.11:39426: no serving certificate available for the kubelet" Mar 20 08:25:27 crc kubenswrapper[4903]: I0320 08:25:27.014876 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/48da3e1a-ed3d-4048-8f10-39f1cc56d9af-catalog-content\") pod \"redhat-marketplace-cxn5d\" (UID: \"48da3e1a-ed3d-4048-8f10-39f1cc56d9af\") " pod="openshift-marketplace/redhat-marketplace-cxn5d" Mar 20 08:25:27 crc kubenswrapper[4903]: I0320 08:25:27.014958 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-42szb\" (UniqueName: \"kubernetes.io/projected/48da3e1a-ed3d-4048-8f10-39f1cc56d9af-kube-api-access-42szb\") pod \"redhat-marketplace-cxn5d\" (UID: \"48da3e1a-ed3d-4048-8f10-39f1cc56d9af\") " pod="openshift-marketplace/redhat-marketplace-cxn5d" Mar 20 08:25:27 crc kubenswrapper[4903]: I0320 08:25:27.014987 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/48da3e1a-ed3d-4048-8f10-39f1cc56d9af-utilities\") pod \"redhat-marketplace-cxn5d\" (UID: \"48da3e1a-ed3d-4048-8f10-39f1cc56d9af\") " pod="openshift-marketplace/redhat-marketplace-cxn5d" Mar 20 08:25:27 crc kubenswrapper[4903]: I0320 08:25:27.116445 4903 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-42szb\" (UniqueName: \"kubernetes.io/projected/48da3e1a-ed3d-4048-8f10-39f1cc56d9af-kube-api-access-42szb\") pod \"redhat-marketplace-cxn5d\" (UID: \"48da3e1a-ed3d-4048-8f10-39f1cc56d9af\") " pod="openshift-marketplace/redhat-marketplace-cxn5d" Mar 20 08:25:27 crc kubenswrapper[4903]: I0320 08:25:27.116510 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/48da3e1a-ed3d-4048-8f10-39f1cc56d9af-utilities\") pod \"redhat-marketplace-cxn5d\" (UID: \"48da3e1a-ed3d-4048-8f10-39f1cc56d9af\") " pod="openshift-marketplace/redhat-marketplace-cxn5d" Mar 20 08:25:27 crc kubenswrapper[4903]: I0320 08:25:27.116625 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/48da3e1a-ed3d-4048-8f10-39f1cc56d9af-catalog-content\") pod \"redhat-marketplace-cxn5d\" (UID: \"48da3e1a-ed3d-4048-8f10-39f1cc56d9af\") " pod="openshift-marketplace/redhat-marketplace-cxn5d" Mar 20 08:25:27 crc kubenswrapper[4903]: I0320 08:25:27.117126 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/48da3e1a-ed3d-4048-8f10-39f1cc56d9af-catalog-content\") pod \"redhat-marketplace-cxn5d\" (UID: \"48da3e1a-ed3d-4048-8f10-39f1cc56d9af\") " pod="openshift-marketplace/redhat-marketplace-cxn5d" Mar 20 08:25:27 crc kubenswrapper[4903]: I0320 08:25:27.117277 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/48da3e1a-ed3d-4048-8f10-39f1cc56d9af-utilities\") pod \"redhat-marketplace-cxn5d\" (UID: \"48da3e1a-ed3d-4048-8f10-39f1cc56d9af\") " pod="openshift-marketplace/redhat-marketplace-cxn5d" Mar 20 08:25:27 crc kubenswrapper[4903]: I0320 08:25:27.137717 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-42szb\" (UniqueName: \"kubernetes.io/projected/48da3e1a-ed3d-4048-8f10-39f1cc56d9af-kube-api-access-42szb\") pod \"redhat-marketplace-cxn5d\" (UID: \"48da3e1a-ed3d-4048-8f10-39f1cc56d9af\") " pod="openshift-marketplace/redhat-marketplace-cxn5d" Mar 20 08:25:27 crc kubenswrapper[4903]: I0320 08:25:27.204917 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 20 08:25:27 crc kubenswrapper[4903]: I0320 08:25:27.205914 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 08:25:27 crc kubenswrapper[4903]: I0320 08:25:27.209293 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 20 08:25:27 crc kubenswrapper[4903]: I0320 08:25:27.209582 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 20 08:25:27 crc kubenswrapper[4903]: I0320 08:25:27.211459 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cxn5d" Mar 20 08:25:27 crc kubenswrapper[4903]: I0320 08:25:27.223433 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 20 08:25:27 crc kubenswrapper[4903]: I0320 08:25:27.260756 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-crf95"] Mar 20 08:25:27 crc kubenswrapper[4903]: I0320 08:25:27.262211 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-crf95" Mar 20 08:25:27 crc kubenswrapper[4903]: I0320 08:25:27.316056 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-crf95"] Mar 20 08:25:27 crc kubenswrapper[4903]: I0320 08:25:27.323918 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7pd2r\" (UniqueName: \"kubernetes.io/projected/85039fed-a0e7-4cea-834c-930d1c9974a1-kube-api-access-7pd2r\") pod \"redhat-marketplace-crf95\" (UID: \"85039fed-a0e7-4cea-834c-930d1c9974a1\") " pod="openshift-marketplace/redhat-marketplace-crf95" Mar 20 08:25:27 crc kubenswrapper[4903]: I0320 08:25:27.323986 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/85039fed-a0e7-4cea-834c-930d1c9974a1-catalog-content\") pod \"redhat-marketplace-crf95\" (UID: \"85039fed-a0e7-4cea-834c-930d1c9974a1\") " pod="openshift-marketplace/redhat-marketplace-crf95" Mar 20 08:25:27 crc kubenswrapper[4903]: I0320 08:25:27.324092 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6fde2d8e-b4fc-4ea3-ac92-cca05cb311f9-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"6fde2d8e-b4fc-4ea3-ac92-cca05cb311f9\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 08:25:27 crc kubenswrapper[4903]: I0320 08:25:27.324190 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/85039fed-a0e7-4cea-834c-930d1c9974a1-utilities\") pod \"redhat-marketplace-crf95\" (UID: \"85039fed-a0e7-4cea-834c-930d1c9974a1\") " pod="openshift-marketplace/redhat-marketplace-crf95" Mar 20 08:25:27 crc kubenswrapper[4903]: I0320 08:25:27.324245 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6fde2d8e-b4fc-4ea3-ac92-cca05cb311f9-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"6fde2d8e-b4fc-4ea3-ac92-cca05cb311f9\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 08:25:27 crc kubenswrapper[4903]: I0320 08:25:27.359274 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-d9b799b66-r8c92"] Mar 20 08:25:27 crc kubenswrapper[4903]: I0320 08:25:27.360566 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-d9b799b66-r8c92" Mar 20 08:25:27 crc kubenswrapper[4903]: I0320 08:25:27.366513 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 20 08:25:27 crc kubenswrapper[4903]: I0320 08:25:27.366732 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 20 08:25:27 crc kubenswrapper[4903]: I0320 08:25:27.367391 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 20 08:25:27 crc kubenswrapper[4903]: I0320 08:25:27.369457 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 20 08:25:27 crc kubenswrapper[4903]: I0320 08:25:27.369609 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 20 08:25:27 crc kubenswrapper[4903]: I0320 08:25:27.371113 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 20 08:25:27 crc kubenswrapper[4903]: I0320 08:25:27.374514 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-d9b799b66-r8c92"] Mar 20 08:25:27 crc kubenswrapper[4903]: I0320 08:25:27.425464 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6fde2d8e-b4fc-4ea3-ac92-cca05cb311f9-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"6fde2d8e-b4fc-4ea3-ac92-cca05cb311f9\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 08:25:27 crc kubenswrapper[4903]: I0320 08:25:27.425527 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/85039fed-a0e7-4cea-834c-930d1c9974a1-utilities\") pod \"redhat-marketplace-crf95\" (UID: \"85039fed-a0e7-4cea-834c-930d1c9974a1\") " pod="openshift-marketplace/redhat-marketplace-crf95" Mar 20 08:25:27 crc kubenswrapper[4903]: I0320 08:25:27.425557 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6fde2d8e-b4fc-4ea3-ac92-cca05cb311f9-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"6fde2d8e-b4fc-4ea3-ac92-cca05cb311f9\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 08:25:27 crc kubenswrapper[4903]: I0320 08:25:27.425603 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8b3f009c-5cb8-4f58-9f0c-8c1e165823a4-client-ca\") pod \"route-controller-manager-d9b799b66-r8c92\" (UID: \"8b3f009c-5cb8-4f58-9f0c-8c1e165823a4\") " pod="openshift-route-controller-manager/route-controller-manager-d9b799b66-r8c92" Mar 20 08:25:27 crc kubenswrapper[4903]: I0320 08:25:27.425626 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6fde2d8e-b4fc-4ea3-ac92-cca05cb311f9-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"6fde2d8e-b4fc-4ea3-ac92-cca05cb311f9\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 08:25:27 crc kubenswrapper[4903]: I0320 08:25:27.425640 4903 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-7pd2r\" (UniqueName: \"kubernetes.io/projected/85039fed-a0e7-4cea-834c-930d1c9974a1-kube-api-access-7pd2r\") pod \"redhat-marketplace-crf95\" (UID: \"85039fed-a0e7-4cea-834c-930d1c9974a1\") " pod="openshift-marketplace/redhat-marketplace-crf95" Mar 20 08:25:27 crc kubenswrapper[4903]: I0320 08:25:27.425762 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8b3f009c-5cb8-4f58-9f0c-8c1e165823a4-serving-cert\") pod \"route-controller-manager-d9b799b66-r8c92\" (UID: \"8b3f009c-5cb8-4f58-9f0c-8c1e165823a4\") " pod="openshift-route-controller-manager/route-controller-manager-d9b799b66-r8c92" Mar 20 08:25:27 crc kubenswrapper[4903]: I0320 08:25:27.425813 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/85039fed-a0e7-4cea-834c-930d1c9974a1-catalog-content\") pod \"redhat-marketplace-crf95\" (UID: \"85039fed-a0e7-4cea-834c-930d1c9974a1\") " pod="openshift-marketplace/redhat-marketplace-crf95" Mar 20 08:25:27 crc kubenswrapper[4903]: I0320 08:25:27.425942 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8b3f009c-5cb8-4f58-9f0c-8c1e165823a4-config\") pod \"route-controller-manager-d9b799b66-r8c92\" (UID: \"8b3f009c-5cb8-4f58-9f0c-8c1e165823a4\") " pod="openshift-route-controller-manager/route-controller-manager-d9b799b66-r8c92" Mar 20 08:25:27 crc kubenswrapper[4903]: I0320 08:25:27.426079 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9j5zd\" (UniqueName: \"kubernetes.io/projected/8b3f009c-5cb8-4f58-9f0c-8c1e165823a4-kube-api-access-9j5zd\") pod \"route-controller-manager-d9b799b66-r8c92\" (UID: \"8b3f009c-5cb8-4f58-9f0c-8c1e165823a4\") " pod="openshift-route-controller-manager/route-controller-manager-d9b799b66-r8c92" Mar 20 08:25:27 crc kubenswrapper[4903]: I0320 08:25:27.426421 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/85039fed-a0e7-4cea-834c-930d1c9974a1-catalog-content\") pod \"redhat-marketplace-crf95\" (UID: \"85039fed-a0e7-4cea-834c-930d1c9974a1\") " pod="openshift-marketplace/redhat-marketplace-crf95" Mar 20 08:25:27 crc kubenswrapper[4903]: I0320 08:25:27.426910 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/85039fed-a0e7-4cea-834c-930d1c9974a1-utilities\") pod \"redhat-marketplace-crf95\" (UID: \"85039fed-a0e7-4cea-834c-930d1c9974a1\") " pod="openshift-marketplace/redhat-marketplace-crf95" Mar 20 08:25:27 crc kubenswrapper[4903]: I0320 08:25:27.441973 4903 patch_prober.go:28] interesting pod/router-default-5444994796-cc4f6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 08:25:27 crc kubenswrapper[4903]: [-]has-synced failed: reason withheld Mar 20 08:25:27 crc kubenswrapper[4903]: [+]process-running ok Mar 20 08:25:27 crc kubenswrapper[4903]: healthz check failed Mar 20 08:25:27 crc kubenswrapper[4903]: I0320 08:25:27.442078 4903 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-cc4f6" 
podUID="0b0221f5-84f8-47b9-bab3-934c1890fb2f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 08:25:27 crc kubenswrapper[4903]: I0320 08:25:27.461108 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6fde2d8e-b4fc-4ea3-ac92-cca05cb311f9-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"6fde2d8e-b4fc-4ea3-ac92-cca05cb311f9\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 08:25:27 crc kubenswrapper[4903]: I0320 08:25:27.461349 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7pd2r\" (UniqueName: \"kubernetes.io/projected/85039fed-a0e7-4cea-834c-930d1c9974a1-kube-api-access-7pd2r\") pod \"redhat-marketplace-crf95\" (UID: \"85039fed-a0e7-4cea-834c-930d1c9974a1\") " pod="openshift-marketplace/redhat-marketplace-crf95" Mar 20 08:25:27 crc kubenswrapper[4903]: I0320 08:25:27.512112 4903 generic.go:334] "Generic (PLEG): container finished" podID="fa79c2ee-4638-4400-8644-3b28f91e8497" containerID="e2a912aa0dc4a0a669e8073b86b070886cbdebda26dbf2770fc037e07e9b1804" exitCode=0 Mar 20 08:25:27 crc kubenswrapper[4903]: I0320 08:25:27.513750 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"fa79c2ee-4638-4400-8644-3b28f91e8497","Type":"ContainerDied","Data":"e2a912aa0dc4a0a669e8073b86b070886cbdebda26dbf2770fc037e07e9b1804"} Mar 20 08:25:27 crc kubenswrapper[4903]: I0320 08:25:27.527355 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8b3f009c-5cb8-4f58-9f0c-8c1e165823a4-serving-cert\") pod \"route-controller-manager-d9b799b66-r8c92\" (UID: \"8b3f009c-5cb8-4f58-9f0c-8c1e165823a4\") " pod="openshift-route-controller-manager/route-controller-manager-d9b799b66-r8c92" Mar 20 08:25:27 crc kubenswrapper[4903]: I0320 08:25:27.527429 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8b3f009c-5cb8-4f58-9f0c-8c1e165823a4-config\") pod \"route-controller-manager-d9b799b66-r8c92\" (UID: \"8b3f009c-5cb8-4f58-9f0c-8c1e165823a4\") " pod="openshift-route-controller-manager/route-controller-manager-d9b799b66-r8c92" Mar 20 08:25:27 crc kubenswrapper[4903]: I0320 08:25:27.527457 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9j5zd\" (UniqueName: \"kubernetes.io/projected/8b3f009c-5cb8-4f58-9f0c-8c1e165823a4-kube-api-access-9j5zd\") pod \"route-controller-manager-d9b799b66-r8c92\" (UID: \"8b3f009c-5cb8-4f58-9f0c-8c1e165823a4\") " pod="openshift-route-controller-manager/route-controller-manager-d9b799b66-r8c92" Mar 20 08:25:27 crc kubenswrapper[4903]: I0320 08:25:27.527525 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8b3f009c-5cb8-4f58-9f0c-8c1e165823a4-client-ca\") pod \"route-controller-manager-d9b799b66-r8c92\" (UID: \"8b3f009c-5cb8-4f58-9f0c-8c1e165823a4\") " pod="openshift-route-controller-manager/route-controller-manager-d9b799b66-r8c92" Mar 20 08:25:27 crc kubenswrapper[4903]: I0320 08:25:27.529105 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8b3f009c-5cb8-4f58-9f0c-8c1e165823a4-client-ca\") pod \"route-controller-manager-d9b799b66-r8c92\" (UID: 
\"8b3f009c-5cb8-4f58-9f0c-8c1e165823a4\") " pod="openshift-route-controller-manager/route-controller-manager-d9b799b66-r8c92" Mar 20 08:25:27 crc kubenswrapper[4903]: I0320 08:25:27.529478 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8b3f009c-5cb8-4f58-9f0c-8c1e165823a4-config\") pod \"route-controller-manager-d9b799b66-r8c92\" (UID: \"8b3f009c-5cb8-4f58-9f0c-8c1e165823a4\") " pod="openshift-route-controller-manager/route-controller-manager-d9b799b66-r8c92" Mar 20 08:25:27 crc kubenswrapper[4903]: I0320 08:25:27.550081 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8b3f009c-5cb8-4f58-9f0c-8c1e165823a4-serving-cert\") pod \"route-controller-manager-d9b799b66-r8c92\" (UID: \"8b3f009c-5cb8-4f58-9f0c-8c1e165823a4\") " pod="openshift-route-controller-manager/route-controller-manager-d9b799b66-r8c92" Mar 20 08:25:27 crc kubenswrapper[4903]: I0320 08:25:27.553346 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9j5zd\" (UniqueName: \"kubernetes.io/projected/8b3f009c-5cb8-4f58-9f0c-8c1e165823a4-kube-api-access-9j5zd\") pod \"route-controller-manager-d9b799b66-r8c92\" (UID: \"8b3f009c-5cb8-4f58-9f0c-8c1e165823a4\") " pod="openshift-route-controller-manager/route-controller-manager-d9b799b66-r8c92" Mar 20 08:25:27 crc kubenswrapper[4903]: I0320 08:25:27.589488 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 08:25:27 crc kubenswrapper[4903]: I0320 08:25:27.632809 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-crf95" Mar 20 08:25:27 crc kubenswrapper[4903]: I0320 08:25:27.693016 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-d9b799b66-r8c92" Mar 20 08:25:27 crc kubenswrapper[4903]: I0320 08:25:27.847929 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-cxn5d"] Mar 20 08:25:27 crc kubenswrapper[4903]: I0320 08:25:27.861945 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-pbz9z"] Mar 20 08:25:27 crc kubenswrapper[4903]: I0320 08:25:27.863423 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-pbz9z" Mar 20 08:25:27 crc kubenswrapper[4903]: I0320 08:25:27.871306 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 20 08:25:27 crc kubenswrapper[4903]: W0320 08:25:27.879272 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod48da3e1a_ed3d_4048_8f10_39f1cc56d9af.slice/crio-3346aa5a19efac517e2753f55fdb4fad1a983faed5b094620f977aba39c61c7d WatchSource:0}: Error finding container 3346aa5a19efac517e2753f55fdb4fad1a983faed5b094620f977aba39c61c7d: Status 404 returned error can't find the container with id 3346aa5a19efac517e2753f55fdb4fad1a983faed5b094620f977aba39c61c7d Mar 20 08:25:27 crc kubenswrapper[4903]: I0320 08:25:27.895345 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-pbz9z"] Mar 20 08:25:27 crc kubenswrapper[4903]: I0320 08:25:27.946626 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zsp9w\" (UniqueName: \"kubernetes.io/projected/37ce866b-65c1-454a-b346-43c2ebe9a2e0-kube-api-access-zsp9w\") pod \"redhat-operators-pbz9z\" (UID: \"37ce866b-65c1-454a-b346-43c2ebe9a2e0\") " pod="openshift-marketplace/redhat-operators-pbz9z" Mar 20 08:25:27 crc kubenswrapper[4903]: I0320 08:25:27.946679 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/37ce866b-65c1-454a-b346-43c2ebe9a2e0-catalog-content\") pod \"redhat-operators-pbz9z\" (UID: \"37ce866b-65c1-454a-b346-43c2ebe9a2e0\") " pod="openshift-marketplace/redhat-operators-pbz9z" Mar 20 08:25:27 crc kubenswrapper[4903]: I0320 08:25:27.946712 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/37ce866b-65c1-454a-b346-43c2ebe9a2e0-utilities\") pod \"redhat-operators-pbz9z\" (UID: \"37ce866b-65c1-454a-b346-43c2ebe9a2e0\") " pod="openshift-marketplace/redhat-operators-pbz9z" Mar 20 08:25:28 crc kubenswrapper[4903]: I0320 08:25:28.001962 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 20 08:25:28 crc kubenswrapper[4903]: I0320 08:25:28.048758 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zsp9w\" (UniqueName: \"kubernetes.io/projected/37ce866b-65c1-454a-b346-43c2ebe9a2e0-kube-api-access-zsp9w\") pod \"redhat-operators-pbz9z\" (UID: \"37ce866b-65c1-454a-b346-43c2ebe9a2e0\") " pod="openshift-marketplace/redhat-operators-pbz9z" Mar 20 08:25:28 crc kubenswrapper[4903]: I0320 08:25:28.048803 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/37ce866b-65c1-454a-b346-43c2ebe9a2e0-catalog-content\") pod \"redhat-operators-pbz9z\" (UID: \"37ce866b-65c1-454a-b346-43c2ebe9a2e0\") " pod="openshift-marketplace/redhat-operators-pbz9z" Mar 20 08:25:28 crc kubenswrapper[4903]: I0320 08:25:28.048844 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/37ce866b-65c1-454a-b346-43c2ebe9a2e0-utilities\") pod \"redhat-operators-pbz9z\" (UID: \"37ce866b-65c1-454a-b346-43c2ebe9a2e0\") " pod="openshift-marketplace/redhat-operators-pbz9z" Mar 
20 08:25:28 crc kubenswrapper[4903]: I0320 08:25:28.049786 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/37ce866b-65c1-454a-b346-43c2ebe9a2e0-utilities\") pod \"redhat-operators-pbz9z\" (UID: \"37ce866b-65c1-454a-b346-43c2ebe9a2e0\") " pod="openshift-marketplace/redhat-operators-pbz9z" Mar 20 08:25:28 crc kubenswrapper[4903]: I0320 08:25:28.050524 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/37ce866b-65c1-454a-b346-43c2ebe9a2e0-catalog-content\") pod \"redhat-operators-pbz9z\" (UID: \"37ce866b-65c1-454a-b346-43c2ebe9a2e0\") " pod="openshift-marketplace/redhat-operators-pbz9z" Mar 20 08:25:28 crc kubenswrapper[4903]: I0320 08:25:28.073938 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zsp9w\" (UniqueName: \"kubernetes.io/projected/37ce866b-65c1-454a-b346-43c2ebe9a2e0-kube-api-access-zsp9w\") pod \"redhat-operators-pbz9z\" (UID: \"37ce866b-65c1-454a-b346-43c2ebe9a2e0\") " pod="openshift-marketplace/redhat-operators-pbz9z" Mar 20 08:25:28 crc kubenswrapper[4903]: I0320 08:25:28.186744 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-pbz9z" Mar 20 08:25:28 crc kubenswrapper[4903]: I0320 08:25:28.285986 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-cbqcg"] Mar 20 08:25:28 crc kubenswrapper[4903]: I0320 08:25:28.287642 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cbqcg" Mar 20 08:25:28 crc kubenswrapper[4903]: I0320 08:25:28.289512 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-cbqcg"] Mar 20 08:25:28 crc kubenswrapper[4903]: I0320 08:25:28.353900 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-d9b799b66-r8c92"] Mar 20 08:25:28 crc kubenswrapper[4903]: I0320 08:25:28.355869 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6a63c01-6cfe-4d24-835a-4fa810111888-utilities\") pod \"redhat-operators-cbqcg\" (UID: \"e6a63c01-6cfe-4d24-835a-4fa810111888\") " pod="openshift-marketplace/redhat-operators-cbqcg" Mar 20 08:25:28 crc kubenswrapper[4903]: I0320 08:25:28.355964 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6a63c01-6cfe-4d24-835a-4fa810111888-catalog-content\") pod \"redhat-operators-cbqcg\" (UID: \"e6a63c01-6cfe-4d24-835a-4fa810111888\") " pod="openshift-marketplace/redhat-operators-cbqcg" Mar 20 08:25:28 crc kubenswrapper[4903]: I0320 08:25:28.355991 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fxf2k\" (UniqueName: \"kubernetes.io/projected/e6a63c01-6cfe-4d24-835a-4fa810111888-kube-api-access-fxf2k\") pod \"redhat-operators-cbqcg\" (UID: \"e6a63c01-6cfe-4d24-835a-4fa810111888\") " pod="openshift-marketplace/redhat-operators-cbqcg" Mar 20 08:25:28 crc kubenswrapper[4903]: I0320 08:25:28.412244 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-crf95"] Mar 20 08:25:28 crc kubenswrapper[4903]: I0320 08:25:28.460213 4903 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6a63c01-6cfe-4d24-835a-4fa810111888-catalog-content\") pod \"redhat-operators-cbqcg\" (UID: \"e6a63c01-6cfe-4d24-835a-4fa810111888\") " pod="openshift-marketplace/redhat-operators-cbqcg" Mar 20 08:25:28 crc kubenswrapper[4903]: I0320 08:25:28.460282 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fxf2k\" (UniqueName: \"kubernetes.io/projected/e6a63c01-6cfe-4d24-835a-4fa810111888-kube-api-access-fxf2k\") pod \"redhat-operators-cbqcg\" (UID: \"e6a63c01-6cfe-4d24-835a-4fa810111888\") " pod="openshift-marketplace/redhat-operators-cbqcg" Mar 20 08:25:28 crc kubenswrapper[4903]: I0320 08:25:28.460374 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6a63c01-6cfe-4d24-835a-4fa810111888-utilities\") pod \"redhat-operators-cbqcg\" (UID: \"e6a63c01-6cfe-4d24-835a-4fa810111888\") " pod="openshift-marketplace/redhat-operators-cbqcg" Mar 20 08:25:28 crc kubenswrapper[4903]: I0320 08:25:28.461166 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6a63c01-6cfe-4d24-835a-4fa810111888-utilities\") pod \"redhat-operators-cbqcg\" (UID: \"e6a63c01-6cfe-4d24-835a-4fa810111888\") " pod="openshift-marketplace/redhat-operators-cbqcg" Mar 20 08:25:28 crc kubenswrapper[4903]: I0320 08:25:28.462755 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6a63c01-6cfe-4d24-835a-4fa810111888-catalog-content\") pod \"redhat-operators-cbqcg\" (UID: \"e6a63c01-6cfe-4d24-835a-4fa810111888\") " pod="openshift-marketplace/redhat-operators-cbqcg" Mar 20 08:25:28 crc kubenswrapper[4903]: I0320 08:25:28.479395 4903 patch_prober.go:28] interesting pod/router-default-5444994796-cc4f6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 08:25:28 crc kubenswrapper[4903]: [-]has-synced failed: reason withheld Mar 20 08:25:28 crc kubenswrapper[4903]: [+]process-running ok Mar 20 08:25:28 crc kubenswrapper[4903]: healthz check failed Mar 20 08:25:28 crc kubenswrapper[4903]: I0320 08:25:28.479472 4903 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-cc4f6" podUID="0b0221f5-84f8-47b9-bab3-934c1890fb2f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 08:25:28 crc kubenswrapper[4903]: I0320 08:25:28.491673 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fxf2k\" (UniqueName: \"kubernetes.io/projected/e6a63c01-6cfe-4d24-835a-4fa810111888-kube-api-access-fxf2k\") pod \"redhat-operators-cbqcg\" (UID: \"e6a63c01-6cfe-4d24-835a-4fa810111888\") " pod="openshift-marketplace/redhat-operators-cbqcg" Mar 20 08:25:28 crc kubenswrapper[4903]: I0320 08:25:28.538245 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"6fde2d8e-b4fc-4ea3-ac92-cca05cb311f9","Type":"ContainerStarted","Data":"18f10e4149f0ab7c38d3c4151be53b4017783d549eaa301cb481b38e9e0d1d45"} Mar 20 08:25:28 crc kubenswrapper[4903]: I0320 08:25:28.542897 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-route-controller-manager/route-controller-manager-d9b799b66-r8c92" event={"ID":"8b3f009c-5cb8-4f58-9f0c-8c1e165823a4","Type":"ContainerStarted","Data":"f8b916bb1f3c2d92fb2387cf398026bd513505cf70d0294ea3cc8604592fc139"} Mar 20 08:25:28 crc kubenswrapper[4903]: I0320 08:25:28.545438 4903 generic.go:334] "Generic (PLEG): container finished" podID="48da3e1a-ed3d-4048-8f10-39f1cc56d9af" containerID="968bbbe7f26e784d6a4ccfae8e2203cfc7d232fbbc4823100ae881dafe8a1a8c" exitCode=0 Mar 20 08:25:28 crc kubenswrapper[4903]: I0320 08:25:28.545497 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cxn5d" event={"ID":"48da3e1a-ed3d-4048-8f10-39f1cc56d9af","Type":"ContainerDied","Data":"968bbbe7f26e784d6a4ccfae8e2203cfc7d232fbbc4823100ae881dafe8a1a8c"} Mar 20 08:25:28 crc kubenswrapper[4903]: I0320 08:25:28.545519 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cxn5d" event={"ID":"48da3e1a-ed3d-4048-8f10-39f1cc56d9af","Type":"ContainerStarted","Data":"3346aa5a19efac517e2753f55fdb4fad1a983faed5b094620f977aba39c61c7d"} Mar 20 08:25:28 crc kubenswrapper[4903]: I0320 08:25:28.552796 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-crf95" event={"ID":"85039fed-a0e7-4cea-834c-930d1c9974a1","Type":"ContainerStarted","Data":"8e0d0af934a639abaa441d8a4bdd8bd1bd069d64fe460efa085ff1d4f2d56ec1"} Mar 20 08:25:28 crc kubenswrapper[4903]: I0320 08:25:28.659706 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cbqcg" Mar 20 08:25:29 crc kubenswrapper[4903]: I0320 08:25:29.007163 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-pbz9z"] Mar 20 08:25:29 crc kubenswrapper[4903]: I0320 08:25:29.020876 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 20 08:25:29 crc kubenswrapper[4903]: I0320 08:25:29.090301 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fa79c2ee-4638-4400-8644-3b28f91e8497-kube-api-access\") pod \"fa79c2ee-4638-4400-8644-3b28f91e8497\" (UID: \"fa79c2ee-4638-4400-8644-3b28f91e8497\") " Mar 20 08:25:29 crc kubenswrapper[4903]: I0320 08:25:29.090894 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/fa79c2ee-4638-4400-8644-3b28f91e8497-kubelet-dir\") pod \"fa79c2ee-4638-4400-8644-3b28f91e8497\" (UID: \"fa79c2ee-4638-4400-8644-3b28f91e8497\") " Mar 20 08:25:29 crc kubenswrapper[4903]: I0320 08:25:29.091504 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fa79c2ee-4638-4400-8644-3b28f91e8497-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "fa79c2ee-4638-4400-8644-3b28f91e8497" (UID: "fa79c2ee-4638-4400-8644-3b28f91e8497"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 08:25:29 crc kubenswrapper[4903]: I0320 08:25:29.104313 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa79c2ee-4638-4400-8644-3b28f91e8497-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "fa79c2ee-4638-4400-8644-3b28f91e8497" (UID: "fa79c2ee-4638-4400-8644-3b28f91e8497"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:25:29 crc kubenswrapper[4903]: I0320 08:25:29.193808 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fa79c2ee-4638-4400-8644-3b28f91e8497-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 20 08:25:29 crc kubenswrapper[4903]: I0320 08:25:29.193841 4903 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/fa79c2ee-4638-4400-8644-3b28f91e8497-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 20 08:25:29 crc kubenswrapper[4903]: I0320 08:25:29.279618 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-cbqcg"] Mar 20 08:25:29 crc kubenswrapper[4903]: W0320 08:25:29.340371 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6a63c01_6cfe_4d24_835a_4fa810111888.slice/crio-94281610e7b9e9db7024933c01def7e03f9b7ac2d9abd6bd950c16ac4abdc6b5 WatchSource:0}: Error finding container 94281610e7b9e9db7024933c01def7e03f9b7ac2d9abd6bd950c16ac4abdc6b5: Status 404 returned error can't find the container with id 94281610e7b9e9db7024933c01def7e03f9b7ac2d9abd6bd950c16ac4abdc6b5 Mar 20 08:25:29 crc kubenswrapper[4903]: I0320 08:25:29.450318 4903 patch_prober.go:28] interesting pod/router-default-5444994796-cc4f6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 08:25:29 crc kubenswrapper[4903]: [-]has-synced failed: reason withheld Mar 20 08:25:29 crc kubenswrapper[4903]: [+]process-running ok Mar 20 08:25:29 crc kubenswrapper[4903]: healthz check failed Mar 20 08:25:29 crc kubenswrapper[4903]: I0320 08:25:29.450813 4903 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-cc4f6" podUID="0b0221f5-84f8-47b9-bab3-934c1890fb2f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 08:25:29 crc kubenswrapper[4903]: I0320 08:25:29.610613 4903 generic.go:334] "Generic (PLEG): container finished" podID="85039fed-a0e7-4cea-834c-930d1c9974a1" containerID="a25b642b782e36bf09323533a2def2c5ae319466a12e22f89427181512ca2c00" exitCode=0 Mar 20 08:25:29 crc kubenswrapper[4903]: I0320 08:25:29.610702 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-crf95" event={"ID":"85039fed-a0e7-4cea-834c-930d1c9974a1","Type":"ContainerDied","Data":"a25b642b782e36bf09323533a2def2c5ae319466a12e22f89427181512ca2c00"} Mar 20 08:25:29 crc kubenswrapper[4903]: I0320 08:25:29.620376 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"fa79c2ee-4638-4400-8644-3b28f91e8497","Type":"ContainerDied","Data":"62623776deec6c8c693d4fe43e826f9ab541b630ed9986b782d6c7d8122b2aab"} Mar 20 08:25:29 crc kubenswrapper[4903]: I0320 08:25:29.620423 4903 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="62623776deec6c8c693d4fe43e826f9ab541b630ed9986b782d6c7d8122b2aab" Mar 20 08:25:29 crc kubenswrapper[4903]: I0320 08:25:29.620501 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 20 08:25:29 crc kubenswrapper[4903]: I0320 08:25:29.636398 4903 generic.go:334] "Generic (PLEG): container finished" podID="37ce866b-65c1-454a-b346-43c2ebe9a2e0" containerID="30a33cb156edaebd2249b1cb595412229313183ffd7a62bf4b252936ceef4c0b" exitCode=0 Mar 20 08:25:29 crc kubenswrapper[4903]: I0320 08:25:29.636504 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pbz9z" event={"ID":"37ce866b-65c1-454a-b346-43c2ebe9a2e0","Type":"ContainerDied","Data":"30a33cb156edaebd2249b1cb595412229313183ffd7a62bf4b252936ceef4c0b"} Mar 20 08:25:29 crc kubenswrapper[4903]: I0320 08:25:29.636542 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pbz9z" event={"ID":"37ce866b-65c1-454a-b346-43c2ebe9a2e0","Type":"ContainerStarted","Data":"d3a9f3b1a68766ea5ac0beaa0774e29abd5ac8b5420dc1e3856a7ec473ec54bc"} Mar 20 08:25:29 crc kubenswrapper[4903]: I0320 08:25:29.650337 4903 generic.go:334] "Generic (PLEG): container finished" podID="6fde2d8e-b4fc-4ea3-ac92-cca05cb311f9" containerID="6a90506fbafc3e923e6ff07a13d9aa7c11905eb61657400652ad1963b1e60c49" exitCode=0 Mar 20 08:25:29 crc kubenswrapper[4903]: I0320 08:25:29.650485 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"6fde2d8e-b4fc-4ea3-ac92-cca05cb311f9","Type":"ContainerDied","Data":"6a90506fbafc3e923e6ff07a13d9aa7c11905eb61657400652ad1963b1e60c49"} Mar 20 08:25:29 crc kubenswrapper[4903]: I0320 08:25:29.654226 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cbqcg" event={"ID":"e6a63c01-6cfe-4d24-835a-4fa810111888","Type":"ContainerStarted","Data":"94281610e7b9e9db7024933c01def7e03f9b7ac2d9abd6bd950c16ac4abdc6b5"} Mar 20 08:25:29 crc kubenswrapper[4903]: I0320 08:25:29.663429 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-d9b799b66-r8c92" event={"ID":"8b3f009c-5cb8-4f58-9f0c-8c1e165823a4","Type":"ContainerStarted","Data":"bc6dd320320c485629531c527a64044601a8fab7d36498f5627d22725d511d21"} Mar 20 08:25:29 crc kubenswrapper[4903]: I0320 08:25:29.664919 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-d9b799b66-r8c92" Mar 20 08:25:29 crc kubenswrapper[4903]: I0320 08:25:29.735099 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-d9b799b66-r8c92" podStartSLOduration=6.735076497 podStartE2EDuration="6.735076497s" podCreationTimestamp="2026-03-20 08:25:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:25:29.730531219 +0000 UTC m=+154.947431534" watchObservedRunningTime="2026-03-20 08:25:29.735076497 +0000 UTC m=+154.951976812" Mar 20 08:25:29 crc kubenswrapper[4903]: I0320 08:25:29.774272 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-d9b799b66-r8c92" Mar 20 08:25:30 crc kubenswrapper[4903]: I0320 08:25:30.449659 4903 patch_prober.go:28] interesting pod/router-default-5444994796-cc4f6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" 
start-of-body=[-]backend-http failed: reason withheld Mar 20 08:25:30 crc kubenswrapper[4903]: [-]has-synced failed: reason withheld Mar 20 08:25:30 crc kubenswrapper[4903]: [+]process-running ok Mar 20 08:25:30 crc kubenswrapper[4903]: healthz check failed Mar 20 08:25:30 crc kubenswrapper[4903]: I0320 08:25:30.449743 4903 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-cc4f6" podUID="0b0221f5-84f8-47b9-bab3-934c1890fb2f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 08:25:30 crc kubenswrapper[4903]: I0320 08:25:30.691207 4903 generic.go:334] "Generic (PLEG): container finished" podID="e6a63c01-6cfe-4d24-835a-4fa810111888" containerID="8e7caf0085bbea51b6a35c455297385b534b2dc0932604a63c2063430ba8e657" exitCode=0 Mar 20 08:25:30 crc kubenswrapper[4903]: I0320 08:25:30.691551 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cbqcg" event={"ID":"e6a63c01-6cfe-4d24-835a-4fa810111888","Type":"ContainerDied","Data":"8e7caf0085bbea51b6a35c455297385b534b2dc0932604a63c2063430ba8e657"} Mar 20 08:25:31 crc kubenswrapper[4903]: I0320 08:25:31.031623 4903 ???:1] "http: TLS handshake error from 192.168.126.11:39432: no serving certificate available for the kubelet" Mar 20 08:25:31 crc kubenswrapper[4903]: I0320 08:25:31.102086 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 08:25:31 crc kubenswrapper[4903]: I0320 08:25:31.165371 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6fde2d8e-b4fc-4ea3-ac92-cca05cb311f9-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "6fde2d8e-b4fc-4ea3-ac92-cca05cb311f9" (UID: "6fde2d8e-b4fc-4ea3-ac92-cca05cb311f9"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 08:25:31 crc kubenswrapper[4903]: I0320 08:25:31.165282 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6fde2d8e-b4fc-4ea3-ac92-cca05cb311f9-kubelet-dir\") pod \"6fde2d8e-b4fc-4ea3-ac92-cca05cb311f9\" (UID: \"6fde2d8e-b4fc-4ea3-ac92-cca05cb311f9\") " Mar 20 08:25:31 crc kubenswrapper[4903]: I0320 08:25:31.165900 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6fde2d8e-b4fc-4ea3-ac92-cca05cb311f9-kube-api-access\") pod \"6fde2d8e-b4fc-4ea3-ac92-cca05cb311f9\" (UID: \"6fde2d8e-b4fc-4ea3-ac92-cca05cb311f9\") " Mar 20 08:25:31 crc kubenswrapper[4903]: I0320 08:25:31.168007 4903 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6fde2d8e-b4fc-4ea3-ac92-cca05cb311f9-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 20 08:25:31 crc kubenswrapper[4903]: I0320 08:25:31.180812 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6fde2d8e-b4fc-4ea3-ac92-cca05cb311f9-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "6fde2d8e-b4fc-4ea3-ac92-cca05cb311f9" (UID: "6fde2d8e-b4fc-4ea3-ac92-cca05cb311f9"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:25:31 crc kubenswrapper[4903]: I0320 08:25:31.269307 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6fde2d8e-b4fc-4ea3-ac92-cca05cb311f9-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 20 08:25:31 crc kubenswrapper[4903]: I0320 08:25:31.456193 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-cc4f6" Mar 20 08:25:31 crc kubenswrapper[4903]: I0320 08:25:31.470274 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-cc4f6" Mar 20 08:25:31 crc kubenswrapper[4903]: I0320 08:25:31.713130 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 08:25:31 crc kubenswrapper[4903]: I0320 08:25:31.713489 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"6fde2d8e-b4fc-4ea3-ac92-cca05cb311f9","Type":"ContainerDied","Data":"18f10e4149f0ab7c38d3c4151be53b4017783d549eaa301cb481b38e9e0d1d45"} Mar 20 08:25:31 crc kubenswrapper[4903]: I0320 08:25:31.713512 4903 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="18f10e4149f0ab7c38d3c4151be53b4017783d549eaa301cb481b38e9e0d1d45" Mar 20 08:25:32 crc kubenswrapper[4903]: I0320 08:25:32.166668 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-vc9fr" Mar 20 08:25:36 crc kubenswrapper[4903]: I0320 08:25:36.083932 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-h4m4s" Mar 20 08:25:36 crc kubenswrapper[4903]: I0320 08:25:36.106095 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-h4m4s" Mar 20 08:25:36 crc kubenswrapper[4903]: I0320 08:25:36.353299 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-n4g5v" Mar 20 08:25:36 crc kubenswrapper[4903]: E0320 08:25:36.879958 4903 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1ba01392693cb27f963e41acb1cb1af6da61169459877cc07eb4972d80391867" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 20 08:25:36 crc kubenswrapper[4903]: E0320 08:25:36.885414 4903 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1ba01392693cb27f963e41acb1cb1af6da61169459877cc07eb4972d80391867" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 20 08:25:36 crc kubenswrapper[4903]: E0320 08:25:36.889012 4903 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1ba01392693cb27f963e41acb1cb1af6da61169459877cc07eb4972d80391867" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 20 08:25:36 crc kubenswrapper[4903]: E0320 08:25:36.889183 4903 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , 
stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-spjl5" podUID="433bd27c-a67a-4487-b09e-523fd9b34b8f" containerName="kube-multus-additional-cni-plugins" Mar 20 08:25:37 crc kubenswrapper[4903]: I0320 08:25:37.493547 4903 scope.go:117] "RemoveContainer" containerID="f756aa5d647538ea78b646d483b3c6e7943baac8d492132d04743f5f99f4ddf4" Mar 20 08:25:37 crc kubenswrapper[4903]: E0320 08:25:37.494713 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 08:25:41 crc kubenswrapper[4903]: I0320 08:25:41.513739 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Mar 20 08:25:42 crc kubenswrapper[4903]: I0320 08:25:42.412613 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6d874b96b-vslql"] Mar 20 08:25:42 crc kubenswrapper[4903]: I0320 08:25:42.413456 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-6d874b96b-vslql" podUID="578afe8a-2374-4893-bd24-5048cd759a3a" containerName="controller-manager" containerID="cri-o://fd8ad6243d7f25570b08e5fd90182ed4334965819f599d35071384d3269ac63d" gracePeriod=30 Mar 20 08:25:42 crc kubenswrapper[4903]: I0320 08:25:42.418456 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-d9b799b66-r8c92"] Mar 20 08:25:42 crc kubenswrapper[4903]: I0320 08:25:42.418780 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-d9b799b66-r8c92" podUID="8b3f009c-5cb8-4f58-9f0c-8c1e165823a4" containerName="route-controller-manager" containerID="cri-o://bc6dd320320c485629531c527a64044601a8fab7d36498f5627d22725d511d21" gracePeriod=30 Mar 20 08:25:42 crc kubenswrapper[4903]: I0320 08:25:42.440625 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=1.440604692 podStartE2EDuration="1.440604692s" podCreationTimestamp="2026-03-20 08:25:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:25:42.435721686 +0000 UTC m=+167.652622061" watchObservedRunningTime="2026-03-20 08:25:42.440604692 +0000 UTC m=+167.657505007" Mar 20 08:25:43 crc kubenswrapper[4903]: I0320 08:25:43.827956 4903 generic.go:334] "Generic (PLEG): container finished" podID="578afe8a-2374-4893-bd24-5048cd759a3a" containerID="fd8ad6243d7f25570b08e5fd90182ed4334965819f599d35071384d3269ac63d" exitCode=0 Mar 20 08:25:43 crc kubenswrapper[4903]: I0320 08:25:43.828071 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6d874b96b-vslql" event={"ID":"578afe8a-2374-4893-bd24-5048cd759a3a","Type":"ContainerDied","Data":"fd8ad6243d7f25570b08e5fd90182ed4334965819f599d35071384d3269ac63d"} Mar 20 08:25:43 crc kubenswrapper[4903]: I0320 08:25:43.829916 4903 generic.go:334] "Generic (PLEG): container finished" 
podID="8b3f009c-5cb8-4f58-9f0c-8c1e165823a4" containerID="bc6dd320320c485629531c527a64044601a8fab7d36498f5627d22725d511d21" exitCode=0 Mar 20 08:25:43 crc kubenswrapper[4903]: I0320 08:25:43.829950 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-d9b799b66-r8c92" event={"ID":"8b3f009c-5cb8-4f58-9f0c-8c1e165823a4","Type":"ContainerDied","Data":"bc6dd320320c485629531c527a64044601a8fab7d36498f5627d22725d511d21"} Mar 20 08:25:44 crc kubenswrapper[4903]: I0320 08:25:44.692303 4903 patch_prober.go:28] interesting pod/controller-manager-6d874b96b-vslql container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.44:8443/healthz\": dial tcp 10.217.0.44:8443: connect: connection refused" start-of-body= Mar 20 08:25:44 crc kubenswrapper[4903]: I0320 08:25:44.692380 4903 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-6d874b96b-vslql" podUID="578afe8a-2374-4893-bd24-5048cd759a3a" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.44:8443/healthz\": dial tcp 10.217.0.44:8443: connect: connection refused" Mar 20 08:25:45 crc kubenswrapper[4903]: I0320 08:25:45.082847 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-2q9m5" Mar 20 08:25:46 crc kubenswrapper[4903]: E0320 08:25:46.874437 4903 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1ba01392693cb27f963e41acb1cb1af6da61169459877cc07eb4972d80391867" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 20 08:25:46 crc kubenswrapper[4903]: E0320 08:25:46.877482 4903 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1ba01392693cb27f963e41acb1cb1af6da61169459877cc07eb4972d80391867" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 20 08:25:46 crc kubenswrapper[4903]: E0320 08:25:46.879925 4903 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1ba01392693cb27f963e41acb1cb1af6da61169459877cc07eb4972d80391867" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 20 08:25:46 crc kubenswrapper[4903]: E0320 08:25:46.880006 4903 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-spjl5" podUID="433bd27c-a67a-4487-b09e-523fd9b34b8f" containerName="kube-multus-additional-cni-plugins" Mar 20 08:25:47 crc kubenswrapper[4903]: I0320 08:25:47.696714 4903 patch_prober.go:28] interesting pod/route-controller-manager-d9b799b66-r8c92 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.53:8443/healthz\": dial tcp 10.217.0.53:8443: connect: connection refused" start-of-body= Mar 20 08:25:47 crc kubenswrapper[4903]: I0320 08:25:47.697192 4903 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-route-controller-manager/route-controller-manager-d9b799b66-r8c92" podUID="8b3f009c-5cb8-4f58-9f0c-8c1e165823a4" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.53:8443/healthz\": dial tcp 10.217.0.53:8443: connect: connection refused" Mar 20 08:25:51 crc kubenswrapper[4903]: I0320 08:25:51.541133 4903 ???:1] "http: TLS handshake error from 192.168.126.11:55468: no serving certificate available for the kubelet" Mar 20 08:25:52 crc kubenswrapper[4903]: I0320 08:25:52.491126 4903 scope.go:117] "RemoveContainer" containerID="f756aa5d647538ea78b646d483b3c6e7943baac8d492132d04743f5f99f4ddf4" Mar 20 08:25:52 crc kubenswrapper[4903]: E0320 08:25:52.492329 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 08:25:53 crc kubenswrapper[4903]: I0320 08:25:53.896926 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-d9b799b66-r8c92" event={"ID":"8b3f009c-5cb8-4f58-9f0c-8c1e165823a4","Type":"ContainerDied","Data":"f8b916bb1f3c2d92fb2387cf398026bd513505cf70d0294ea3cc8604592fc139"} Mar 20 08:25:53 crc kubenswrapper[4903]: I0320 08:25:53.897538 4903 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f8b916bb1f3c2d92fb2387cf398026bd513505cf70d0294ea3cc8604592fc139" Mar 20 08:25:53 crc kubenswrapper[4903]: I0320 08:25:53.900787 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-spjl5_433bd27c-a67a-4487-b09e-523fd9b34b8f/kube-multus-additional-cni-plugins/0.log" Mar 20 08:25:53 crc kubenswrapper[4903]: I0320 08:25:53.900840 4903 generic.go:334] "Generic (PLEG): container finished" podID="433bd27c-a67a-4487-b09e-523fd9b34b8f" containerID="1ba01392693cb27f963e41acb1cb1af6da61169459877cc07eb4972d80391867" exitCode=137 Mar 20 08:25:53 crc kubenswrapper[4903]: I0320 08:25:53.900869 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-spjl5" event={"ID":"433bd27c-a67a-4487-b09e-523fd9b34b8f","Type":"ContainerDied","Data":"1ba01392693cb27f963e41acb1cb1af6da61169459877cc07eb4972d80391867"} Mar 20 08:25:53 crc kubenswrapper[4903]: I0320 08:25:53.910793 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-d9b799b66-r8c92" Mar 20 08:25:53 crc kubenswrapper[4903]: I0320 08:25:53.961445 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9j5zd\" (UniqueName: \"kubernetes.io/projected/8b3f009c-5cb8-4f58-9f0c-8c1e165823a4-kube-api-access-9j5zd\") pod \"8b3f009c-5cb8-4f58-9f0c-8c1e165823a4\" (UID: \"8b3f009c-5cb8-4f58-9f0c-8c1e165823a4\") " Mar 20 08:25:53 crc kubenswrapper[4903]: I0320 08:25:53.961528 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8b3f009c-5cb8-4f58-9f0c-8c1e165823a4-config\") pod \"8b3f009c-5cb8-4f58-9f0c-8c1e165823a4\" (UID: \"8b3f009c-5cb8-4f58-9f0c-8c1e165823a4\") " Mar 20 08:25:53 crc kubenswrapper[4903]: I0320 08:25:53.961553 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8b3f009c-5cb8-4f58-9f0c-8c1e165823a4-client-ca\") pod \"8b3f009c-5cb8-4f58-9f0c-8c1e165823a4\" (UID: \"8b3f009c-5cb8-4f58-9f0c-8c1e165823a4\") " Mar 20 08:25:53 crc kubenswrapper[4903]: I0320 08:25:53.961633 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8b3f009c-5cb8-4f58-9f0c-8c1e165823a4-serving-cert\") pod \"8b3f009c-5cb8-4f58-9f0c-8c1e165823a4\" (UID: \"8b3f009c-5cb8-4f58-9f0c-8c1e165823a4\") " Mar 20 08:25:53 crc kubenswrapper[4903]: I0320 08:25:53.962742 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8b3f009c-5cb8-4f58-9f0c-8c1e165823a4-client-ca" (OuterVolumeSpecName: "client-ca") pod "8b3f009c-5cb8-4f58-9f0c-8c1e165823a4" (UID: "8b3f009c-5cb8-4f58-9f0c-8c1e165823a4"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:25:53 crc kubenswrapper[4903]: I0320 08:25:53.962786 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8b3f009c-5cb8-4f58-9f0c-8c1e165823a4-config" (OuterVolumeSpecName: "config") pod "8b3f009c-5cb8-4f58-9f0c-8c1e165823a4" (UID: "8b3f009c-5cb8-4f58-9f0c-8c1e165823a4"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:25:53 crc kubenswrapper[4903]: I0320 08:25:53.965404 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-585d79454d-jxlmt"] Mar 20 08:25:53 crc kubenswrapper[4903]: E0320 08:25:53.965809 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa79c2ee-4638-4400-8644-3b28f91e8497" containerName="pruner" Mar 20 08:25:53 crc kubenswrapper[4903]: I0320 08:25:53.965841 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa79c2ee-4638-4400-8644-3b28f91e8497" containerName="pruner" Mar 20 08:25:53 crc kubenswrapper[4903]: E0320 08:25:53.965875 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6fde2d8e-b4fc-4ea3-ac92-cca05cb311f9" containerName="pruner" Mar 20 08:25:53 crc kubenswrapper[4903]: I0320 08:25:53.965887 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fde2d8e-b4fc-4ea3-ac92-cca05cb311f9" containerName="pruner" Mar 20 08:25:53 crc kubenswrapper[4903]: E0320 08:25:53.967784 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b3f009c-5cb8-4f58-9f0c-8c1e165823a4" containerName="route-controller-manager" Mar 20 08:25:53 crc kubenswrapper[4903]: I0320 08:25:53.967812 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b3f009c-5cb8-4f58-9f0c-8c1e165823a4" containerName="route-controller-manager" Mar 20 08:25:53 crc kubenswrapper[4903]: I0320 08:25:53.967947 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="6fde2d8e-b4fc-4ea3-ac92-cca05cb311f9" containerName="pruner" Mar 20 08:25:53 crc kubenswrapper[4903]: I0320 08:25:53.967978 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b3f009c-5cb8-4f58-9f0c-8c1e165823a4" containerName="route-controller-manager" Mar 20 08:25:53 crc kubenswrapper[4903]: I0320 08:25:53.981732 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b3f009c-5cb8-4f58-9f0c-8c1e165823a4-kube-api-access-9j5zd" (OuterVolumeSpecName: "kube-api-access-9j5zd") pod "8b3f009c-5cb8-4f58-9f0c-8c1e165823a4" (UID: "8b3f009c-5cb8-4f58-9f0c-8c1e165823a4"). InnerVolumeSpecName "kube-api-access-9j5zd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:25:53 crc kubenswrapper[4903]: I0320 08:25:53.981873 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa79c2ee-4638-4400-8644-3b28f91e8497" containerName="pruner" Mar 20 08:25:53 crc kubenswrapper[4903]: I0320 08:25:53.982971 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-585d79454d-jxlmt" Mar 20 08:25:53 crc kubenswrapper[4903]: I0320 08:25:53.989276 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b3f009c-5cb8-4f58-9f0c-8c1e165823a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8b3f009c-5cb8-4f58-9f0c-8c1e165823a4" (UID: "8b3f009c-5cb8-4f58-9f0c-8c1e165823a4"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:25:53 crc kubenswrapper[4903]: I0320 08:25:53.992128 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-585d79454d-jxlmt"] Mar 20 08:25:54 crc kubenswrapper[4903]: I0320 08:25:54.063487 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0a6fcd5-f9de-49c8-8802-ee2ccfa1a77e-config\") pod \"route-controller-manager-585d79454d-jxlmt\" (UID: \"d0a6fcd5-f9de-49c8-8802-ee2ccfa1a77e\") " pod="openshift-route-controller-manager/route-controller-manager-585d79454d-jxlmt" Mar 20 08:25:54 crc kubenswrapper[4903]: I0320 08:25:54.063585 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d2xbv\" (UniqueName: \"kubernetes.io/projected/d0a6fcd5-f9de-49c8-8802-ee2ccfa1a77e-kube-api-access-d2xbv\") pod \"route-controller-manager-585d79454d-jxlmt\" (UID: \"d0a6fcd5-f9de-49c8-8802-ee2ccfa1a77e\") " pod="openshift-route-controller-manager/route-controller-manager-585d79454d-jxlmt" Mar 20 08:25:54 crc kubenswrapper[4903]: I0320 08:25:54.063619 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d0a6fcd5-f9de-49c8-8802-ee2ccfa1a77e-client-ca\") pod \"route-controller-manager-585d79454d-jxlmt\" (UID: \"d0a6fcd5-f9de-49c8-8802-ee2ccfa1a77e\") " pod="openshift-route-controller-manager/route-controller-manager-585d79454d-jxlmt" Mar 20 08:25:54 crc kubenswrapper[4903]: I0320 08:25:54.063648 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d0a6fcd5-f9de-49c8-8802-ee2ccfa1a77e-serving-cert\") pod \"route-controller-manager-585d79454d-jxlmt\" (UID: \"d0a6fcd5-f9de-49c8-8802-ee2ccfa1a77e\") " pod="openshift-route-controller-manager/route-controller-manager-585d79454d-jxlmt" Mar 20 08:25:54 crc kubenswrapper[4903]: I0320 08:25:54.063701 4903 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8b3f009c-5cb8-4f58-9f0c-8c1e165823a4-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 08:25:54 crc kubenswrapper[4903]: I0320 08:25:54.063717 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9j5zd\" (UniqueName: \"kubernetes.io/projected/8b3f009c-5cb8-4f58-9f0c-8c1e165823a4-kube-api-access-9j5zd\") on node \"crc\" DevicePath \"\"" Mar 20 08:25:54 crc kubenswrapper[4903]: I0320 08:25:54.063727 4903 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8b3f009c-5cb8-4f58-9f0c-8c1e165823a4-config\") on node \"crc\" DevicePath \"\"" Mar 20 08:25:54 crc kubenswrapper[4903]: I0320 08:25:54.063735 4903 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8b3f009c-5cb8-4f58-9f0c-8c1e165823a4-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 08:25:54 crc kubenswrapper[4903]: I0320 08:25:54.166008 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0a6fcd5-f9de-49c8-8802-ee2ccfa1a77e-config\") pod \"route-controller-manager-585d79454d-jxlmt\" (UID: \"d0a6fcd5-f9de-49c8-8802-ee2ccfa1a77e\") " pod="openshift-route-controller-manager/route-controller-manager-585d79454d-jxlmt" Mar 
20 08:25:54 crc kubenswrapper[4903]: I0320 08:25:54.166104 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d2xbv\" (UniqueName: \"kubernetes.io/projected/d0a6fcd5-f9de-49c8-8802-ee2ccfa1a77e-kube-api-access-d2xbv\") pod \"route-controller-manager-585d79454d-jxlmt\" (UID: \"d0a6fcd5-f9de-49c8-8802-ee2ccfa1a77e\") " pod="openshift-route-controller-manager/route-controller-manager-585d79454d-jxlmt" Mar 20 08:25:54 crc kubenswrapper[4903]: I0320 08:25:54.166144 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d0a6fcd5-f9de-49c8-8802-ee2ccfa1a77e-client-ca\") pod \"route-controller-manager-585d79454d-jxlmt\" (UID: \"d0a6fcd5-f9de-49c8-8802-ee2ccfa1a77e\") " pod="openshift-route-controller-manager/route-controller-manager-585d79454d-jxlmt" Mar 20 08:25:54 crc kubenswrapper[4903]: I0320 08:25:54.166182 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d0a6fcd5-f9de-49c8-8802-ee2ccfa1a77e-serving-cert\") pod \"route-controller-manager-585d79454d-jxlmt\" (UID: \"d0a6fcd5-f9de-49c8-8802-ee2ccfa1a77e\") " pod="openshift-route-controller-manager/route-controller-manager-585d79454d-jxlmt" Mar 20 08:25:54 crc kubenswrapper[4903]: I0320 08:25:54.167308 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d0a6fcd5-f9de-49c8-8802-ee2ccfa1a77e-client-ca\") pod \"route-controller-manager-585d79454d-jxlmt\" (UID: \"d0a6fcd5-f9de-49c8-8802-ee2ccfa1a77e\") " pod="openshift-route-controller-manager/route-controller-manager-585d79454d-jxlmt" Mar 20 08:25:54 crc kubenswrapper[4903]: I0320 08:25:54.167405 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0a6fcd5-f9de-49c8-8802-ee2ccfa1a77e-config\") pod \"route-controller-manager-585d79454d-jxlmt\" (UID: \"d0a6fcd5-f9de-49c8-8802-ee2ccfa1a77e\") " pod="openshift-route-controller-manager/route-controller-manager-585d79454d-jxlmt" Mar 20 08:25:54 crc kubenswrapper[4903]: I0320 08:25:54.170206 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d0a6fcd5-f9de-49c8-8802-ee2ccfa1a77e-serving-cert\") pod \"route-controller-manager-585d79454d-jxlmt\" (UID: \"d0a6fcd5-f9de-49c8-8802-ee2ccfa1a77e\") " pod="openshift-route-controller-manager/route-controller-manager-585d79454d-jxlmt" Mar 20 08:25:54 crc kubenswrapper[4903]: I0320 08:25:54.186886 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d2xbv\" (UniqueName: \"kubernetes.io/projected/d0a6fcd5-f9de-49c8-8802-ee2ccfa1a77e-kube-api-access-d2xbv\") pod \"route-controller-manager-585d79454d-jxlmt\" (UID: \"d0a6fcd5-f9de-49c8-8802-ee2ccfa1a77e\") " pod="openshift-route-controller-manager/route-controller-manager-585d79454d-jxlmt" Mar 20 08:25:54 crc kubenswrapper[4903]: I0320 08:25:54.324387 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-585d79454d-jxlmt" Mar 20 08:25:54 crc kubenswrapper[4903]: I0320 08:25:54.691264 4903 patch_prober.go:28] interesting pod/controller-manager-6d874b96b-vslql container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.44:8443/healthz\": dial tcp 10.217.0.44:8443: connect: connection refused" start-of-body= Mar 20 08:25:54 crc kubenswrapper[4903]: I0320 08:25:54.691703 4903 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-6d874b96b-vslql" podUID="578afe8a-2374-4893-bd24-5048cd759a3a" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.44:8443/healthz\": dial tcp 10.217.0.44:8443: connect: connection refused" Mar 20 08:25:54 crc kubenswrapper[4903]: I0320 08:25:54.908781 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-d9b799b66-r8c92" Mar 20 08:25:54 crc kubenswrapper[4903]: I0320 08:25:54.950311 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-d9b799b66-r8c92"] Mar 20 08:25:54 crc kubenswrapper[4903]: I0320 08:25:54.953750 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-d9b799b66-r8c92"] Mar 20 08:25:55 crc kubenswrapper[4903]: I0320 08:25:55.500546 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8b3f009c-5cb8-4f58-9f0c-8c1e165823a4" path="/var/lib/kubelet/pods/8b3f009c-5cb8-4f58-9f0c-8c1e165823a4/volumes" Mar 20 08:25:55 crc kubenswrapper[4903]: I0320 08:25:55.981133 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-spjl5_433bd27c-a67a-4487-b09e-523fd9b34b8f/kube-multus-additional-cni-plugins/0.log" Mar 20 08:25:55 crc kubenswrapper[4903]: I0320 08:25:55.981499 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-spjl5" Mar 20 08:25:56 crc kubenswrapper[4903]: E0320 08:25:56.046606 4903 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Mar 20 08:25:56 crc kubenswrapper[4903]: E0320 08:25:56.046893 4903 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-g98zj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-nwkzx_openshift-marketplace(e74b75cf-cad8-4b67-91b7-3926096e09f8): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 20 08:25:56 crc kubenswrapper[4903]: E0320 08:25:56.048188 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-nwkzx" podUID="e74b75cf-cad8-4b67-91b7-3926096e09f8" Mar 20 08:25:56 crc kubenswrapper[4903]: E0320 08:25:56.063321 4903 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Mar 20 08:25:56 crc kubenswrapper[4903]: E0320 08:25:56.063469 4903 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8sdsc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-l45gc_openshift-marketplace(efb0ecbf-eb11-4834-8e12-668b3b9f64c8): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 20 08:25:56 crc kubenswrapper[4903]: E0320 08:25:56.064870 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-l45gc" podUID="efb0ecbf-eb11-4834-8e12-668b3b9f64c8" Mar 20 08:25:56 crc kubenswrapper[4903]: I0320 08:25:56.099378 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/433bd27c-a67a-4487-b09e-523fd9b34b8f-tuning-conf-dir\") pod \"433bd27c-a67a-4487-b09e-523fd9b34b8f\" (UID: \"433bd27c-a67a-4487-b09e-523fd9b34b8f\") " Mar 20 08:25:56 crc kubenswrapper[4903]: I0320 08:25:56.099465 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jlflt\" (UniqueName: \"kubernetes.io/projected/433bd27c-a67a-4487-b09e-523fd9b34b8f-kube-api-access-jlflt\") pod \"433bd27c-a67a-4487-b09e-523fd9b34b8f\" (UID: \"433bd27c-a67a-4487-b09e-523fd9b34b8f\") " Mar 20 08:25:56 crc kubenswrapper[4903]: I0320 08:25:56.099535 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/433bd27c-a67a-4487-b09e-523fd9b34b8f-ready\") pod \"433bd27c-a67a-4487-b09e-523fd9b34b8f\" (UID: \"433bd27c-a67a-4487-b09e-523fd9b34b8f\") " Mar 20 08:25:56 crc kubenswrapper[4903]: I0320 08:25:56.099559 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/433bd27c-a67a-4487-b09e-523fd9b34b8f-tuning-conf-dir" (OuterVolumeSpecName: "tuning-conf-dir") pod "433bd27c-a67a-4487-b09e-523fd9b34b8f" (UID: "433bd27c-a67a-4487-b09e-523fd9b34b8f"). InnerVolumeSpecName "tuning-conf-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 08:25:56 crc kubenswrapper[4903]: I0320 08:25:56.099587 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/433bd27c-a67a-4487-b09e-523fd9b34b8f-cni-sysctl-allowlist\") pod \"433bd27c-a67a-4487-b09e-523fd9b34b8f\" (UID: \"433bd27c-a67a-4487-b09e-523fd9b34b8f\") " Mar 20 08:25:56 crc kubenswrapper[4903]: I0320 08:25:56.099867 4903 reconciler_common.go:293] "Volume detached for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/433bd27c-a67a-4487-b09e-523fd9b34b8f-tuning-conf-dir\") on node \"crc\" DevicePath \"\"" Mar 20 08:25:56 crc kubenswrapper[4903]: I0320 08:25:56.099971 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/433bd27c-a67a-4487-b09e-523fd9b34b8f-ready" (OuterVolumeSpecName: "ready") pod "433bd27c-a67a-4487-b09e-523fd9b34b8f" (UID: "433bd27c-a67a-4487-b09e-523fd9b34b8f"). InnerVolumeSpecName "ready". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 08:25:56 crc kubenswrapper[4903]: I0320 08:25:56.100502 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/433bd27c-a67a-4487-b09e-523fd9b34b8f-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "433bd27c-a67a-4487-b09e-523fd9b34b8f" (UID: "433bd27c-a67a-4487-b09e-523fd9b34b8f"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:25:56 crc kubenswrapper[4903]: I0320 08:25:56.113694 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/433bd27c-a67a-4487-b09e-523fd9b34b8f-kube-api-access-jlflt" (OuterVolumeSpecName: "kube-api-access-jlflt") pod "433bd27c-a67a-4487-b09e-523fd9b34b8f" (UID: "433bd27c-a67a-4487-b09e-523fd9b34b8f"). InnerVolumeSpecName "kube-api-access-jlflt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:25:56 crc kubenswrapper[4903]: I0320 08:25:56.114855 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6d874b96b-vslql" Mar 20 08:25:56 crc kubenswrapper[4903]: I0320 08:25:56.202428 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/578afe8a-2374-4893-bd24-5048cd759a3a-client-ca\") pod \"578afe8a-2374-4893-bd24-5048cd759a3a\" (UID: \"578afe8a-2374-4893-bd24-5048cd759a3a\") " Mar 20 08:25:56 crc kubenswrapper[4903]: I0320 08:25:56.202481 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/578afe8a-2374-4893-bd24-5048cd759a3a-serving-cert\") pod \"578afe8a-2374-4893-bd24-5048cd759a3a\" (UID: \"578afe8a-2374-4893-bd24-5048cd759a3a\") " Mar 20 08:25:56 crc kubenswrapper[4903]: I0320 08:25:56.202570 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/578afe8a-2374-4893-bd24-5048cd759a3a-config\") pod \"578afe8a-2374-4893-bd24-5048cd759a3a\" (UID: \"578afe8a-2374-4893-bd24-5048cd759a3a\") " Mar 20 08:25:56 crc kubenswrapper[4903]: I0320 08:25:56.202616 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9j9lj\" (UniqueName: \"kubernetes.io/projected/578afe8a-2374-4893-bd24-5048cd759a3a-kube-api-access-9j9lj\") pod \"578afe8a-2374-4893-bd24-5048cd759a3a\" (UID: \"578afe8a-2374-4893-bd24-5048cd759a3a\") " Mar 20 08:25:56 crc kubenswrapper[4903]: I0320 08:25:56.202726 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/578afe8a-2374-4893-bd24-5048cd759a3a-proxy-ca-bundles\") pod \"578afe8a-2374-4893-bd24-5048cd759a3a\" (UID: \"578afe8a-2374-4893-bd24-5048cd759a3a\") " Mar 20 08:25:56 crc kubenswrapper[4903]: I0320 08:25:56.203001 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jlflt\" (UniqueName: \"kubernetes.io/projected/433bd27c-a67a-4487-b09e-523fd9b34b8f-kube-api-access-jlflt\") on node \"crc\" DevicePath \"\"" Mar 20 08:25:56 crc kubenswrapper[4903]: I0320 08:25:56.203014 4903 reconciler_common.go:293] "Volume detached for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/433bd27c-a67a-4487-b09e-523fd9b34b8f-ready\") on node \"crc\" DevicePath \"\"" Mar 20 08:25:56 crc kubenswrapper[4903]: I0320 08:25:56.203025 4903 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/433bd27c-a67a-4487-b09e-523fd9b34b8f-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Mar 20 08:25:56 crc kubenswrapper[4903]: I0320 08:25:56.204289 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/578afe8a-2374-4893-bd24-5048cd759a3a-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "578afe8a-2374-4893-bd24-5048cd759a3a" (UID: "578afe8a-2374-4893-bd24-5048cd759a3a"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:25:56 crc kubenswrapper[4903]: I0320 08:25:56.204737 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/578afe8a-2374-4893-bd24-5048cd759a3a-client-ca" (OuterVolumeSpecName: "client-ca") pod "578afe8a-2374-4893-bd24-5048cd759a3a" (UID: "578afe8a-2374-4893-bd24-5048cd759a3a"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:25:56 crc kubenswrapper[4903]: I0320 08:25:56.206742 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/578afe8a-2374-4893-bd24-5048cd759a3a-config" (OuterVolumeSpecName: "config") pod "578afe8a-2374-4893-bd24-5048cd759a3a" (UID: "578afe8a-2374-4893-bd24-5048cd759a3a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:25:56 crc kubenswrapper[4903]: I0320 08:25:56.223270 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/578afe8a-2374-4893-bd24-5048cd759a3a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "578afe8a-2374-4893-bd24-5048cd759a3a" (UID: "578afe8a-2374-4893-bd24-5048cd759a3a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:25:56 crc kubenswrapper[4903]: I0320 08:25:56.225174 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/578afe8a-2374-4893-bd24-5048cd759a3a-kube-api-access-9j9lj" (OuterVolumeSpecName: "kube-api-access-9j9lj") pod "578afe8a-2374-4893-bd24-5048cd759a3a" (UID: "578afe8a-2374-4893-bd24-5048cd759a3a"). InnerVolumeSpecName "kube-api-access-9j9lj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:25:56 crc kubenswrapper[4903]: I0320 08:25:56.299331 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-585d79454d-jxlmt"] Mar 20 08:25:56 crc kubenswrapper[4903]: I0320 08:25:56.304769 4903 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/578afe8a-2374-4893-bd24-5048cd759a3a-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 20 08:25:56 crc kubenswrapper[4903]: I0320 08:25:56.304796 4903 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/578afe8a-2374-4893-bd24-5048cd759a3a-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 08:25:56 crc kubenswrapper[4903]: I0320 08:25:56.304806 4903 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/578afe8a-2374-4893-bd24-5048cd759a3a-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 08:25:56 crc kubenswrapper[4903]: I0320 08:25:56.304816 4903 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/578afe8a-2374-4893-bd24-5048cd759a3a-config\") on node \"crc\" DevicePath \"\"" Mar 20 08:25:56 crc kubenswrapper[4903]: I0320 08:25:56.304829 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9j9lj\" (UniqueName: \"kubernetes.io/projected/578afe8a-2374-4893-bd24-5048cd759a3a-kube-api-access-9j9lj\") on node \"crc\" DevicePath \"\"" Mar 20 08:25:56 crc kubenswrapper[4903]: I0320 08:25:56.458356 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-x8c9s" Mar 20 08:25:56 crc kubenswrapper[4903]: E0320 08:25:56.598551 4903 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod738071fe_1a7b_403b_ab94_8e88d5d79ab4.slice/crio-7ad31f988327c74066024ff689167ac94dad0f44511982b2fd5bbd3bb8bb0601.scope\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd209b7b2_53ad_4780_a13e_65d2b0cb5189.slice/crio-conmon-a4f90216b0c31bdff22706d0d8399c18d91e85da60af0917c1c404451d5382b1.scope\": RecentStats: unable to find data in memory cache]" Mar 20 08:25:56 crc kubenswrapper[4903]: I0320 08:25:56.926156 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6d874b96b-vslql" Mar 20 08:25:56 crc kubenswrapper[4903]: I0320 08:25:56.926265 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6d874b96b-vslql" event={"ID":"578afe8a-2374-4893-bd24-5048cd759a3a","Type":"ContainerDied","Data":"d1c598900da562b70b12508fbcfb161d0c9061ed5f14ce8fa9556d0d8c6126eb"} Mar 20 08:25:56 crc kubenswrapper[4903]: I0320 08:25:56.926376 4903 scope.go:117] "RemoveContainer" containerID="fd8ad6243d7f25570b08e5fd90182ed4334965819f599d35071384d3269ac63d" Mar 20 08:25:56 crc kubenswrapper[4903]: I0320 08:25:56.928679 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-spjl5_433bd27c-a67a-4487-b09e-523fd9b34b8f/kube-multus-additional-cni-plugins/0.log" Mar 20 08:25:56 crc kubenswrapper[4903]: I0320 08:25:56.928764 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-spjl5" event={"ID":"433bd27c-a67a-4487-b09e-523fd9b34b8f","Type":"ContainerDied","Data":"61ad0ea74b98989627f624305ad7c0747e9d1d6db3ce941d9a353588a1af634c"} Mar 20 08:25:56 crc kubenswrapper[4903]: I0320 08:25:56.928833 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-spjl5" Mar 20 08:25:56 crc kubenswrapper[4903]: I0320 08:25:56.933519 4903 generic.go:334] "Generic (PLEG): container finished" podID="d209b7b2-53ad-4780-a13e-65d2b0cb5189" containerID="a4f90216b0c31bdff22706d0d8399c18d91e85da60af0917c1c404451d5382b1" exitCode=0 Mar 20 08:25:56 crc kubenswrapper[4903]: I0320 08:25:56.933673 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bmcws" event={"ID":"d209b7b2-53ad-4780-a13e-65d2b0cb5189","Type":"ContainerDied","Data":"a4f90216b0c31bdff22706d0d8399c18d91e85da60af0917c1c404451d5382b1"} Mar 20 08:25:56 crc kubenswrapper[4903]: I0320 08:25:56.958602 4903 scope.go:117] "RemoveContainer" containerID="1ba01392693cb27f963e41acb1cb1af6da61169459877cc07eb4972d80391867" Mar 20 08:25:56 crc kubenswrapper[4903]: I0320 08:25:56.965382 4903 generic.go:334] "Generic (PLEG): container finished" podID="48da3e1a-ed3d-4048-8f10-39f1cc56d9af" containerID="3794c660e6384e78a401f187b22e15251d1f31b133b3e9616e1649b816f21609" exitCode=0 Mar 20 08:25:56 crc kubenswrapper[4903]: I0320 08:25:56.965502 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cxn5d" event={"ID":"48da3e1a-ed3d-4048-8f10-39f1cc56d9af","Type":"ContainerDied","Data":"3794c660e6384e78a401f187b22e15251d1f31b133b3e9616e1649b816f21609"} Mar 20 08:25:56 crc kubenswrapper[4903]: I0320 08:25:56.978593 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-585d79454d-jxlmt" event={"ID":"d0a6fcd5-f9de-49c8-8802-ee2ccfa1a77e","Type":"ContainerStarted","Data":"88c6e5b0a5d400533e31429ee73c5bab50d41296c6051bf8940c97e9f3316db6"} Mar 20 08:25:56 crc kubenswrapper[4903]: I0320 08:25:56.978656 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-route-controller-manager/route-controller-manager-585d79454d-jxlmt" event={"ID":"d0a6fcd5-f9de-49c8-8802-ee2ccfa1a77e","Type":"ContainerStarted","Data":"f5affc269af10dccdc47aaa65e111afd3cdb1b0988d5d93275ab75aa0543f34f"} Mar 20 08:25:56 crc kubenswrapper[4903]: I0320 08:25:56.978909 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-585d79454d-jxlmt" Mar 20 08:25:56 crc kubenswrapper[4903]: I0320 08:25:56.980907 4903 generic.go:334] "Generic (PLEG): container finished" podID="738071fe-1a7b-403b-ab94-8e88d5d79ab4" containerID="7ad31f988327c74066024ff689167ac94dad0f44511982b2fd5bbd3bb8bb0601" exitCode=0 Mar 20 08:25:56 crc kubenswrapper[4903]: I0320 08:25:56.981083 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fzcbm" event={"ID":"738071fe-1a7b-403b-ab94-8e88d5d79ab4","Type":"ContainerDied","Data":"7ad31f988327c74066024ff689167ac94dad0f44511982b2fd5bbd3bb8bb0601"} Mar 20 08:25:56 crc kubenswrapper[4903]: I0320 08:25:56.985767 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-585d79454d-jxlmt" Mar 20 08:25:56 crc kubenswrapper[4903]: I0320 08:25:56.997396 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pbz9z" event={"ID":"37ce866b-65c1-454a-b346-43c2ebe9a2e0","Type":"ContainerStarted","Data":"17551a49474d4f5b4cfa94901b8da62deb9ba3fccfb7284e5faa890432d703bc"} Mar 20 08:25:57 crc kubenswrapper[4903]: I0320 08:25:57.010647 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cbqcg" event={"ID":"e6a63c01-6cfe-4d24-835a-4fa810111888","Type":"ContainerStarted","Data":"a871d2c5ad1b7a3ad9d78333fd2d03aeba0b8bf48c3933d086c8b9cb97f1afc7"} Mar 20 08:25:57 crc kubenswrapper[4903]: I0320 08:25:57.011889 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-spjl5"] Mar 20 08:25:57 crc kubenswrapper[4903]: I0320 08:25:57.030462 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-spjl5"] Mar 20 08:25:57 crc kubenswrapper[4903]: I0320 08:25:57.035524 4903 generic.go:334] "Generic (PLEG): container finished" podID="85039fed-a0e7-4cea-834c-930d1c9974a1" containerID="c560f34e05dadace2322c63ec340136826d19f79777ff84b50cc5c8892f3a460" exitCode=0 Mar 20 08:25:57 crc kubenswrapper[4903]: I0320 08:25:57.035605 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-crf95" event={"ID":"85039fed-a0e7-4cea-834c-930d1c9974a1","Type":"ContainerDied","Data":"c560f34e05dadace2322c63ec340136826d19f79777ff84b50cc5c8892f3a460"} Mar 20 08:25:57 crc kubenswrapper[4903]: E0320 08:25:57.043979 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-l45gc" podUID="efb0ecbf-eb11-4834-8e12-668b3b9f64c8" Mar 20 08:25:57 crc kubenswrapper[4903]: E0320 08:25:57.045013 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-nwkzx" 
podUID="e74b75cf-cad8-4b67-91b7-3926096e09f8" Mar 20 08:25:57 crc kubenswrapper[4903]: I0320 08:25:57.045434 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6d874b96b-vslql"] Mar 20 08:25:57 crc kubenswrapper[4903]: I0320 08:25:57.062108 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-6d874b96b-vslql"] Mar 20 08:25:57 crc kubenswrapper[4903]: I0320 08:25:57.148622 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-585d79454d-jxlmt" podStartSLOduration=15.148594558 podStartE2EDuration="15.148594558s" podCreationTimestamp="2026-03-20 08:25:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:25:57.123328621 +0000 UTC m=+182.340228936" watchObservedRunningTime="2026-03-20 08:25:57.148594558 +0000 UTC m=+182.365494873" Mar 20 08:25:57 crc kubenswrapper[4903]: I0320 08:25:57.498106 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="433bd27c-a67a-4487-b09e-523fd9b34b8f" path="/var/lib/kubelet/pods/433bd27c-a67a-4487-b09e-523fd9b34b8f/volumes" Mar 20 08:25:57 crc kubenswrapper[4903]: I0320 08:25:57.498679 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="578afe8a-2374-4893-bd24-5048cd759a3a" path="/var/lib/kubelet/pods/578afe8a-2374-4893-bd24-5048cd759a3a/volumes" Mar 20 08:25:58 crc kubenswrapper[4903]: I0320 08:25:58.046136 4903 generic.go:334] "Generic (PLEG): container finished" podID="37ce866b-65c1-454a-b346-43c2ebe9a2e0" containerID="17551a49474d4f5b4cfa94901b8da62deb9ba3fccfb7284e5faa890432d703bc" exitCode=0 Mar 20 08:25:58 crc kubenswrapper[4903]: I0320 08:25:58.046479 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pbz9z" event={"ID":"37ce866b-65c1-454a-b346-43c2ebe9a2e0","Type":"ContainerDied","Data":"17551a49474d4f5b4cfa94901b8da62deb9ba3fccfb7284e5faa890432d703bc"} Mar 20 08:25:58 crc kubenswrapper[4903]: I0320 08:25:58.048564 4903 generic.go:334] "Generic (PLEG): container finished" podID="e6a63c01-6cfe-4d24-835a-4fa810111888" containerID="a871d2c5ad1b7a3ad9d78333fd2d03aeba0b8bf48c3933d086c8b9cb97f1afc7" exitCode=0 Mar 20 08:25:58 crc kubenswrapper[4903]: I0320 08:25:58.048631 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cbqcg" event={"ID":"e6a63c01-6cfe-4d24-835a-4fa810111888","Type":"ContainerDied","Data":"a871d2c5ad1b7a3ad9d78333fd2d03aeba0b8bf48c3933d086c8b9cb97f1afc7"} Mar 20 08:25:58 crc kubenswrapper[4903]: I0320 08:25:58.513934 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Mar 20 08:25:59 crc kubenswrapper[4903]: I0320 08:25:59.060613 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bmcws" event={"ID":"d209b7b2-53ad-4780-a13e-65d2b0cb5189","Type":"ContainerStarted","Data":"e0286b99a7999c1ae210fe1909961418702574edc6dac678391994f6061a97ee"} Mar 20 08:25:59 crc kubenswrapper[4903]: I0320 08:25:59.091908 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=1.091886702 podStartE2EDuration="1.091886702s" podCreationTimestamp="2026-03-20 08:25:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-03-20 08:25:59.085954955 +0000 UTC m=+184.302855270" watchObservedRunningTime="2026-03-20 08:25:59.091886702 +0000 UTC m=+184.308787017" Mar 20 08:25:59 crc kubenswrapper[4903]: I0320 08:25:59.121512 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-bmcws" podStartSLOduration=2.334041776 podStartE2EDuration="34.121490001s" podCreationTimestamp="2026-03-20 08:25:25 +0000 UTC" firstStartedPulling="2026-03-20 08:25:26.453159081 +0000 UTC m=+151.670059386" lastFinishedPulling="2026-03-20 08:25:58.240607306 +0000 UTC m=+183.457507611" observedRunningTime="2026-03-20 08:25:59.118568609 +0000 UTC m=+184.335468924" watchObservedRunningTime="2026-03-20 08:25:59.121490001 +0000 UTC m=+184.338390336" Mar 20 08:25:59 crc kubenswrapper[4903]: I0320 08:25:59.206833 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 20 08:25:59 crc kubenswrapper[4903]: E0320 08:25:59.207996 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="433bd27c-a67a-4487-b09e-523fd9b34b8f" containerName="kube-multus-additional-cni-plugins" Mar 20 08:25:59 crc kubenswrapper[4903]: I0320 08:25:59.208020 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="433bd27c-a67a-4487-b09e-523fd9b34b8f" containerName="kube-multus-additional-cni-plugins" Mar 20 08:25:59 crc kubenswrapper[4903]: E0320 08:25:59.208069 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="578afe8a-2374-4893-bd24-5048cd759a3a" containerName="controller-manager" Mar 20 08:25:59 crc kubenswrapper[4903]: I0320 08:25:59.208101 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="578afe8a-2374-4893-bd24-5048cd759a3a" containerName="controller-manager" Mar 20 08:25:59 crc kubenswrapper[4903]: I0320 08:25:59.208331 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="433bd27c-a67a-4487-b09e-523fd9b34b8f" containerName="kube-multus-additional-cni-plugins" Mar 20 08:25:59 crc kubenswrapper[4903]: I0320 08:25:59.208353 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="578afe8a-2374-4893-bd24-5048cd759a3a" containerName="controller-manager" Mar 20 08:25:59 crc kubenswrapper[4903]: I0320 08:25:59.208902 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 08:25:59 crc kubenswrapper[4903]: I0320 08:25:59.216269 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 20 08:25:59 crc kubenswrapper[4903]: I0320 08:25:59.217160 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 20 08:25:59 crc kubenswrapper[4903]: I0320 08:25:59.224907 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 20 08:25:59 crc kubenswrapper[4903]: I0320 08:25:59.357900 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d4534fd8-487d-48d0-ba5b-0ab55cd3e71e-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"d4534fd8-487d-48d0-ba5b-0ab55cd3e71e\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 08:25:59 crc kubenswrapper[4903]: I0320 08:25:59.357978 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d4534fd8-487d-48d0-ba5b-0ab55cd3e71e-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"d4534fd8-487d-48d0-ba5b-0ab55cd3e71e\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 08:25:59 crc kubenswrapper[4903]: I0320 08:25:59.459659 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d4534fd8-487d-48d0-ba5b-0ab55cd3e71e-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"d4534fd8-487d-48d0-ba5b-0ab55cd3e71e\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 08:25:59 crc kubenswrapper[4903]: I0320 08:25:59.459738 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d4534fd8-487d-48d0-ba5b-0ab55cd3e71e-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"d4534fd8-487d-48d0-ba5b-0ab55cd3e71e\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 08:25:59 crc kubenswrapper[4903]: I0320 08:25:59.460162 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d4534fd8-487d-48d0-ba5b-0ab55cd3e71e-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"d4534fd8-487d-48d0-ba5b-0ab55cd3e71e\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 08:25:59 crc kubenswrapper[4903]: I0320 08:25:59.484887 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d4534fd8-487d-48d0-ba5b-0ab55cd3e71e-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"d4534fd8-487d-48d0-ba5b-0ab55cd3e71e\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 08:25:59 crc kubenswrapper[4903]: I0320 08:25:59.584746 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 08:26:00 crc kubenswrapper[4903]: I0320 08:26:00.041150 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 20 08:26:00 crc kubenswrapper[4903]: I0320 08:26:00.080149 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"d4534fd8-487d-48d0-ba5b-0ab55cd3e71e","Type":"ContainerStarted","Data":"c51be6836e2fa8d9a0f77a81a7768f6e16dc1452ab765fc71fd36209f2ebcd49"} Mar 20 08:26:00 crc kubenswrapper[4903]: I0320 08:26:00.195939 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566586-87jzv"] Mar 20 08:26:00 crc kubenswrapper[4903]: I0320 08:26:00.197211 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566586-87jzv" Mar 20 08:26:00 crc kubenswrapper[4903]: I0320 08:26:00.202927 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 08:26:00 crc kubenswrapper[4903]: I0320 08:26:00.203416 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 08:26:00 crc kubenswrapper[4903]: I0320 08:26:00.203428 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-smc2q" Mar 20 08:26:00 crc kubenswrapper[4903]: I0320 08:26:00.207212 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566586-87jzv"] Mar 20 08:26:00 crc kubenswrapper[4903]: I0320 08:26:00.298268 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4s8hx\" (UniqueName: \"kubernetes.io/projected/a5f2446d-2562-46b0-9bdd-6d5bf42d1a7f-kube-api-access-4s8hx\") pod \"auto-csr-approver-29566586-87jzv\" (UID: \"a5f2446d-2562-46b0-9bdd-6d5bf42d1a7f\") " pod="openshift-infra/auto-csr-approver-29566586-87jzv" Mar 20 08:26:00 crc kubenswrapper[4903]: I0320 08:26:00.379476 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-7b77f4cd7b-vnwbb"] Mar 20 08:26:00 crc kubenswrapper[4903]: I0320 08:26:00.381922 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7b77f4cd7b-vnwbb" Mar 20 08:26:00 crc kubenswrapper[4903]: I0320 08:26:00.387825 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 20 08:26:00 crc kubenswrapper[4903]: I0320 08:26:00.387961 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 20 08:26:00 crc kubenswrapper[4903]: I0320 08:26:00.388131 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 20 08:26:00 crc kubenswrapper[4903]: I0320 08:26:00.389906 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 20 08:26:00 crc kubenswrapper[4903]: I0320 08:26:00.390130 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 20 08:26:00 crc kubenswrapper[4903]: I0320 08:26:00.390987 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 20 08:26:00 crc kubenswrapper[4903]: I0320 08:26:00.392389 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7b77f4cd7b-vnwbb"] Mar 20 08:26:00 crc kubenswrapper[4903]: I0320 08:26:00.400412 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 20 08:26:00 crc kubenswrapper[4903]: I0320 08:26:00.402990 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4s8hx\" (UniqueName: \"kubernetes.io/projected/a5f2446d-2562-46b0-9bdd-6d5bf42d1a7f-kube-api-access-4s8hx\") pod \"auto-csr-approver-29566586-87jzv\" (UID: \"a5f2446d-2562-46b0-9bdd-6d5bf42d1a7f\") " pod="openshift-infra/auto-csr-approver-29566586-87jzv" Mar 20 08:26:00 crc kubenswrapper[4903]: I0320 08:26:00.433138 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4s8hx\" (UniqueName: \"kubernetes.io/projected/a5f2446d-2562-46b0-9bdd-6d5bf42d1a7f-kube-api-access-4s8hx\") pod \"auto-csr-approver-29566586-87jzv\" (UID: \"a5f2446d-2562-46b0-9bdd-6d5bf42d1a7f\") " pod="openshift-infra/auto-csr-approver-29566586-87jzv" Mar 20 08:26:00 crc kubenswrapper[4903]: I0320 08:26:00.504326 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0a99e4e6-fc7c-4111-b36d-d259cae174e5-proxy-ca-bundles\") pod \"controller-manager-7b77f4cd7b-vnwbb\" (UID: \"0a99e4e6-fc7c-4111-b36d-d259cae174e5\") " pod="openshift-controller-manager/controller-manager-7b77f4cd7b-vnwbb" Mar 20 08:26:00 crc kubenswrapper[4903]: I0320 08:26:00.504443 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h42lm\" (UniqueName: \"kubernetes.io/projected/0a99e4e6-fc7c-4111-b36d-d259cae174e5-kube-api-access-h42lm\") pod \"controller-manager-7b77f4cd7b-vnwbb\" (UID: \"0a99e4e6-fc7c-4111-b36d-d259cae174e5\") " pod="openshift-controller-manager/controller-manager-7b77f4cd7b-vnwbb" Mar 20 08:26:00 crc kubenswrapper[4903]: I0320 08:26:00.504485 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/0a99e4e6-fc7c-4111-b36d-d259cae174e5-config\") pod \"controller-manager-7b77f4cd7b-vnwbb\" (UID: \"0a99e4e6-fc7c-4111-b36d-d259cae174e5\") " pod="openshift-controller-manager/controller-manager-7b77f4cd7b-vnwbb" Mar 20 08:26:00 crc kubenswrapper[4903]: I0320 08:26:00.504523 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0a99e4e6-fc7c-4111-b36d-d259cae174e5-client-ca\") pod \"controller-manager-7b77f4cd7b-vnwbb\" (UID: \"0a99e4e6-fc7c-4111-b36d-d259cae174e5\") " pod="openshift-controller-manager/controller-manager-7b77f4cd7b-vnwbb" Mar 20 08:26:00 crc kubenswrapper[4903]: I0320 08:26:00.504544 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0a99e4e6-fc7c-4111-b36d-d259cae174e5-serving-cert\") pod \"controller-manager-7b77f4cd7b-vnwbb\" (UID: \"0a99e4e6-fc7c-4111-b36d-d259cae174e5\") " pod="openshift-controller-manager/controller-manager-7b77f4cd7b-vnwbb" Mar 20 08:26:00 crc kubenswrapper[4903]: I0320 08:26:00.521206 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566586-87jzv" Mar 20 08:26:00 crc kubenswrapper[4903]: I0320 08:26:00.605897 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0a99e4e6-fc7c-4111-b36d-d259cae174e5-proxy-ca-bundles\") pod \"controller-manager-7b77f4cd7b-vnwbb\" (UID: \"0a99e4e6-fc7c-4111-b36d-d259cae174e5\") " pod="openshift-controller-manager/controller-manager-7b77f4cd7b-vnwbb" Mar 20 08:26:00 crc kubenswrapper[4903]: I0320 08:26:00.605994 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h42lm\" (UniqueName: \"kubernetes.io/projected/0a99e4e6-fc7c-4111-b36d-d259cae174e5-kube-api-access-h42lm\") pod \"controller-manager-7b77f4cd7b-vnwbb\" (UID: \"0a99e4e6-fc7c-4111-b36d-d259cae174e5\") " pod="openshift-controller-manager/controller-manager-7b77f4cd7b-vnwbb" Mar 20 08:26:00 crc kubenswrapper[4903]: I0320 08:26:00.606052 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0a99e4e6-fc7c-4111-b36d-d259cae174e5-config\") pod \"controller-manager-7b77f4cd7b-vnwbb\" (UID: \"0a99e4e6-fc7c-4111-b36d-d259cae174e5\") " pod="openshift-controller-manager/controller-manager-7b77f4cd7b-vnwbb" Mar 20 08:26:00 crc kubenswrapper[4903]: I0320 08:26:00.606097 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0a99e4e6-fc7c-4111-b36d-d259cae174e5-client-ca\") pod \"controller-manager-7b77f4cd7b-vnwbb\" (UID: \"0a99e4e6-fc7c-4111-b36d-d259cae174e5\") " pod="openshift-controller-manager/controller-manager-7b77f4cd7b-vnwbb" Mar 20 08:26:00 crc kubenswrapper[4903]: I0320 08:26:00.606118 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0a99e4e6-fc7c-4111-b36d-d259cae174e5-serving-cert\") pod \"controller-manager-7b77f4cd7b-vnwbb\" (UID: \"0a99e4e6-fc7c-4111-b36d-d259cae174e5\") " pod="openshift-controller-manager/controller-manager-7b77f4cd7b-vnwbb" Mar 20 08:26:00 crc kubenswrapper[4903]: I0320 08:26:00.608134 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/0a99e4e6-fc7c-4111-b36d-d259cae174e5-config\") pod \"controller-manager-7b77f4cd7b-vnwbb\" (UID: \"0a99e4e6-fc7c-4111-b36d-d259cae174e5\") " pod="openshift-controller-manager/controller-manager-7b77f4cd7b-vnwbb" Mar 20 08:26:00 crc kubenswrapper[4903]: I0320 08:26:00.612795 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0a99e4e6-fc7c-4111-b36d-d259cae174e5-client-ca\") pod \"controller-manager-7b77f4cd7b-vnwbb\" (UID: \"0a99e4e6-fc7c-4111-b36d-d259cae174e5\") " pod="openshift-controller-manager/controller-manager-7b77f4cd7b-vnwbb" Mar 20 08:26:00 crc kubenswrapper[4903]: I0320 08:26:00.615327 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0a99e4e6-fc7c-4111-b36d-d259cae174e5-proxy-ca-bundles\") pod \"controller-manager-7b77f4cd7b-vnwbb\" (UID: \"0a99e4e6-fc7c-4111-b36d-d259cae174e5\") " pod="openshift-controller-manager/controller-manager-7b77f4cd7b-vnwbb" Mar 20 08:26:00 crc kubenswrapper[4903]: I0320 08:26:00.629102 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h42lm\" (UniqueName: \"kubernetes.io/projected/0a99e4e6-fc7c-4111-b36d-d259cae174e5-kube-api-access-h42lm\") pod \"controller-manager-7b77f4cd7b-vnwbb\" (UID: \"0a99e4e6-fc7c-4111-b36d-d259cae174e5\") " pod="openshift-controller-manager/controller-manager-7b77f4cd7b-vnwbb" Mar 20 08:26:00 crc kubenswrapper[4903]: I0320 08:26:00.629588 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0a99e4e6-fc7c-4111-b36d-d259cae174e5-serving-cert\") pod \"controller-manager-7b77f4cd7b-vnwbb\" (UID: \"0a99e4e6-fc7c-4111-b36d-d259cae174e5\") " pod="openshift-controller-manager/controller-manager-7b77f4cd7b-vnwbb" Mar 20 08:26:00 crc kubenswrapper[4903]: I0320 08:26:00.712541 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7b77f4cd7b-vnwbb" Mar 20 08:26:00 crc kubenswrapper[4903]: I0320 08:26:00.822993 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566586-87jzv"] Mar 20 08:26:01 crc kubenswrapper[4903]: I0320 08:26:01.069167 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7b77f4cd7b-vnwbb"] Mar 20 08:26:01 crc kubenswrapper[4903]: I0320 08:26:01.094445 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7b77f4cd7b-vnwbb" event={"ID":"0a99e4e6-fc7c-4111-b36d-d259cae174e5","Type":"ContainerStarted","Data":"14e4e1136d735ed5f43f1dfe1db99c11a6c3439ffb6dc47f752d0c8572892f19"} Mar 20 08:26:01 crc kubenswrapper[4903]: I0320 08:26:01.101505 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cxn5d" event={"ID":"48da3e1a-ed3d-4048-8f10-39f1cc56d9af","Type":"ContainerStarted","Data":"71f92575d13f24b62794159747a281add9f29e61ffc595387b66c08df22bcbbc"} Mar 20 08:26:01 crc kubenswrapper[4903]: I0320 08:26:01.107006 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-crf95" event={"ID":"85039fed-a0e7-4cea-834c-930d1c9974a1","Type":"ContainerStarted","Data":"98824d1bf61034d9a4492dcd047c63878534f968ce79b76efac3d2f2180f278f"} Mar 20 08:26:01 crc kubenswrapper[4903]: I0320 08:26:01.109441 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fzcbm" event={"ID":"738071fe-1a7b-403b-ab94-8e88d5d79ab4","Type":"ContainerStarted","Data":"8887e05f19d404ccfc3eec77cd78b6a7cd83d89a3a211cad26ace360c2a7235a"} Mar 20 08:26:01 crc kubenswrapper[4903]: I0320 08:26:01.112108 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566586-87jzv" event={"ID":"a5f2446d-2562-46b0-9bdd-6d5bf42d1a7f","Type":"ContainerStarted","Data":"ff38fa65c4b24b3cdf86194c82e7070487815efea6ae19f982212b19f9864afd"} Mar 20 08:26:01 crc kubenswrapper[4903]: I0320 08:26:01.131786 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pbz9z" event={"ID":"37ce866b-65c1-454a-b346-43c2ebe9a2e0","Type":"ContainerStarted","Data":"34589ee220193f734c20277cc438dbc9d5ccc148cb8745411dfade83ac1d0112"} Mar 20 08:26:01 crc kubenswrapper[4903]: I0320 08:26:01.132364 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-cxn5d" podStartSLOduration=3.452807343 podStartE2EDuration="35.132345496s" podCreationTimestamp="2026-03-20 08:25:26 +0000 UTC" firstStartedPulling="2026-03-20 08:25:28.552194066 +0000 UTC m=+153.769094381" lastFinishedPulling="2026-03-20 08:26:00.231732209 +0000 UTC m=+185.448632534" observedRunningTime="2026-03-20 08:26:01.123663094 +0000 UTC m=+186.340563409" watchObservedRunningTime="2026-03-20 08:26:01.132345496 +0000 UTC m=+186.349245811" Mar 20 08:26:01 crc kubenswrapper[4903]: I0320 08:26:01.134103 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"d4534fd8-487d-48d0-ba5b-0ab55cd3e71e","Type":"ContainerStarted","Data":"7ea6ac5690a8ef9a4de0e80c2aa27398fbb9153307c26a7b910f740d2e30b344"} Mar 20 08:26:01 crc kubenswrapper[4903]: I0320 08:26:01.152414 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cbqcg" 
event={"ID":"e6a63c01-6cfe-4d24-835a-4fa810111888","Type":"ContainerStarted","Data":"be068f25179e56b5b511907b0084fc3cd7ea33aa66b3c41994c59ebc53169919"} Mar 20 08:26:01 crc kubenswrapper[4903]: I0320 08:26:01.176175 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-crf95" podStartSLOduration=4.295549522 podStartE2EDuration="34.176149973s" podCreationTimestamp="2026-03-20 08:25:27 +0000 UTC" firstStartedPulling="2026-03-20 08:25:29.613545674 +0000 UTC m=+154.830445989" lastFinishedPulling="2026-03-20 08:25:59.494146125 +0000 UTC m=+184.711046440" observedRunningTime="2026-03-20 08:26:01.169719063 +0000 UTC m=+186.386619398" watchObservedRunningTime="2026-03-20 08:26:01.176149973 +0000 UTC m=+186.393050288" Mar 20 08:26:01 crc kubenswrapper[4903]: I0320 08:26:01.176479 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-fzcbm" podStartSLOduration=3.095402325 podStartE2EDuration="37.176473132s" podCreationTimestamp="2026-03-20 08:25:24 +0000 UTC" firstStartedPulling="2026-03-20 08:25:26.426867665 +0000 UTC m=+151.643767980" lastFinishedPulling="2026-03-20 08:26:00.507938472 +0000 UTC m=+185.724838787" observedRunningTime="2026-03-20 08:26:01.151721419 +0000 UTC m=+186.368621724" watchObservedRunningTime="2026-03-20 08:26:01.176473132 +0000 UTC m=+186.393373447" Mar 20 08:26:01 crc kubenswrapper[4903]: I0320 08:26:01.200738 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-pbz9z" podStartSLOduration=3.610739135 podStartE2EDuration="34.200713601s" podCreationTimestamp="2026-03-20 08:25:27 +0000 UTC" firstStartedPulling="2026-03-20 08:25:29.651793044 +0000 UTC m=+154.868693349" lastFinishedPulling="2026-03-20 08:26:00.24176751 +0000 UTC m=+185.458667815" observedRunningTime="2026-03-20 08:26:01.199858377 +0000 UTC m=+186.416758692" watchObservedRunningTime="2026-03-20 08:26:01.200713601 +0000 UTC m=+186.417613916" Mar 20 08:26:01 crc kubenswrapper[4903]: I0320 08:26:01.221985 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-9-crc" podStartSLOduration=2.221964036 podStartE2EDuration="2.221964036s" podCreationTimestamp="2026-03-20 08:25:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:26:01.2188783 +0000 UTC m=+186.435778615" watchObservedRunningTime="2026-03-20 08:26:01.221964036 +0000 UTC m=+186.438864351" Mar 20 08:26:01 crc kubenswrapper[4903]: I0320 08:26:01.241463 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-cbqcg" podStartSLOduration=3.435949211 podStartE2EDuration="33.241436111s" podCreationTimestamp="2026-03-20 08:25:28 +0000 UTC" firstStartedPulling="2026-03-20 08:25:30.699265724 +0000 UTC m=+155.916166029" lastFinishedPulling="2026-03-20 08:26:00.504752614 +0000 UTC m=+185.721652929" observedRunningTime="2026-03-20 08:26:01.241103112 +0000 UTC m=+186.458003427" watchObservedRunningTime="2026-03-20 08:26:01.241436111 +0000 UTC m=+186.458336426" Mar 20 08:26:02 crc kubenswrapper[4903]: I0320 08:26:02.161523 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7b77f4cd7b-vnwbb" 
event={"ID":"0a99e4e6-fc7c-4111-b36d-d259cae174e5","Type":"ContainerStarted","Data":"2eb2f7d6bae59867fe851577b84084fd916dd5e98ab3a1ed8df48ec97d9afc08"} Mar 20 08:26:02 crc kubenswrapper[4903]: I0320 08:26:02.163467 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-7b77f4cd7b-vnwbb" Mar 20 08:26:02 crc kubenswrapper[4903]: I0320 08:26:02.165024 4903 generic.go:334] "Generic (PLEG): container finished" podID="d4534fd8-487d-48d0-ba5b-0ab55cd3e71e" containerID="7ea6ac5690a8ef9a4de0e80c2aa27398fbb9153307c26a7b910f740d2e30b344" exitCode=0 Mar 20 08:26:02 crc kubenswrapper[4903]: I0320 08:26:02.165092 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"d4534fd8-487d-48d0-ba5b-0ab55cd3e71e","Type":"ContainerDied","Data":"7ea6ac5690a8ef9a4de0e80c2aa27398fbb9153307c26a7b910f740d2e30b344"} Mar 20 08:26:02 crc kubenswrapper[4903]: I0320 08:26:02.167040 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-7b77f4cd7b-vnwbb" Mar 20 08:26:02 crc kubenswrapper[4903]: I0320 08:26:02.190711 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-7b77f4cd7b-vnwbb" podStartSLOduration=20.190686271 podStartE2EDuration="20.190686271s" podCreationTimestamp="2026-03-20 08:25:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:26:02.187913623 +0000 UTC m=+187.404813938" watchObservedRunningTime="2026-03-20 08:26:02.190686271 +0000 UTC m=+187.407586586" Mar 20 08:26:02 crc kubenswrapper[4903]: I0320 08:26:02.440835 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 08:26:02 crc kubenswrapper[4903]: I0320 08:26:02.453810 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7b77f4cd7b-vnwbb"] Mar 20 08:26:02 crc kubenswrapper[4903]: I0320 08:26:02.560094 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-585d79454d-jxlmt"] Mar 20 08:26:02 crc kubenswrapper[4903]: I0320 08:26:02.560718 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-585d79454d-jxlmt" podUID="d0a6fcd5-f9de-49c8-8802-ee2ccfa1a77e" containerName="route-controller-manager" containerID="cri-o://88c6e5b0a5d400533e31429ee73c5bab50d41296c6051bf8940c97e9f3316db6" gracePeriod=30 Mar 20 08:26:03 crc kubenswrapper[4903]: I0320 08:26:03.172958 4903 generic.go:334] "Generic (PLEG): container finished" podID="d0a6fcd5-f9de-49c8-8802-ee2ccfa1a77e" containerID="88c6e5b0a5d400533e31429ee73c5bab50d41296c6051bf8940c97e9f3316db6" exitCode=0 Mar 20 08:26:03 crc kubenswrapper[4903]: I0320 08:26:03.173231 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-585d79454d-jxlmt" event={"ID":"d0a6fcd5-f9de-49c8-8802-ee2ccfa1a77e","Type":"ContainerDied","Data":"88c6e5b0a5d400533e31429ee73c5bab50d41296c6051bf8940c97e9f3316db6"} Mar 20 08:26:03 crc kubenswrapper[4903]: I0320 08:26:03.496167 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 08:26:03 crc kubenswrapper[4903]: I0320 08:26:03.642861 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-585d79454d-jxlmt" Mar 20 08:26:03 crc kubenswrapper[4903]: I0320 08:26:03.674670 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d4534fd8-487d-48d0-ba5b-0ab55cd3e71e-kube-api-access\") pod \"d4534fd8-487d-48d0-ba5b-0ab55cd3e71e\" (UID: \"d4534fd8-487d-48d0-ba5b-0ab55cd3e71e\") " Mar 20 08:26:03 crc kubenswrapper[4903]: I0320 08:26:03.674730 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d4534fd8-487d-48d0-ba5b-0ab55cd3e71e-kubelet-dir\") pod \"d4534fd8-487d-48d0-ba5b-0ab55cd3e71e\" (UID: \"d4534fd8-487d-48d0-ba5b-0ab55cd3e71e\") " Mar 20 08:26:03 crc kubenswrapper[4903]: I0320 08:26:03.674897 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d4534fd8-487d-48d0-ba5b-0ab55cd3e71e-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "d4534fd8-487d-48d0-ba5b-0ab55cd3e71e" (UID: "d4534fd8-487d-48d0-ba5b-0ab55cd3e71e"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 08:26:03 crc kubenswrapper[4903]: I0320 08:26:03.675374 4903 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d4534fd8-487d-48d0-ba5b-0ab55cd3e71e-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 20 08:26:03 crc kubenswrapper[4903]: I0320 08:26:03.691750 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d4534fd8-487d-48d0-ba5b-0ab55cd3e71e-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "d4534fd8-487d-48d0-ba5b-0ab55cd3e71e" (UID: "d4534fd8-487d-48d0-ba5b-0ab55cd3e71e"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:26:03 crc kubenswrapper[4903]: I0320 08:26:03.777632 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d0a6fcd5-f9de-49c8-8802-ee2ccfa1a77e-serving-cert\") pod \"d0a6fcd5-f9de-49c8-8802-ee2ccfa1a77e\" (UID: \"d0a6fcd5-f9de-49c8-8802-ee2ccfa1a77e\") " Mar 20 08:26:03 crc kubenswrapper[4903]: I0320 08:26:03.777684 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0a6fcd5-f9de-49c8-8802-ee2ccfa1a77e-config\") pod \"d0a6fcd5-f9de-49c8-8802-ee2ccfa1a77e\" (UID: \"d0a6fcd5-f9de-49c8-8802-ee2ccfa1a77e\") " Mar 20 08:26:03 crc kubenswrapper[4903]: I0320 08:26:03.777755 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d0a6fcd5-f9de-49c8-8802-ee2ccfa1a77e-client-ca\") pod \"d0a6fcd5-f9de-49c8-8802-ee2ccfa1a77e\" (UID: \"d0a6fcd5-f9de-49c8-8802-ee2ccfa1a77e\") " Mar 20 08:26:03 crc kubenswrapper[4903]: I0320 08:26:03.777813 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d2xbv\" (UniqueName: \"kubernetes.io/projected/d0a6fcd5-f9de-49c8-8802-ee2ccfa1a77e-kube-api-access-d2xbv\") pod \"d0a6fcd5-f9de-49c8-8802-ee2ccfa1a77e\" (UID: \"d0a6fcd5-f9de-49c8-8802-ee2ccfa1a77e\") " Mar 20 08:26:03 crc kubenswrapper[4903]: I0320 08:26:03.779158 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d0a6fcd5-f9de-49c8-8802-ee2ccfa1a77e-config" (OuterVolumeSpecName: "config") pod "d0a6fcd5-f9de-49c8-8802-ee2ccfa1a77e" (UID: "d0a6fcd5-f9de-49c8-8802-ee2ccfa1a77e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:26:03 crc kubenswrapper[4903]: I0320 08:26:03.779407 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d0a6fcd5-f9de-49c8-8802-ee2ccfa1a77e-client-ca" (OuterVolumeSpecName: "client-ca") pod "d0a6fcd5-f9de-49c8-8802-ee2ccfa1a77e" (UID: "d0a6fcd5-f9de-49c8-8802-ee2ccfa1a77e"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:26:03 crc kubenswrapper[4903]: I0320 08:26:03.779556 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d4534fd8-487d-48d0-ba5b-0ab55cd3e71e-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 20 08:26:03 crc kubenswrapper[4903]: I0320 08:26:03.784480 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0a6fcd5-f9de-49c8-8802-ee2ccfa1a77e-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "d0a6fcd5-f9de-49c8-8802-ee2ccfa1a77e" (UID: "d0a6fcd5-f9de-49c8-8802-ee2ccfa1a77e"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:26:03 crc kubenswrapper[4903]: I0320 08:26:03.805723 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d0a6fcd5-f9de-49c8-8802-ee2ccfa1a77e-kube-api-access-d2xbv" (OuterVolumeSpecName: "kube-api-access-d2xbv") pod "d0a6fcd5-f9de-49c8-8802-ee2ccfa1a77e" (UID: "d0a6fcd5-f9de-49c8-8802-ee2ccfa1a77e"). InnerVolumeSpecName "kube-api-access-d2xbv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:26:03 crc kubenswrapper[4903]: I0320 08:26:03.880738 4903 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d0a6fcd5-f9de-49c8-8802-ee2ccfa1a77e-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 08:26:03 crc kubenswrapper[4903]: I0320 08:26:03.880777 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d2xbv\" (UniqueName: \"kubernetes.io/projected/d0a6fcd5-f9de-49c8-8802-ee2ccfa1a77e-kube-api-access-d2xbv\") on node \"crc\" DevicePath \"\"" Mar 20 08:26:03 crc kubenswrapper[4903]: I0320 08:26:03.880787 4903 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d0a6fcd5-f9de-49c8-8802-ee2ccfa1a77e-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 08:26:03 crc kubenswrapper[4903]: I0320 08:26:03.880798 4903 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0a6fcd5-f9de-49c8-8802-ee2ccfa1a77e-config\") on node \"crc\" DevicePath \"\"" Mar 20 08:26:04 crc kubenswrapper[4903]: I0320 08:26:04.188847 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"d4534fd8-487d-48d0-ba5b-0ab55cd3e71e","Type":"ContainerDied","Data":"c51be6836e2fa8d9a0f77a81a7768f6e16dc1452ab765fc71fd36209f2ebcd49"} Mar 20 08:26:04 crc kubenswrapper[4903]: I0320 08:26:04.188916 4903 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c51be6836e2fa8d9a0f77a81a7768f6e16dc1452ab765fc71fd36209f2ebcd49" Mar 20 08:26:04 crc kubenswrapper[4903]: I0320 08:26:04.188987 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 08:26:04 crc kubenswrapper[4903]: I0320 08:26:04.194946 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-585d79454d-jxlmt" event={"ID":"d0a6fcd5-f9de-49c8-8802-ee2ccfa1a77e","Type":"ContainerDied","Data":"f5affc269af10dccdc47aaa65e111afd3cdb1b0988d5d93275ab75aa0543f34f"} Mar 20 08:26:04 crc kubenswrapper[4903]: I0320 08:26:04.194994 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-585d79454d-jxlmt" Mar 20 08:26:04 crc kubenswrapper[4903]: I0320 08:26:04.195023 4903 scope.go:117] "RemoveContainer" containerID="88c6e5b0a5d400533e31429ee73c5bab50d41296c6051bf8940c97e9f3316db6" Mar 20 08:26:04 crc kubenswrapper[4903]: I0320 08:26:04.195090 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-7b77f4cd7b-vnwbb" podUID="0a99e4e6-fc7c-4111-b36d-d259cae174e5" containerName="controller-manager" containerID="cri-o://2eb2f7d6bae59867fe851577b84084fd916dd5e98ab3a1ed8df48ec97d9afc08" gracePeriod=30 Mar 20 08:26:04 crc kubenswrapper[4903]: I0320 08:26:04.234198 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-585d79454d-jxlmt"] Mar 20 08:26:04 crc kubenswrapper[4903]: I0320 08:26:04.238498 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-585d79454d-jxlmt"] Mar 20 08:26:04 crc kubenswrapper[4903]: I0320 08:26:04.380607 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6bd78bd469-q522b"] Mar 20 08:26:04 crc kubenswrapper[4903]: E0320 08:26:04.380995 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4534fd8-487d-48d0-ba5b-0ab55cd3e71e" containerName="pruner" Mar 20 08:26:04 crc kubenswrapper[4903]: I0320 08:26:04.381018 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4534fd8-487d-48d0-ba5b-0ab55cd3e71e" containerName="pruner" Mar 20 08:26:04 crc kubenswrapper[4903]: E0320 08:26:04.381056 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0a6fcd5-f9de-49c8-8802-ee2ccfa1a77e" containerName="route-controller-manager" Mar 20 08:26:04 crc kubenswrapper[4903]: I0320 08:26:04.381063 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0a6fcd5-f9de-49c8-8802-ee2ccfa1a77e" containerName="route-controller-manager" Mar 20 08:26:04 crc kubenswrapper[4903]: I0320 08:26:04.381184 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0a6fcd5-f9de-49c8-8802-ee2ccfa1a77e" containerName="route-controller-manager" Mar 20 08:26:04 crc kubenswrapper[4903]: I0320 08:26:04.381207 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4534fd8-487d-48d0-ba5b-0ab55cd3e71e" containerName="pruner" Mar 20 08:26:04 crc kubenswrapper[4903]: I0320 08:26:04.381756 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6bd78bd469-q522b" Mar 20 08:26:04 crc kubenswrapper[4903]: I0320 08:26:04.387308 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 20 08:26:04 crc kubenswrapper[4903]: I0320 08:26:04.387702 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 20 08:26:04 crc kubenswrapper[4903]: I0320 08:26:04.387838 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 20 08:26:04 crc kubenswrapper[4903]: I0320 08:26:04.387887 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 20 08:26:04 crc kubenswrapper[4903]: I0320 08:26:04.388156 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 20 08:26:04 crc kubenswrapper[4903]: I0320 08:26:04.388392 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 20 08:26:04 crc kubenswrapper[4903]: I0320 08:26:04.392276 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6bd78bd469-q522b"] Mar 20 08:26:04 crc kubenswrapper[4903]: I0320 08:26:04.502928 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6119b58f-753f-4927-be23-bd4cb0793ac4-serving-cert\") pod \"route-controller-manager-6bd78bd469-q522b\" (UID: \"6119b58f-753f-4927-be23-bd4cb0793ac4\") " pod="openshift-route-controller-manager/route-controller-manager-6bd78bd469-q522b" Mar 20 08:26:04 crc kubenswrapper[4903]: I0320 08:26:04.503102 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6119b58f-753f-4927-be23-bd4cb0793ac4-config\") pod \"route-controller-manager-6bd78bd469-q522b\" (UID: \"6119b58f-753f-4927-be23-bd4cb0793ac4\") " pod="openshift-route-controller-manager/route-controller-manager-6bd78bd469-q522b" Mar 20 08:26:04 crc kubenswrapper[4903]: I0320 08:26:04.503143 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6119b58f-753f-4927-be23-bd4cb0793ac4-client-ca\") pod \"route-controller-manager-6bd78bd469-q522b\" (UID: \"6119b58f-753f-4927-be23-bd4cb0793ac4\") " pod="openshift-route-controller-manager/route-controller-manager-6bd78bd469-q522b" Mar 20 08:26:04 crc kubenswrapper[4903]: I0320 08:26:04.503165 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-htjs4\" (UniqueName: \"kubernetes.io/projected/6119b58f-753f-4927-be23-bd4cb0793ac4-kube-api-access-htjs4\") pod \"route-controller-manager-6bd78bd469-q522b\" (UID: \"6119b58f-753f-4927-be23-bd4cb0793ac4\") " pod="openshift-route-controller-manager/route-controller-manager-6bd78bd469-q522b" Mar 20 08:26:04 crc kubenswrapper[4903]: I0320 08:26:04.603700 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6119b58f-753f-4927-be23-bd4cb0793ac4-serving-cert\") pod 
\"route-controller-manager-6bd78bd469-q522b\" (UID: \"6119b58f-753f-4927-be23-bd4cb0793ac4\") " pod="openshift-route-controller-manager/route-controller-manager-6bd78bd469-q522b" Mar 20 08:26:04 crc kubenswrapper[4903]: I0320 08:26:04.603759 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6119b58f-753f-4927-be23-bd4cb0793ac4-config\") pod \"route-controller-manager-6bd78bd469-q522b\" (UID: \"6119b58f-753f-4927-be23-bd4cb0793ac4\") " pod="openshift-route-controller-manager/route-controller-manager-6bd78bd469-q522b" Mar 20 08:26:04 crc kubenswrapper[4903]: I0320 08:26:04.603800 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6119b58f-753f-4927-be23-bd4cb0793ac4-client-ca\") pod \"route-controller-manager-6bd78bd469-q522b\" (UID: \"6119b58f-753f-4927-be23-bd4cb0793ac4\") " pod="openshift-route-controller-manager/route-controller-manager-6bd78bd469-q522b" Mar 20 08:26:04 crc kubenswrapper[4903]: I0320 08:26:04.603821 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-htjs4\" (UniqueName: \"kubernetes.io/projected/6119b58f-753f-4927-be23-bd4cb0793ac4-kube-api-access-htjs4\") pod \"route-controller-manager-6bd78bd469-q522b\" (UID: \"6119b58f-753f-4927-be23-bd4cb0793ac4\") " pod="openshift-route-controller-manager/route-controller-manager-6bd78bd469-q522b" Mar 20 08:26:04 crc kubenswrapper[4903]: I0320 08:26:04.606221 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6119b58f-753f-4927-be23-bd4cb0793ac4-config\") pod \"route-controller-manager-6bd78bd469-q522b\" (UID: \"6119b58f-753f-4927-be23-bd4cb0793ac4\") " pod="openshift-route-controller-manager/route-controller-manager-6bd78bd469-q522b" Mar 20 08:26:04 crc kubenswrapper[4903]: I0320 08:26:04.606264 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6119b58f-753f-4927-be23-bd4cb0793ac4-client-ca\") pod \"route-controller-manager-6bd78bd469-q522b\" (UID: \"6119b58f-753f-4927-be23-bd4cb0793ac4\") " pod="openshift-route-controller-manager/route-controller-manager-6bd78bd469-q522b" Mar 20 08:26:04 crc kubenswrapper[4903]: I0320 08:26:04.613262 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6119b58f-753f-4927-be23-bd4cb0793ac4-serving-cert\") pod \"route-controller-manager-6bd78bd469-q522b\" (UID: \"6119b58f-753f-4927-be23-bd4cb0793ac4\") " pod="openshift-route-controller-manager/route-controller-manager-6bd78bd469-q522b" Mar 20 08:26:04 crc kubenswrapper[4903]: I0320 08:26:04.625141 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-5dpx6"] Mar 20 08:26:04 crc kubenswrapper[4903]: I0320 08:26:04.635560 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-htjs4\" (UniqueName: \"kubernetes.io/projected/6119b58f-753f-4927-be23-bd4cb0793ac4-kube-api-access-htjs4\") pod \"route-controller-manager-6bd78bd469-q522b\" (UID: \"6119b58f-753f-4927-be23-bd4cb0793ac4\") " pod="openshift-route-controller-manager/route-controller-manager-6bd78bd469-q522b" Mar 20 08:26:04 crc kubenswrapper[4903]: I0320 08:26:04.773285 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6bd78bd469-q522b" Mar 20 08:26:05 crc kubenswrapper[4903]: I0320 08:26:05.004542 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 20 08:26:05 crc kubenswrapper[4903]: I0320 08:26:05.007156 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 20 08:26:05 crc kubenswrapper[4903]: I0320 08:26:05.009377 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/12081c8c-e4be-4b92-8e36-e39afc95015a-kubelet-dir\") pod \"installer-9-crc\" (UID: \"12081c8c-e4be-4b92-8e36-e39afc95015a\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 20 08:26:05 crc kubenswrapper[4903]: I0320 08:26:05.009433 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/12081c8c-e4be-4b92-8e36-e39afc95015a-kube-api-access\") pod \"installer-9-crc\" (UID: \"12081c8c-e4be-4b92-8e36-e39afc95015a\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 20 08:26:05 crc kubenswrapper[4903]: I0320 08:26:05.009498 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/12081c8c-e4be-4b92-8e36-e39afc95015a-var-lock\") pod \"installer-9-crc\" (UID: \"12081c8c-e4be-4b92-8e36-e39afc95015a\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 20 08:26:05 crc kubenswrapper[4903]: I0320 08:26:05.013444 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 20 08:26:05 crc kubenswrapper[4903]: I0320 08:26:05.013740 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 20 08:26:05 crc kubenswrapper[4903]: I0320 08:26:05.058514 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 20 08:26:05 crc kubenswrapper[4903]: I0320 08:26:05.112603 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/12081c8c-e4be-4b92-8e36-e39afc95015a-var-lock\") pod \"installer-9-crc\" (UID: \"12081c8c-e4be-4b92-8e36-e39afc95015a\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 20 08:26:05 crc kubenswrapper[4903]: I0320 08:26:05.112653 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/12081c8c-e4be-4b92-8e36-e39afc95015a-var-lock\") pod \"installer-9-crc\" (UID: \"12081c8c-e4be-4b92-8e36-e39afc95015a\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 20 08:26:05 crc kubenswrapper[4903]: I0320 08:26:05.113135 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/12081c8c-e4be-4b92-8e36-e39afc95015a-kubelet-dir\") pod \"installer-9-crc\" (UID: \"12081c8c-e4be-4b92-8e36-e39afc95015a\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 20 08:26:05 crc kubenswrapper[4903]: I0320 08:26:05.113198 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/12081c8c-e4be-4b92-8e36-e39afc95015a-kube-api-access\") pod \"installer-9-crc\" (UID: 
\"12081c8c-e4be-4b92-8e36-e39afc95015a\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 20 08:26:05 crc kubenswrapper[4903]: I0320 08:26:05.113679 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/12081c8c-e4be-4b92-8e36-e39afc95015a-kubelet-dir\") pod \"installer-9-crc\" (UID: \"12081c8c-e4be-4b92-8e36-e39afc95015a\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 20 08:26:05 crc kubenswrapper[4903]: I0320 08:26:05.138316 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/12081c8c-e4be-4b92-8e36-e39afc95015a-kube-api-access\") pod \"installer-9-crc\" (UID: \"12081c8c-e4be-4b92-8e36-e39afc95015a\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 20 08:26:05 crc kubenswrapper[4903]: I0320 08:26:05.205468 4903 generic.go:334] "Generic (PLEG): container finished" podID="0a99e4e6-fc7c-4111-b36d-d259cae174e5" containerID="2eb2f7d6bae59867fe851577b84084fd916dd5e98ab3a1ed8df48ec97d9afc08" exitCode=0 Mar 20 08:26:05 crc kubenswrapper[4903]: I0320 08:26:05.205522 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7b77f4cd7b-vnwbb" event={"ID":"0a99e4e6-fc7c-4111-b36d-d259cae174e5","Type":"ContainerDied","Data":"2eb2f7d6bae59867fe851577b84084fd916dd5e98ab3a1ed8df48ec97d9afc08"} Mar 20 08:26:05 crc kubenswrapper[4903]: I0320 08:26:05.226098 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-fzcbm" Mar 20 08:26:05 crc kubenswrapper[4903]: I0320 08:26:05.226454 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-fzcbm" Mar 20 08:26:05 crc kubenswrapper[4903]: I0320 08:26:05.331484 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 20 08:26:05 crc kubenswrapper[4903]: I0320 08:26:05.338871 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6bd78bd469-q522b"] Mar 20 08:26:05 crc kubenswrapper[4903]: I0320 08:26:05.473180 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-bmcws" Mar 20 08:26:05 crc kubenswrapper[4903]: I0320 08:26:05.473772 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-bmcws" Mar 20 08:26:05 crc kubenswrapper[4903]: I0320 08:26:05.500821 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7b77f4cd7b-vnwbb" Mar 20 08:26:05 crc kubenswrapper[4903]: I0320 08:26:05.505004 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d0a6fcd5-f9de-49c8-8802-ee2ccfa1a77e" path="/var/lib/kubelet/pods/d0a6fcd5-f9de-49c8-8802-ee2ccfa1a77e/volumes" Mar 20 08:26:05 crc kubenswrapper[4903]: I0320 08:26:05.518363 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0a99e4e6-fc7c-4111-b36d-d259cae174e5-serving-cert\") pod \"0a99e4e6-fc7c-4111-b36d-d259cae174e5\" (UID: \"0a99e4e6-fc7c-4111-b36d-d259cae174e5\") " Mar 20 08:26:05 crc kubenswrapper[4903]: I0320 08:26:05.518438 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0a99e4e6-fc7c-4111-b36d-d259cae174e5-client-ca\") pod \"0a99e4e6-fc7c-4111-b36d-d259cae174e5\" (UID: \"0a99e4e6-fc7c-4111-b36d-d259cae174e5\") " Mar 20 08:26:05 crc kubenswrapper[4903]: I0320 08:26:05.518566 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h42lm\" (UniqueName: \"kubernetes.io/projected/0a99e4e6-fc7c-4111-b36d-d259cae174e5-kube-api-access-h42lm\") pod \"0a99e4e6-fc7c-4111-b36d-d259cae174e5\" (UID: \"0a99e4e6-fc7c-4111-b36d-d259cae174e5\") " Mar 20 08:26:05 crc kubenswrapper[4903]: I0320 08:26:05.518599 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0a99e4e6-fc7c-4111-b36d-d259cae174e5-config\") pod \"0a99e4e6-fc7c-4111-b36d-d259cae174e5\" (UID: \"0a99e4e6-fc7c-4111-b36d-d259cae174e5\") " Mar 20 08:26:05 crc kubenswrapper[4903]: I0320 08:26:05.518686 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0a99e4e6-fc7c-4111-b36d-d259cae174e5-proxy-ca-bundles\") pod \"0a99e4e6-fc7c-4111-b36d-d259cae174e5\" (UID: \"0a99e4e6-fc7c-4111-b36d-d259cae174e5\") " Mar 20 08:26:05 crc kubenswrapper[4903]: I0320 08:26:05.519903 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0a99e4e6-fc7c-4111-b36d-d259cae174e5-client-ca" (OuterVolumeSpecName: "client-ca") pod "0a99e4e6-fc7c-4111-b36d-d259cae174e5" (UID: "0a99e4e6-fc7c-4111-b36d-d259cae174e5"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:26:05 crc kubenswrapper[4903]: I0320 08:26:05.520053 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0a99e4e6-fc7c-4111-b36d-d259cae174e5-config" (OuterVolumeSpecName: "config") pod "0a99e4e6-fc7c-4111-b36d-d259cae174e5" (UID: "0a99e4e6-fc7c-4111-b36d-d259cae174e5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:26:05 crc kubenswrapper[4903]: I0320 08:26:05.520102 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0a99e4e6-fc7c-4111-b36d-d259cae174e5-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "0a99e4e6-fc7c-4111-b36d-d259cae174e5" (UID: "0a99e4e6-fc7c-4111-b36d-d259cae174e5"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:26:05 crc kubenswrapper[4903]: I0320 08:26:05.535285 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a99e4e6-fc7c-4111-b36d-d259cae174e5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0a99e4e6-fc7c-4111-b36d-d259cae174e5" (UID: "0a99e4e6-fc7c-4111-b36d-d259cae174e5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:26:05 crc kubenswrapper[4903]: I0320 08:26:05.535481 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a99e4e6-fc7c-4111-b36d-d259cae174e5-kube-api-access-h42lm" (OuterVolumeSpecName: "kube-api-access-h42lm") pod "0a99e4e6-fc7c-4111-b36d-d259cae174e5" (UID: "0a99e4e6-fc7c-4111-b36d-d259cae174e5"). InnerVolumeSpecName "kube-api-access-h42lm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:26:05 crc kubenswrapper[4903]: I0320 08:26:05.620123 4903 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0a99e4e6-fc7c-4111-b36d-d259cae174e5-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 20 08:26:05 crc kubenswrapper[4903]: I0320 08:26:05.620167 4903 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0a99e4e6-fc7c-4111-b36d-d259cae174e5-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 08:26:05 crc kubenswrapper[4903]: I0320 08:26:05.620177 4903 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0a99e4e6-fc7c-4111-b36d-d259cae174e5-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 08:26:05 crc kubenswrapper[4903]: I0320 08:26:05.620188 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h42lm\" (UniqueName: \"kubernetes.io/projected/0a99e4e6-fc7c-4111-b36d-d259cae174e5-kube-api-access-h42lm\") on node \"crc\" DevicePath \"\"" Mar 20 08:26:05 crc kubenswrapper[4903]: I0320 08:26:05.620202 4903 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0a99e4e6-fc7c-4111-b36d-d259cae174e5-config\") on node \"crc\" DevicePath \"\"" Mar 20 08:26:05 crc kubenswrapper[4903]: I0320 08:26:05.626694 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 20 08:26:05 crc kubenswrapper[4903]: I0320 08:26:05.824128 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-fzcbm" Mar 20 08:26:05 crc kubenswrapper[4903]: I0320 08:26:05.835854 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-bmcws" Mar 20 08:26:06 crc kubenswrapper[4903]: I0320 08:26:06.215095 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7b77f4cd7b-vnwbb" event={"ID":"0a99e4e6-fc7c-4111-b36d-d259cae174e5","Type":"ContainerDied","Data":"14e4e1136d735ed5f43f1dfe1db99c11a6c3439ffb6dc47f752d0c8572892f19"} Mar 20 08:26:06 crc kubenswrapper[4903]: I0320 08:26:06.215133 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7b77f4cd7b-vnwbb" Mar 20 08:26:06 crc kubenswrapper[4903]: I0320 08:26:06.215161 4903 scope.go:117] "RemoveContainer" containerID="2eb2f7d6bae59867fe851577b84084fd916dd5e98ab3a1ed8df48ec97d9afc08" Mar 20 08:26:06 crc kubenswrapper[4903]: I0320 08:26:06.222145 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"12081c8c-e4be-4b92-8e36-e39afc95015a","Type":"ContainerStarted","Data":"1406d6ad56c694136d708a44b3a0e75399cb8dcb92497f6d42d6e9c8ca47de44"} Mar 20 08:26:06 crc kubenswrapper[4903]: I0320 08:26:06.222203 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"12081c8c-e4be-4b92-8e36-e39afc95015a","Type":"ContainerStarted","Data":"1da2af238f1f27766f04d15b81b9d78e04f3946e3f7f83616d1c38b5b448939f"} Mar 20 08:26:06 crc kubenswrapper[4903]: I0320 08:26:06.230873 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6bd78bd469-q522b" event={"ID":"6119b58f-753f-4927-be23-bd4cb0793ac4","Type":"ContainerStarted","Data":"806c49bd86455cf1474bf0cb22770c3d64075cbcabb00380098f044ed07fad09"} Mar 20 08:26:06 crc kubenswrapper[4903]: I0320 08:26:06.230943 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6bd78bd469-q522b" event={"ID":"6119b58f-753f-4927-be23-bd4cb0793ac4","Type":"ContainerStarted","Data":"a6f56857487c5271bbcbbc46dae7b9447ca2b2585ede2b8bb05cd70ef7efd3f5"} Mar 20 08:26:06 crc kubenswrapper[4903]: I0320 08:26:06.266843 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6bd78bd469-q522b" podStartSLOduration=4.266817266 podStartE2EDuration="4.266817266s" podCreationTimestamp="2026-03-20 08:26:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:26:06.256943499 +0000 UTC m=+191.473843804" watchObservedRunningTime="2026-03-20 08:26:06.266817266 +0000 UTC m=+191.483717581" Mar 20 08:26:06 crc kubenswrapper[4903]: I0320 08:26:06.270782 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7b77f4cd7b-vnwbb"] Mar 20 08:26:06 crc kubenswrapper[4903]: I0320 08:26:06.277167 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-7b77f4cd7b-vnwbb"] Mar 20 08:26:06 crc kubenswrapper[4903]: I0320 08:26:06.299155 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-fzcbm" Mar 20 08:26:06 crc kubenswrapper[4903]: I0320 08:26:06.305569 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-bmcws" Mar 20 08:26:06 crc kubenswrapper[4903]: I0320 08:26:06.381715 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6b46f85f64-bqnzm"] Mar 20 08:26:06 crc kubenswrapper[4903]: E0320 08:26:06.382472 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a99e4e6-fc7c-4111-b36d-d259cae174e5" containerName="controller-manager" Mar 20 08:26:06 crc kubenswrapper[4903]: I0320 08:26:06.382501 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a99e4e6-fc7c-4111-b36d-d259cae174e5" 
containerName="controller-manager" Mar 20 08:26:06 crc kubenswrapper[4903]: I0320 08:26:06.382649 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a99e4e6-fc7c-4111-b36d-d259cae174e5" containerName="controller-manager" Mar 20 08:26:06 crc kubenswrapper[4903]: I0320 08:26:06.383227 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6b46f85f64-bqnzm" Mar 20 08:26:06 crc kubenswrapper[4903]: I0320 08:26:06.389412 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 20 08:26:06 crc kubenswrapper[4903]: I0320 08:26:06.389500 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 20 08:26:06 crc kubenswrapper[4903]: I0320 08:26:06.392354 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 20 08:26:06 crc kubenswrapper[4903]: I0320 08:26:06.394792 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 20 08:26:06 crc kubenswrapper[4903]: I0320 08:26:06.403891 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 20 08:26:06 crc kubenswrapper[4903]: I0320 08:26:06.404597 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 20 08:26:06 crc kubenswrapper[4903]: I0320 08:26:06.408878 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 20 08:26:06 crc kubenswrapper[4903]: I0320 08:26:06.433081 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6b46f85f64-bqnzm"] Mar 20 08:26:06 crc kubenswrapper[4903]: I0320 08:26:06.448475 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/63a9dcfa-82bc-49a5-bf15-5f5bd88cebe4-client-ca\") pod \"controller-manager-6b46f85f64-bqnzm\" (UID: \"63a9dcfa-82bc-49a5-bf15-5f5bd88cebe4\") " pod="openshift-controller-manager/controller-manager-6b46f85f64-bqnzm" Mar 20 08:26:06 crc kubenswrapper[4903]: I0320 08:26:06.448708 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/63a9dcfa-82bc-49a5-bf15-5f5bd88cebe4-proxy-ca-bundles\") pod \"controller-manager-6b46f85f64-bqnzm\" (UID: \"63a9dcfa-82bc-49a5-bf15-5f5bd88cebe4\") " pod="openshift-controller-manager/controller-manager-6b46f85f64-bqnzm" Mar 20 08:26:06 crc kubenswrapper[4903]: I0320 08:26:06.449018 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4wcss\" (UniqueName: \"kubernetes.io/projected/63a9dcfa-82bc-49a5-bf15-5f5bd88cebe4-kube-api-access-4wcss\") pod \"controller-manager-6b46f85f64-bqnzm\" (UID: \"63a9dcfa-82bc-49a5-bf15-5f5bd88cebe4\") " pod="openshift-controller-manager/controller-manager-6b46f85f64-bqnzm" Mar 20 08:26:06 crc kubenswrapper[4903]: I0320 08:26:06.449083 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/63a9dcfa-82bc-49a5-bf15-5f5bd88cebe4-serving-cert\") pod 
\"controller-manager-6b46f85f64-bqnzm\" (UID: \"63a9dcfa-82bc-49a5-bf15-5f5bd88cebe4\") " pod="openshift-controller-manager/controller-manager-6b46f85f64-bqnzm" Mar 20 08:26:06 crc kubenswrapper[4903]: I0320 08:26:06.449155 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/63a9dcfa-82bc-49a5-bf15-5f5bd88cebe4-config\") pod \"controller-manager-6b46f85f64-bqnzm\" (UID: \"63a9dcfa-82bc-49a5-bf15-5f5bd88cebe4\") " pod="openshift-controller-manager/controller-manager-6b46f85f64-bqnzm" Mar 20 08:26:06 crc kubenswrapper[4903]: I0320 08:26:06.491344 4903 scope.go:117] "RemoveContainer" containerID="f756aa5d647538ea78b646d483b3c6e7943baac8d492132d04743f5f99f4ddf4" Mar 20 08:26:06 crc kubenswrapper[4903]: E0320 08:26:06.491566 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 08:26:06 crc kubenswrapper[4903]: I0320 08:26:06.557898 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4wcss\" (UniqueName: \"kubernetes.io/projected/63a9dcfa-82bc-49a5-bf15-5f5bd88cebe4-kube-api-access-4wcss\") pod \"controller-manager-6b46f85f64-bqnzm\" (UID: \"63a9dcfa-82bc-49a5-bf15-5f5bd88cebe4\") " pod="openshift-controller-manager/controller-manager-6b46f85f64-bqnzm" Mar 20 08:26:06 crc kubenswrapper[4903]: I0320 08:26:06.557995 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/63a9dcfa-82bc-49a5-bf15-5f5bd88cebe4-serving-cert\") pod \"controller-manager-6b46f85f64-bqnzm\" (UID: \"63a9dcfa-82bc-49a5-bf15-5f5bd88cebe4\") " pod="openshift-controller-manager/controller-manager-6b46f85f64-bqnzm" Mar 20 08:26:06 crc kubenswrapper[4903]: I0320 08:26:06.558047 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/63a9dcfa-82bc-49a5-bf15-5f5bd88cebe4-config\") pod \"controller-manager-6b46f85f64-bqnzm\" (UID: \"63a9dcfa-82bc-49a5-bf15-5f5bd88cebe4\") " pod="openshift-controller-manager/controller-manager-6b46f85f64-bqnzm" Mar 20 08:26:06 crc kubenswrapper[4903]: I0320 08:26:06.558086 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/63a9dcfa-82bc-49a5-bf15-5f5bd88cebe4-client-ca\") pod \"controller-manager-6b46f85f64-bqnzm\" (UID: \"63a9dcfa-82bc-49a5-bf15-5f5bd88cebe4\") " pod="openshift-controller-manager/controller-manager-6b46f85f64-bqnzm" Mar 20 08:26:06 crc kubenswrapper[4903]: I0320 08:26:06.558121 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/63a9dcfa-82bc-49a5-bf15-5f5bd88cebe4-proxy-ca-bundles\") pod \"controller-manager-6b46f85f64-bqnzm\" (UID: \"63a9dcfa-82bc-49a5-bf15-5f5bd88cebe4\") " pod="openshift-controller-manager/controller-manager-6b46f85f64-bqnzm" Mar 20 08:26:06 crc kubenswrapper[4903]: I0320 08:26:06.559863 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/63a9dcfa-82bc-49a5-bf15-5f5bd88cebe4-client-ca\") pod \"controller-manager-6b46f85f64-bqnzm\" (UID: \"63a9dcfa-82bc-49a5-bf15-5f5bd88cebe4\") " pod="openshift-controller-manager/controller-manager-6b46f85f64-bqnzm" Mar 20 08:26:06 crc kubenswrapper[4903]: I0320 08:26:06.560113 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/63a9dcfa-82bc-49a5-bf15-5f5bd88cebe4-proxy-ca-bundles\") pod \"controller-manager-6b46f85f64-bqnzm\" (UID: \"63a9dcfa-82bc-49a5-bf15-5f5bd88cebe4\") " pod="openshift-controller-manager/controller-manager-6b46f85f64-bqnzm" Mar 20 08:26:06 crc kubenswrapper[4903]: I0320 08:26:06.560329 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/63a9dcfa-82bc-49a5-bf15-5f5bd88cebe4-config\") pod \"controller-manager-6b46f85f64-bqnzm\" (UID: \"63a9dcfa-82bc-49a5-bf15-5f5bd88cebe4\") " pod="openshift-controller-manager/controller-manager-6b46f85f64-bqnzm" Mar 20 08:26:06 crc kubenswrapper[4903]: I0320 08:26:06.566429 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/63a9dcfa-82bc-49a5-bf15-5f5bd88cebe4-serving-cert\") pod \"controller-manager-6b46f85f64-bqnzm\" (UID: \"63a9dcfa-82bc-49a5-bf15-5f5bd88cebe4\") " pod="openshift-controller-manager/controller-manager-6b46f85f64-bqnzm" Mar 20 08:26:06 crc kubenswrapper[4903]: I0320 08:26:06.578498 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4wcss\" (UniqueName: \"kubernetes.io/projected/63a9dcfa-82bc-49a5-bf15-5f5bd88cebe4-kube-api-access-4wcss\") pod \"controller-manager-6b46f85f64-bqnzm\" (UID: \"63a9dcfa-82bc-49a5-bf15-5f5bd88cebe4\") " pod="openshift-controller-manager/controller-manager-6b46f85f64-bqnzm" Mar 20 08:26:06 crc kubenswrapper[4903]: I0320 08:26:06.714044 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6b46f85f64-bqnzm" Mar 20 08:26:06 crc kubenswrapper[4903]: I0320 08:26:06.966377 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6b46f85f64-bqnzm"] Mar 20 08:26:07 crc kubenswrapper[4903]: I0320 08:26:07.212007 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-cxn5d" Mar 20 08:26:07 crc kubenswrapper[4903]: I0320 08:26:07.213270 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-cxn5d" Mar 20 08:26:07 crc kubenswrapper[4903]: I0320 08:26:07.241359 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6b46f85f64-bqnzm" event={"ID":"63a9dcfa-82bc-49a5-bf15-5f5bd88cebe4","Type":"ContainerStarted","Data":"5e4f658da78d21657a7ad8da3a09fa0002c2bfe07228842241630da3436ffc5a"} Mar 20 08:26:07 crc kubenswrapper[4903]: I0320 08:26:07.243416 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6bd78bd469-q522b" Mar 20 08:26:07 crc kubenswrapper[4903]: I0320 08:26:07.253284 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6bd78bd469-q522b" Mar 20 08:26:07 crc kubenswrapper[4903]: I0320 08:26:07.262267 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=3.262243449 podStartE2EDuration="3.262243449s" podCreationTimestamp="2026-03-20 08:26:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:26:07.257548187 +0000 UTC m=+192.474448502" watchObservedRunningTime="2026-03-20 08:26:07.262243449 +0000 UTC m=+192.479143764" Mar 20 08:26:07 crc kubenswrapper[4903]: I0320 08:26:07.267233 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-cxn5d" Mar 20 08:26:07 crc kubenswrapper[4903]: I0320 08:26:07.503825 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0a99e4e6-fc7c-4111-b36d-d259cae174e5" path="/var/lib/kubelet/pods/0a99e4e6-fc7c-4111-b36d-d259cae174e5/volumes" Mar 20 08:26:07 crc kubenswrapper[4903]: I0320 08:26:07.634226 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-crf95" Mar 20 08:26:07 crc kubenswrapper[4903]: I0320 08:26:07.634295 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-crf95" Mar 20 08:26:07 crc kubenswrapper[4903]: I0320 08:26:07.696571 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-crf95" Mar 20 08:26:08 crc kubenswrapper[4903]: I0320 08:26:08.187799 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-pbz9z" Mar 20 08:26:08 crc kubenswrapper[4903]: I0320 08:26:08.187955 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-pbz9z" Mar 20 08:26:08 crc kubenswrapper[4903]: I0320 08:26:08.253318 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-controller-manager/controller-manager-6b46f85f64-bqnzm" event={"ID":"63a9dcfa-82bc-49a5-bf15-5f5bd88cebe4","Type":"ContainerStarted","Data":"e48a7eca996110a08a6911ef2216c0a52f6c1f3664433959e45713e9d7fdb29d"} Mar 20 08:26:08 crc kubenswrapper[4903]: I0320 08:26:08.288043 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-6b46f85f64-bqnzm" podStartSLOduration=6.28798088 podStartE2EDuration="6.28798088s" podCreationTimestamp="2026-03-20 08:26:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:26:08.276628712 +0000 UTC m=+193.493529037" watchObservedRunningTime="2026-03-20 08:26:08.28798088 +0000 UTC m=+193.504881275" Mar 20 08:26:08 crc kubenswrapper[4903]: I0320 08:26:08.318540 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-crf95" Mar 20 08:26:08 crc kubenswrapper[4903]: I0320 08:26:08.336643 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-cxn5d" Mar 20 08:26:08 crc kubenswrapper[4903]: I0320 08:26:08.660777 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-cbqcg" Mar 20 08:26:08 crc kubenswrapper[4903]: I0320 08:26:08.660826 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-cbqcg" Mar 20 08:26:09 crc kubenswrapper[4903]: I0320 08:26:09.232379 4903 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-pbz9z" podUID="37ce866b-65c1-454a-b346-43c2ebe9a2e0" containerName="registry-server" probeResult="failure" output=< Mar 20 08:26:09 crc kubenswrapper[4903]: timeout: failed to connect service ":50051" within 1s Mar 20 08:26:09 crc kubenswrapper[4903]: > Mar 20 08:26:09 crc kubenswrapper[4903]: I0320 08:26:09.266659 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-bmcws"] Mar 20 08:26:09 crc kubenswrapper[4903]: I0320 08:26:09.266956 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-bmcws" podUID="d209b7b2-53ad-4780-a13e-65d2b0cb5189" containerName="registry-server" containerID="cri-o://e0286b99a7999c1ae210fe1909961418702574edc6dac678391994f6061a97ee" gracePeriod=2 Mar 20 08:26:09 crc kubenswrapper[4903]: I0320 08:26:09.271652 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-6b46f85f64-bqnzm" Mar 20 08:26:09 crc kubenswrapper[4903]: I0320 08:26:09.278352 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-6b46f85f64-bqnzm" Mar 20 08:26:09 crc kubenswrapper[4903]: I0320 08:26:09.716514 4903 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-cbqcg" podUID="e6a63c01-6cfe-4d24-835a-4fa810111888" containerName="registry-server" probeResult="failure" output=< Mar 20 08:26:09 crc kubenswrapper[4903]: timeout: failed to connect service ":50051" within 1s Mar 20 08:26:09 crc kubenswrapper[4903]: > Mar 20 08:26:10 crc kubenswrapper[4903]: I0320 08:26:10.279708 4903 generic.go:334] "Generic (PLEG): container finished" podID="d209b7b2-53ad-4780-a13e-65d2b0cb5189" 
containerID="e0286b99a7999c1ae210fe1909961418702574edc6dac678391994f6061a97ee" exitCode=0 Mar 20 08:26:10 crc kubenswrapper[4903]: I0320 08:26:10.280545 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bmcws" event={"ID":"d209b7b2-53ad-4780-a13e-65d2b0cb5189","Type":"ContainerDied","Data":"e0286b99a7999c1ae210fe1909961418702574edc6dac678391994f6061a97ee"} Mar 20 08:26:11 crc kubenswrapper[4903]: I0320 08:26:11.471726 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-crf95"] Mar 20 08:26:11 crc kubenswrapper[4903]: I0320 08:26:11.473122 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-crf95" podUID="85039fed-a0e7-4cea-834c-930d1c9974a1" containerName="registry-server" containerID="cri-o://98824d1bf61034d9a4492dcd047c63878534f968ce79b76efac3d2f2180f278f" gracePeriod=2 Mar 20 08:26:12 crc kubenswrapper[4903]: I0320 08:26:12.306530 4903 generic.go:334] "Generic (PLEG): container finished" podID="85039fed-a0e7-4cea-834c-930d1c9974a1" containerID="98824d1bf61034d9a4492dcd047c63878534f968ce79b76efac3d2f2180f278f" exitCode=0 Mar 20 08:26:12 crc kubenswrapper[4903]: I0320 08:26:12.306589 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-crf95" event={"ID":"85039fed-a0e7-4cea-834c-930d1c9974a1","Type":"ContainerDied","Data":"98824d1bf61034d9a4492dcd047c63878534f968ce79b76efac3d2f2180f278f"} Mar 20 08:26:12 crc kubenswrapper[4903]: I0320 08:26:12.423544 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bmcws" Mar 20 08:26:12 crc kubenswrapper[4903]: I0320 08:26:12.485649 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zsb9h\" (UniqueName: \"kubernetes.io/projected/d209b7b2-53ad-4780-a13e-65d2b0cb5189-kube-api-access-zsb9h\") pod \"d209b7b2-53ad-4780-a13e-65d2b0cb5189\" (UID: \"d209b7b2-53ad-4780-a13e-65d2b0cb5189\") " Mar 20 08:26:12 crc kubenswrapper[4903]: I0320 08:26:12.485916 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d209b7b2-53ad-4780-a13e-65d2b0cb5189-catalog-content\") pod \"d209b7b2-53ad-4780-a13e-65d2b0cb5189\" (UID: \"d209b7b2-53ad-4780-a13e-65d2b0cb5189\") " Mar 20 08:26:12 crc kubenswrapper[4903]: I0320 08:26:12.485952 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d209b7b2-53ad-4780-a13e-65d2b0cb5189-utilities\") pod \"d209b7b2-53ad-4780-a13e-65d2b0cb5189\" (UID: \"d209b7b2-53ad-4780-a13e-65d2b0cb5189\") " Mar 20 08:26:12 crc kubenswrapper[4903]: I0320 08:26:12.487167 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d209b7b2-53ad-4780-a13e-65d2b0cb5189-utilities" (OuterVolumeSpecName: "utilities") pod "d209b7b2-53ad-4780-a13e-65d2b0cb5189" (UID: "d209b7b2-53ad-4780-a13e-65d2b0cb5189"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 08:26:12 crc kubenswrapper[4903]: I0320 08:26:12.498285 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d209b7b2-53ad-4780-a13e-65d2b0cb5189-kube-api-access-zsb9h" (OuterVolumeSpecName: "kube-api-access-zsb9h") pod "d209b7b2-53ad-4780-a13e-65d2b0cb5189" (UID: "d209b7b2-53ad-4780-a13e-65d2b0cb5189"). InnerVolumeSpecName "kube-api-access-zsb9h". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:26:12 crc kubenswrapper[4903]: I0320 08:26:12.544280 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d209b7b2-53ad-4780-a13e-65d2b0cb5189-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d209b7b2-53ad-4780-a13e-65d2b0cb5189" (UID: "d209b7b2-53ad-4780-a13e-65d2b0cb5189"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 08:26:12 crc kubenswrapper[4903]: I0320 08:26:12.587528 4903 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d209b7b2-53ad-4780-a13e-65d2b0cb5189-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 08:26:12 crc kubenswrapper[4903]: I0320 08:26:12.587571 4903 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d209b7b2-53ad-4780-a13e-65d2b0cb5189-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 08:26:12 crc kubenswrapper[4903]: I0320 08:26:12.587585 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zsb9h\" (UniqueName: \"kubernetes.io/projected/d209b7b2-53ad-4780-a13e-65d2b0cb5189-kube-api-access-zsb9h\") on node \"crc\" DevicePath \"\"" Mar 20 08:26:12 crc kubenswrapper[4903]: I0320 08:26:12.741267 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-crf95" Mar 20 08:26:12 crc kubenswrapper[4903]: I0320 08:26:12.791266 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/85039fed-a0e7-4cea-834c-930d1c9974a1-utilities\") pod \"85039fed-a0e7-4cea-834c-930d1c9974a1\" (UID: \"85039fed-a0e7-4cea-834c-930d1c9974a1\") " Mar 20 08:26:12 crc kubenswrapper[4903]: I0320 08:26:12.791325 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7pd2r\" (UniqueName: \"kubernetes.io/projected/85039fed-a0e7-4cea-834c-930d1c9974a1-kube-api-access-7pd2r\") pod \"85039fed-a0e7-4cea-834c-930d1c9974a1\" (UID: \"85039fed-a0e7-4cea-834c-930d1c9974a1\") " Mar 20 08:26:12 crc kubenswrapper[4903]: I0320 08:26:12.791366 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/85039fed-a0e7-4cea-834c-930d1c9974a1-catalog-content\") pod \"85039fed-a0e7-4cea-834c-930d1c9974a1\" (UID: \"85039fed-a0e7-4cea-834c-930d1c9974a1\") " Mar 20 08:26:12 crc kubenswrapper[4903]: I0320 08:26:12.799247 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/85039fed-a0e7-4cea-834c-930d1c9974a1-kube-api-access-7pd2r" (OuterVolumeSpecName: "kube-api-access-7pd2r") pod "85039fed-a0e7-4cea-834c-930d1c9974a1" (UID: "85039fed-a0e7-4cea-834c-930d1c9974a1"). InnerVolumeSpecName "kube-api-access-7pd2r". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:26:12 crc kubenswrapper[4903]: I0320 08:26:12.799798 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/85039fed-a0e7-4cea-834c-930d1c9974a1-utilities" (OuterVolumeSpecName: "utilities") pod "85039fed-a0e7-4cea-834c-930d1c9974a1" (UID: "85039fed-a0e7-4cea-834c-930d1c9974a1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 08:26:12 crc kubenswrapper[4903]: I0320 08:26:12.824678 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/85039fed-a0e7-4cea-834c-930d1c9974a1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "85039fed-a0e7-4cea-834c-930d1c9974a1" (UID: "85039fed-a0e7-4cea-834c-930d1c9974a1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 08:26:12 crc kubenswrapper[4903]: I0320 08:26:12.893510 4903 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/85039fed-a0e7-4cea-834c-930d1c9974a1-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 08:26:12 crc kubenswrapper[4903]: I0320 08:26:12.893556 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7pd2r\" (UniqueName: \"kubernetes.io/projected/85039fed-a0e7-4cea-834c-930d1c9974a1-kube-api-access-7pd2r\") on node \"crc\" DevicePath \"\"" Mar 20 08:26:12 crc kubenswrapper[4903]: I0320 08:26:12.893570 4903 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/85039fed-a0e7-4cea-834c-930d1c9974a1-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 08:26:13 crc kubenswrapper[4903]: I0320 08:26:13.215153 4903 csr.go:261] certificate signing request csr-lvx8j is approved, waiting to be issued Mar 20 08:26:13 crc kubenswrapper[4903]: I0320 08:26:13.224075 4903 csr.go:257] certificate signing request csr-lvx8j is issued Mar 20 08:26:13 crc kubenswrapper[4903]: I0320 08:26:13.317759 4903 generic.go:334] "Generic (PLEG): container finished" podID="a5f2446d-2562-46b0-9bdd-6d5bf42d1a7f" containerID="781971432a2d34c14a048a7ba8a4fc1acb64225c8b94b7eebf0b09d61d19665b" exitCode=0 Mar 20 08:26:13 crc kubenswrapper[4903]: I0320 08:26:13.317836 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566586-87jzv" event={"ID":"a5f2446d-2562-46b0-9bdd-6d5bf42d1a7f","Type":"ContainerDied","Data":"781971432a2d34c14a048a7ba8a4fc1acb64225c8b94b7eebf0b09d61d19665b"} Mar 20 08:26:13 crc kubenswrapper[4903]: I0320 08:26:13.321123 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bmcws" event={"ID":"d209b7b2-53ad-4780-a13e-65d2b0cb5189","Type":"ContainerDied","Data":"5ac0e9bd1183ec46a5a16f923a7fe2d228f6830013ac90eaff5dbfdaa870bcf9"} Mar 20 08:26:13 crc kubenswrapper[4903]: I0320 08:26:13.321178 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-bmcws" Mar 20 08:26:13 crc kubenswrapper[4903]: I0320 08:26:13.321207 4903 scope.go:117] "RemoveContainer" containerID="e0286b99a7999c1ae210fe1909961418702574edc6dac678391994f6061a97ee" Mar 20 08:26:13 crc kubenswrapper[4903]: I0320 08:26:13.333559 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l45gc" event={"ID":"efb0ecbf-eb11-4834-8e12-668b3b9f64c8","Type":"ContainerStarted","Data":"cb440879ceb0866041e2746c980a3b337d8e90933832b0daf63a114c8bd26ac0"} Mar 20 08:26:13 crc kubenswrapper[4903]: I0320 08:26:13.337970 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-crf95" event={"ID":"85039fed-a0e7-4cea-834c-930d1c9974a1","Type":"ContainerDied","Data":"8e0d0af934a639abaa441d8a4bdd8bd1bd069d64fe460efa085ff1d4f2d56ec1"} Mar 20 08:26:13 crc kubenswrapper[4903]: I0320 08:26:13.338134 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-crf95" Mar 20 08:26:13 crc kubenswrapper[4903]: I0320 08:26:13.357312 4903 scope.go:117] "RemoveContainer" containerID="a4f90216b0c31bdff22706d0d8399c18d91e85da60af0917c1c404451d5382b1" Mar 20 08:26:13 crc kubenswrapper[4903]: I0320 08:26:13.382204 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-bmcws"] Mar 20 08:26:13 crc kubenswrapper[4903]: I0320 08:26:13.386463 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-bmcws"] Mar 20 08:26:13 crc kubenswrapper[4903]: I0320 08:26:13.423175 4903 scope.go:117] "RemoveContainer" containerID="d78f2d3a79a13018ed4c3f7d273be342a36223b8970cbe7d118c4f455a83d264" Mar 20 08:26:13 crc kubenswrapper[4903]: I0320 08:26:13.441129 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-crf95"] Mar 20 08:26:13 crc kubenswrapper[4903]: I0320 08:26:13.447060 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-crf95"] Mar 20 08:26:13 crc kubenswrapper[4903]: I0320 08:26:13.498942 4903 scope.go:117] "RemoveContainer" containerID="98824d1bf61034d9a4492dcd047c63878534f968ce79b76efac3d2f2180f278f" Mar 20 08:26:13 crc kubenswrapper[4903]: I0320 08:26:13.502551 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="85039fed-a0e7-4cea-834c-930d1c9974a1" path="/var/lib/kubelet/pods/85039fed-a0e7-4cea-834c-930d1c9974a1/volumes" Mar 20 08:26:13 crc kubenswrapper[4903]: I0320 08:26:13.503368 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d209b7b2-53ad-4780-a13e-65d2b0cb5189" path="/var/lib/kubelet/pods/d209b7b2-53ad-4780-a13e-65d2b0cb5189/volumes" Mar 20 08:26:13 crc kubenswrapper[4903]: I0320 08:26:13.519768 4903 scope.go:117] "RemoveContainer" containerID="c560f34e05dadace2322c63ec340136826d19f79777ff84b50cc5c8892f3a460" Mar 20 08:26:13 crc kubenswrapper[4903]: I0320 08:26:13.538367 4903 scope.go:117] "RemoveContainer" containerID="a25b642b782e36bf09323533a2def2c5ae319466a12e22f89427181512ca2c00" Mar 20 08:26:14 crc kubenswrapper[4903]: I0320 08:26:14.225199 4903 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-12-25 09:30:52.217601862 +0000 UTC Mar 20 08:26:14 crc kubenswrapper[4903]: I0320 08:26:14.225252 4903 certificate_manager.go:356] 
kubernetes.io/kubelet-serving: Waiting 6721h4m37.992352998s for next certificate rotation Mar 20 08:26:14 crc kubenswrapper[4903]: I0320 08:26:14.348203 4903 generic.go:334] "Generic (PLEG): container finished" podID="efb0ecbf-eb11-4834-8e12-668b3b9f64c8" containerID="cb440879ceb0866041e2746c980a3b337d8e90933832b0daf63a114c8bd26ac0" exitCode=0 Mar 20 08:26:14 crc kubenswrapper[4903]: I0320 08:26:14.348280 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l45gc" event={"ID":"efb0ecbf-eb11-4834-8e12-668b3b9f64c8","Type":"ContainerDied","Data":"cb440879ceb0866041e2746c980a3b337d8e90933832b0daf63a114c8bd26ac0"} Mar 20 08:26:14 crc kubenswrapper[4903]: I0320 08:26:14.353962 4903 generic.go:334] "Generic (PLEG): container finished" podID="e74b75cf-cad8-4b67-91b7-3926096e09f8" containerID="36da9891300bc47d6eed9291600104590aaac4b3249ab7068747f71ef6abb338" exitCode=0 Mar 20 08:26:14 crc kubenswrapper[4903]: I0320 08:26:14.354189 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nwkzx" event={"ID":"e74b75cf-cad8-4b67-91b7-3926096e09f8","Type":"ContainerDied","Data":"36da9891300bc47d6eed9291600104590aaac4b3249ab7068747f71ef6abb338"} Mar 20 08:26:14 crc kubenswrapper[4903]: I0320 08:26:14.709981 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566586-87jzv" Mar 20 08:26:14 crc kubenswrapper[4903]: I0320 08:26:14.833394 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4s8hx\" (UniqueName: \"kubernetes.io/projected/a5f2446d-2562-46b0-9bdd-6d5bf42d1a7f-kube-api-access-4s8hx\") pod \"a5f2446d-2562-46b0-9bdd-6d5bf42d1a7f\" (UID: \"a5f2446d-2562-46b0-9bdd-6d5bf42d1a7f\") " Mar 20 08:26:14 crc kubenswrapper[4903]: I0320 08:26:14.840769 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a5f2446d-2562-46b0-9bdd-6d5bf42d1a7f-kube-api-access-4s8hx" (OuterVolumeSpecName: "kube-api-access-4s8hx") pod "a5f2446d-2562-46b0-9bdd-6d5bf42d1a7f" (UID: "a5f2446d-2562-46b0-9bdd-6d5bf42d1a7f"). InnerVolumeSpecName "kube-api-access-4s8hx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:26:14 crc kubenswrapper[4903]: I0320 08:26:14.935923 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4s8hx\" (UniqueName: \"kubernetes.io/projected/a5f2446d-2562-46b0-9bdd-6d5bf42d1a7f-kube-api-access-4s8hx\") on node \"crc\" DevicePath \"\"" Mar 20 08:26:15 crc kubenswrapper[4903]: I0320 08:26:15.225509 4903 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-11-25 14:01:34.039999502 +0000 UTC Mar 20 08:26:15 crc kubenswrapper[4903]: I0320 08:26:15.225548 4903 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 6005h35m18.814454578s for next certificate rotation Mar 20 08:26:15 crc kubenswrapper[4903]: I0320 08:26:15.363250 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l45gc" event={"ID":"efb0ecbf-eb11-4834-8e12-668b3b9f64c8","Type":"ContainerStarted","Data":"b7dfbbffaa7943814566d257866f8482e119b7902ebb78c89d2303cc31e7378e"} Mar 20 08:26:15 crc kubenswrapper[4903]: I0320 08:26:15.367354 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nwkzx" event={"ID":"e74b75cf-cad8-4b67-91b7-3926096e09f8","Type":"ContainerStarted","Data":"3168467e8e7ef14dbe452c36fbcfc5a9dcbc06bc8b694faa685567d008ff52ee"} Mar 20 08:26:15 crc kubenswrapper[4903]: I0320 08:26:15.370150 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566586-87jzv" event={"ID":"a5f2446d-2562-46b0-9bdd-6d5bf42d1a7f","Type":"ContainerDied","Data":"ff38fa65c4b24b3cdf86194c82e7070487815efea6ae19f982212b19f9864afd"} Mar 20 08:26:15 crc kubenswrapper[4903]: I0320 08:26:15.370195 4903 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ff38fa65c4b24b3cdf86194c82e7070487815efea6ae19f982212b19f9864afd" Mar 20 08:26:15 crc kubenswrapper[4903]: I0320 08:26:15.370231 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566586-87jzv" Mar 20 08:26:15 crc kubenswrapper[4903]: I0320 08:26:15.399174 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-l45gc" podStartSLOduration=2.012284993 podStartE2EDuration="50.399145703s" podCreationTimestamp="2026-03-20 08:25:25 +0000 UTC" firstStartedPulling="2026-03-20 08:25:26.473976244 +0000 UTC m=+151.690876559" lastFinishedPulling="2026-03-20 08:26:14.860836914 +0000 UTC m=+200.077737269" observedRunningTime="2026-03-20 08:26:15.391718698 +0000 UTC m=+200.608619053" watchObservedRunningTime="2026-03-20 08:26:15.399145703 +0000 UTC m=+200.616046058" Mar 20 08:26:15 crc kubenswrapper[4903]: I0320 08:26:15.424165 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-nwkzx" podStartSLOduration=3.096250487 podStartE2EDuration="51.424133731s" podCreationTimestamp="2026-03-20 08:25:24 +0000 UTC" firstStartedPulling="2026-03-20 08:25:26.465099365 +0000 UTC m=+151.681999680" lastFinishedPulling="2026-03-20 08:26:14.792982609 +0000 UTC m=+200.009882924" observedRunningTime="2026-03-20 08:26:15.423602317 +0000 UTC m=+200.640502662" watchObservedRunningTime="2026-03-20 08:26:15.424133731 +0000 UTC m=+200.641034036" Mar 20 08:26:15 crc kubenswrapper[4903]: I0320 08:26:15.602496 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-l45gc" Mar 20 08:26:15 crc kubenswrapper[4903]: I0320 08:26:15.602592 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-l45gc" Mar 20 08:26:16 crc kubenswrapper[4903]: I0320 08:26:16.656322 4903 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-l45gc" podUID="efb0ecbf-eb11-4834-8e12-668b3b9f64c8" containerName="registry-server" probeResult="failure" output=< Mar 20 08:26:16 crc kubenswrapper[4903]: timeout: failed to connect service ":50051" within 1s Mar 20 08:26:16 crc kubenswrapper[4903]: > Mar 20 08:26:18 crc kubenswrapper[4903]: I0320 08:26:18.725668 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-pbz9z" Mar 20 08:26:18 crc kubenswrapper[4903]: I0320 08:26:18.754475 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-cbqcg" Mar 20 08:26:18 crc kubenswrapper[4903]: I0320 08:26:18.785495 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-pbz9z" Mar 20 08:26:18 crc kubenswrapper[4903]: I0320 08:26:18.817003 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-cbqcg" Mar 20 08:26:19 crc kubenswrapper[4903]: I0320 08:26:19.492200 4903 scope.go:117] "RemoveContainer" containerID="f756aa5d647538ea78b646d483b3c6e7943baac8d492132d04743f5f99f4ddf4" Mar 20 08:26:20 crc kubenswrapper[4903]: I0320 08:26:20.405602 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/4.log" Mar 20 08:26:20 crc kubenswrapper[4903]: I0320 08:26:20.408460 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"a9201ff62735663a1211ec23c141098838c105950d0ff00ae94d81e98de98b7b"} Mar 20 08:26:20 crc kubenswrapper[4903]: I0320 08:26:20.410521 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 08:26:20 crc kubenswrapper[4903]: I0320 08:26:20.664245 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=90.664221253 podStartE2EDuration="1m30.664221253s" podCreationTimestamp="2026-03-20 08:24:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:26:20.457807649 +0000 UTC m=+205.674707994" watchObservedRunningTime="2026-03-20 08:26:20.664221253 +0000 UTC m=+205.881121568" Mar 20 08:26:20 crc kubenswrapper[4903]: I0320 08:26:20.665980 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-cbqcg"] Mar 20 08:26:20 crc kubenswrapper[4903]: I0320 08:26:20.666241 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-cbqcg" podUID="e6a63c01-6cfe-4d24-835a-4fa810111888" containerName="registry-server" containerID="cri-o://be068f25179e56b5b511907b0084fc3cd7ea33aa66b3c41994c59ebc53169919" gracePeriod=2 Mar 20 08:26:21 crc kubenswrapper[4903]: I0320 08:26:21.158145 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cbqcg" Mar 20 08:26:21 crc kubenswrapper[4903]: I0320 08:26:21.241067 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6a63c01-6cfe-4d24-835a-4fa810111888-catalog-content\") pod \"e6a63c01-6cfe-4d24-835a-4fa810111888\" (UID: \"e6a63c01-6cfe-4d24-835a-4fa810111888\") " Mar 20 08:26:21 crc kubenswrapper[4903]: I0320 08:26:21.241151 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fxf2k\" (UniqueName: \"kubernetes.io/projected/e6a63c01-6cfe-4d24-835a-4fa810111888-kube-api-access-fxf2k\") pod \"e6a63c01-6cfe-4d24-835a-4fa810111888\" (UID: \"e6a63c01-6cfe-4d24-835a-4fa810111888\") " Mar 20 08:26:21 crc kubenswrapper[4903]: I0320 08:26:21.241206 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6a63c01-6cfe-4d24-835a-4fa810111888-utilities\") pod \"e6a63c01-6cfe-4d24-835a-4fa810111888\" (UID: \"e6a63c01-6cfe-4d24-835a-4fa810111888\") " Mar 20 08:26:21 crc kubenswrapper[4903]: I0320 08:26:21.242816 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e6a63c01-6cfe-4d24-835a-4fa810111888-utilities" (OuterVolumeSpecName: "utilities") pod "e6a63c01-6cfe-4d24-835a-4fa810111888" (UID: "e6a63c01-6cfe-4d24-835a-4fa810111888"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 08:26:21 crc kubenswrapper[4903]: I0320 08:26:21.250188 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6a63c01-6cfe-4d24-835a-4fa810111888-kube-api-access-fxf2k" (OuterVolumeSpecName: "kube-api-access-fxf2k") pod "e6a63c01-6cfe-4d24-835a-4fa810111888" (UID: "e6a63c01-6cfe-4d24-835a-4fa810111888"). InnerVolumeSpecName "kube-api-access-fxf2k". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:26:21 crc kubenswrapper[4903]: I0320 08:26:21.343202 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fxf2k\" (UniqueName: \"kubernetes.io/projected/e6a63c01-6cfe-4d24-835a-4fa810111888-kube-api-access-fxf2k\") on node \"crc\" DevicePath \"\"" Mar 20 08:26:21 crc kubenswrapper[4903]: I0320 08:26:21.343254 4903 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6a63c01-6cfe-4d24-835a-4fa810111888-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 08:26:21 crc kubenswrapper[4903]: I0320 08:26:21.374150 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e6a63c01-6cfe-4d24-835a-4fa810111888-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e6a63c01-6cfe-4d24-835a-4fa810111888" (UID: "e6a63c01-6cfe-4d24-835a-4fa810111888"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 08:26:21 crc kubenswrapper[4903]: I0320 08:26:21.418104 4903 generic.go:334] "Generic (PLEG): container finished" podID="e6a63c01-6cfe-4d24-835a-4fa810111888" containerID="be068f25179e56b5b511907b0084fc3cd7ea33aa66b3c41994c59ebc53169919" exitCode=0 Mar 20 08:26:21 crc kubenswrapper[4903]: I0320 08:26:21.418222 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cbqcg" event={"ID":"e6a63c01-6cfe-4d24-835a-4fa810111888","Type":"ContainerDied","Data":"be068f25179e56b5b511907b0084fc3cd7ea33aa66b3c41994c59ebc53169919"} Mar 20 08:26:21 crc kubenswrapper[4903]: I0320 08:26:21.418296 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cbqcg" event={"ID":"e6a63c01-6cfe-4d24-835a-4fa810111888","Type":"ContainerDied","Data":"94281610e7b9e9db7024933c01def7e03f9b7ac2d9abd6bd950c16ac4abdc6b5"} Mar 20 08:26:21 crc kubenswrapper[4903]: I0320 08:26:21.418320 4903 scope.go:117] "RemoveContainer" containerID="be068f25179e56b5b511907b0084fc3cd7ea33aa66b3c41994c59ebc53169919" Mar 20 08:26:21 crc kubenswrapper[4903]: I0320 08:26:21.418241 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-cbqcg" Mar 20 08:26:21 crc kubenswrapper[4903]: I0320 08:26:21.439364 4903 scope.go:117] "RemoveContainer" containerID="a871d2c5ad1b7a3ad9d78333fd2d03aeba0b8bf48c3933d086c8b9cb97f1afc7" Mar 20 08:26:21 crc kubenswrapper[4903]: I0320 08:26:21.444504 4903 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6a63c01-6cfe-4d24-835a-4fa810111888-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 08:26:21 crc kubenswrapper[4903]: I0320 08:26:21.464215 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-cbqcg"] Mar 20 08:26:21 crc kubenswrapper[4903]: I0320 08:26:21.469484 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-cbqcg"] Mar 20 08:26:21 crc kubenswrapper[4903]: I0320 08:26:21.473411 4903 scope.go:117] "RemoveContainer" containerID="8e7caf0085bbea51b6a35c455297385b534b2dc0932604a63c2063430ba8e657" Mar 20 08:26:21 crc kubenswrapper[4903]: I0320 08:26:21.489786 4903 scope.go:117] "RemoveContainer" containerID="be068f25179e56b5b511907b0084fc3cd7ea33aa66b3c41994c59ebc53169919" Mar 20 08:26:21 crc kubenswrapper[4903]: E0320 08:26:21.490566 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"be068f25179e56b5b511907b0084fc3cd7ea33aa66b3c41994c59ebc53169919\": container with ID starting with be068f25179e56b5b511907b0084fc3cd7ea33aa66b3c41994c59ebc53169919 not found: ID does not exist" containerID="be068f25179e56b5b511907b0084fc3cd7ea33aa66b3c41994c59ebc53169919" Mar 20 08:26:21 crc kubenswrapper[4903]: I0320 08:26:21.490645 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be068f25179e56b5b511907b0084fc3cd7ea33aa66b3c41994c59ebc53169919"} err="failed to get container status \"be068f25179e56b5b511907b0084fc3cd7ea33aa66b3c41994c59ebc53169919\": rpc error: code = NotFound desc = could not find container \"be068f25179e56b5b511907b0084fc3cd7ea33aa66b3c41994c59ebc53169919\": container with ID starting with be068f25179e56b5b511907b0084fc3cd7ea33aa66b3c41994c59ebc53169919 not found: ID does not exist" Mar 20 08:26:21 crc kubenswrapper[4903]: I0320 08:26:21.490711 4903 scope.go:117] "RemoveContainer" containerID="a871d2c5ad1b7a3ad9d78333fd2d03aeba0b8bf48c3933d086c8b9cb97f1afc7" Mar 20 08:26:21 crc kubenswrapper[4903]: E0320 08:26:21.491491 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a871d2c5ad1b7a3ad9d78333fd2d03aeba0b8bf48c3933d086c8b9cb97f1afc7\": container with ID starting with a871d2c5ad1b7a3ad9d78333fd2d03aeba0b8bf48c3933d086c8b9cb97f1afc7 not found: ID does not exist" containerID="a871d2c5ad1b7a3ad9d78333fd2d03aeba0b8bf48c3933d086c8b9cb97f1afc7" Mar 20 08:26:21 crc kubenswrapper[4903]: I0320 08:26:21.491547 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a871d2c5ad1b7a3ad9d78333fd2d03aeba0b8bf48c3933d086c8b9cb97f1afc7"} err="failed to get container status \"a871d2c5ad1b7a3ad9d78333fd2d03aeba0b8bf48c3933d086c8b9cb97f1afc7\": rpc error: code = NotFound desc = could not find container \"a871d2c5ad1b7a3ad9d78333fd2d03aeba0b8bf48c3933d086c8b9cb97f1afc7\": container with ID starting with a871d2c5ad1b7a3ad9d78333fd2d03aeba0b8bf48c3933d086c8b9cb97f1afc7 not found: ID does not exist" Mar 20 08:26:21 crc 
kubenswrapper[4903]: I0320 08:26:21.491580 4903 scope.go:117] "RemoveContainer" containerID="8e7caf0085bbea51b6a35c455297385b534b2dc0932604a63c2063430ba8e657" Mar 20 08:26:21 crc kubenswrapper[4903]: E0320 08:26:21.491973 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8e7caf0085bbea51b6a35c455297385b534b2dc0932604a63c2063430ba8e657\": container with ID starting with 8e7caf0085bbea51b6a35c455297385b534b2dc0932604a63c2063430ba8e657 not found: ID does not exist" containerID="8e7caf0085bbea51b6a35c455297385b534b2dc0932604a63c2063430ba8e657" Mar 20 08:26:21 crc kubenswrapper[4903]: I0320 08:26:21.492012 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e7caf0085bbea51b6a35c455297385b534b2dc0932604a63c2063430ba8e657"} err="failed to get container status \"8e7caf0085bbea51b6a35c455297385b534b2dc0932604a63c2063430ba8e657\": rpc error: code = NotFound desc = could not find container \"8e7caf0085bbea51b6a35c455297385b534b2dc0932604a63c2063430ba8e657\": container with ID starting with 8e7caf0085bbea51b6a35c455297385b534b2dc0932604a63c2063430ba8e657 not found: ID does not exist" Mar 20 08:26:21 crc kubenswrapper[4903]: I0320 08:26:21.498808 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e6a63c01-6cfe-4d24-835a-4fa810111888" path="/var/lib/kubelet/pods/e6a63c01-6cfe-4d24-835a-4fa810111888/volumes" Mar 20 08:26:22 crc kubenswrapper[4903]: I0320 08:26:22.390609 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6b46f85f64-bqnzm"] Mar 20 08:26:22 crc kubenswrapper[4903]: I0320 08:26:22.390926 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-6b46f85f64-bqnzm" podUID="63a9dcfa-82bc-49a5-bf15-5f5bd88cebe4" containerName="controller-manager" containerID="cri-o://e48a7eca996110a08a6911ef2216c0a52f6c1f3664433959e45713e9d7fdb29d" gracePeriod=30 Mar 20 08:26:22 crc kubenswrapper[4903]: I0320 08:26:22.399052 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6bd78bd469-q522b"] Mar 20 08:26:22 crc kubenswrapper[4903]: I0320 08:26:22.399383 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6bd78bd469-q522b" podUID="6119b58f-753f-4927-be23-bd4cb0793ac4" containerName="route-controller-manager" containerID="cri-o://806c49bd86455cf1474bf0cb22770c3d64075cbcabb00380098f044ed07fad09" gracePeriod=30 Mar 20 08:26:22 crc kubenswrapper[4903]: I0320 08:26:22.894866 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6bd78bd469-q522b" Mar 20 08:26:22 crc kubenswrapper[4903]: I0320 08:26:22.971689 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6119b58f-753f-4927-be23-bd4cb0793ac4-client-ca\") pod \"6119b58f-753f-4927-be23-bd4cb0793ac4\" (UID: \"6119b58f-753f-4927-be23-bd4cb0793ac4\") " Mar 20 08:26:22 crc kubenswrapper[4903]: I0320 08:26:22.971826 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htjs4\" (UniqueName: \"kubernetes.io/projected/6119b58f-753f-4927-be23-bd4cb0793ac4-kube-api-access-htjs4\") pod \"6119b58f-753f-4927-be23-bd4cb0793ac4\" (UID: \"6119b58f-753f-4927-be23-bd4cb0793ac4\") " Mar 20 08:26:22 crc kubenswrapper[4903]: I0320 08:26:22.971957 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6119b58f-753f-4927-be23-bd4cb0793ac4-serving-cert\") pod \"6119b58f-753f-4927-be23-bd4cb0793ac4\" (UID: \"6119b58f-753f-4927-be23-bd4cb0793ac4\") " Mar 20 08:26:22 crc kubenswrapper[4903]: I0320 08:26:22.972004 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6119b58f-753f-4927-be23-bd4cb0793ac4-config\") pod \"6119b58f-753f-4927-be23-bd4cb0793ac4\" (UID: \"6119b58f-753f-4927-be23-bd4cb0793ac4\") " Mar 20 08:26:22 crc kubenswrapper[4903]: I0320 08:26:22.972943 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6119b58f-753f-4927-be23-bd4cb0793ac4-client-ca" (OuterVolumeSpecName: "client-ca") pod "6119b58f-753f-4927-be23-bd4cb0793ac4" (UID: "6119b58f-753f-4927-be23-bd4cb0793ac4"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:26:22 crc kubenswrapper[4903]: I0320 08:26:22.973002 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6119b58f-753f-4927-be23-bd4cb0793ac4-config" (OuterVolumeSpecName: "config") pod "6119b58f-753f-4927-be23-bd4cb0793ac4" (UID: "6119b58f-753f-4927-be23-bd4cb0793ac4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:26:22 crc kubenswrapper[4903]: I0320 08:26:22.980239 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6119b58f-753f-4927-be23-bd4cb0793ac4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6119b58f-753f-4927-be23-bd4cb0793ac4" (UID: "6119b58f-753f-4927-be23-bd4cb0793ac4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:26:22 crc kubenswrapper[4903]: I0320 08:26:22.980964 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6119b58f-753f-4927-be23-bd4cb0793ac4-kube-api-access-htjs4" (OuterVolumeSpecName: "kube-api-access-htjs4") pod "6119b58f-753f-4927-be23-bd4cb0793ac4" (UID: "6119b58f-753f-4927-be23-bd4cb0793ac4"). InnerVolumeSpecName "kube-api-access-htjs4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:26:23 crc kubenswrapper[4903]: I0320 08:26:23.041219 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6b46f85f64-bqnzm" Mar 20 08:26:23 crc kubenswrapper[4903]: I0320 08:26:23.073593 4903 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6119b58f-753f-4927-be23-bd4cb0793ac4-config\") on node \"crc\" DevicePath \"\"" Mar 20 08:26:23 crc kubenswrapper[4903]: I0320 08:26:23.073862 4903 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6119b58f-753f-4927-be23-bd4cb0793ac4-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 08:26:23 crc kubenswrapper[4903]: I0320 08:26:23.073966 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htjs4\" (UniqueName: \"kubernetes.io/projected/6119b58f-753f-4927-be23-bd4cb0793ac4-kube-api-access-htjs4\") on node \"crc\" DevicePath \"\"" Mar 20 08:26:23 crc kubenswrapper[4903]: I0320 08:26:23.074072 4903 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6119b58f-753f-4927-be23-bd4cb0793ac4-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 08:26:23 crc kubenswrapper[4903]: I0320 08:26:23.175639 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/63a9dcfa-82bc-49a5-bf15-5f5bd88cebe4-serving-cert\") pod \"63a9dcfa-82bc-49a5-bf15-5f5bd88cebe4\" (UID: \"63a9dcfa-82bc-49a5-bf15-5f5bd88cebe4\") " Mar 20 08:26:23 crc kubenswrapper[4903]: I0320 08:26:23.175773 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/63a9dcfa-82bc-49a5-bf15-5f5bd88cebe4-client-ca\") pod \"63a9dcfa-82bc-49a5-bf15-5f5bd88cebe4\" (UID: \"63a9dcfa-82bc-49a5-bf15-5f5bd88cebe4\") " Mar 20 08:26:23 crc kubenswrapper[4903]: I0320 08:26:23.175821 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/63a9dcfa-82bc-49a5-bf15-5f5bd88cebe4-proxy-ca-bundles\") pod \"63a9dcfa-82bc-49a5-bf15-5f5bd88cebe4\" (UID: \"63a9dcfa-82bc-49a5-bf15-5f5bd88cebe4\") " Mar 20 08:26:23 crc kubenswrapper[4903]: I0320 08:26:23.175846 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/63a9dcfa-82bc-49a5-bf15-5f5bd88cebe4-config\") pod \"63a9dcfa-82bc-49a5-bf15-5f5bd88cebe4\" (UID: \"63a9dcfa-82bc-49a5-bf15-5f5bd88cebe4\") " Mar 20 08:26:23 crc kubenswrapper[4903]: I0320 08:26:23.175877 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4wcss\" (UniqueName: \"kubernetes.io/projected/63a9dcfa-82bc-49a5-bf15-5f5bd88cebe4-kube-api-access-4wcss\") pod \"63a9dcfa-82bc-49a5-bf15-5f5bd88cebe4\" (UID: \"63a9dcfa-82bc-49a5-bf15-5f5bd88cebe4\") " Mar 20 08:26:23 crc kubenswrapper[4903]: I0320 08:26:23.177275 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/63a9dcfa-82bc-49a5-bf15-5f5bd88cebe4-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "63a9dcfa-82bc-49a5-bf15-5f5bd88cebe4" (UID: "63a9dcfa-82bc-49a5-bf15-5f5bd88cebe4"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:26:23 crc kubenswrapper[4903]: I0320 08:26:23.177387 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/63a9dcfa-82bc-49a5-bf15-5f5bd88cebe4-config" (OuterVolumeSpecName: "config") pod "63a9dcfa-82bc-49a5-bf15-5f5bd88cebe4" (UID: "63a9dcfa-82bc-49a5-bf15-5f5bd88cebe4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:26:23 crc kubenswrapper[4903]: I0320 08:26:23.178229 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/63a9dcfa-82bc-49a5-bf15-5f5bd88cebe4-client-ca" (OuterVolumeSpecName: "client-ca") pod "63a9dcfa-82bc-49a5-bf15-5f5bd88cebe4" (UID: "63a9dcfa-82bc-49a5-bf15-5f5bd88cebe4"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:26:23 crc kubenswrapper[4903]: I0320 08:26:23.182910 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/63a9dcfa-82bc-49a5-bf15-5f5bd88cebe4-kube-api-access-4wcss" (OuterVolumeSpecName: "kube-api-access-4wcss") pod "63a9dcfa-82bc-49a5-bf15-5f5bd88cebe4" (UID: "63a9dcfa-82bc-49a5-bf15-5f5bd88cebe4"). InnerVolumeSpecName "kube-api-access-4wcss". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:26:23 crc kubenswrapper[4903]: I0320 08:26:23.182945 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63a9dcfa-82bc-49a5-bf15-5f5bd88cebe4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "63a9dcfa-82bc-49a5-bf15-5f5bd88cebe4" (UID: "63a9dcfa-82bc-49a5-bf15-5f5bd88cebe4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:26:23 crc kubenswrapper[4903]: I0320 08:26:23.277545 4903 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/63a9dcfa-82bc-49a5-bf15-5f5bd88cebe4-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 08:26:23 crc kubenswrapper[4903]: I0320 08:26:23.277868 4903 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/63a9dcfa-82bc-49a5-bf15-5f5bd88cebe4-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 08:26:23 crc kubenswrapper[4903]: I0320 08:26:23.277955 4903 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/63a9dcfa-82bc-49a5-bf15-5f5bd88cebe4-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 20 08:26:23 crc kubenswrapper[4903]: I0320 08:26:23.278019 4903 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/63a9dcfa-82bc-49a5-bf15-5f5bd88cebe4-config\") on node \"crc\" DevicePath \"\"" Mar 20 08:26:23 crc kubenswrapper[4903]: I0320 08:26:23.278098 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4wcss\" (UniqueName: \"kubernetes.io/projected/63a9dcfa-82bc-49a5-bf15-5f5bd88cebe4-kube-api-access-4wcss\") on node \"crc\" DevicePath \"\"" Mar 20 08:26:23 crc kubenswrapper[4903]: I0320 08:26:23.434443 4903 generic.go:334] "Generic (PLEG): container finished" podID="6119b58f-753f-4927-be23-bd4cb0793ac4" containerID="806c49bd86455cf1474bf0cb22770c3d64075cbcabb00380098f044ed07fad09" exitCode=0 Mar 20 08:26:23 crc kubenswrapper[4903]: I0320 08:26:23.434571 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6bd78bd469-q522b" Mar 20 08:26:23 crc kubenswrapper[4903]: I0320 08:26:23.434586 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6bd78bd469-q522b" event={"ID":"6119b58f-753f-4927-be23-bd4cb0793ac4","Type":"ContainerDied","Data":"806c49bd86455cf1474bf0cb22770c3d64075cbcabb00380098f044ed07fad09"} Mar 20 08:26:23 crc kubenswrapper[4903]: I0320 08:26:23.434737 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6bd78bd469-q522b" event={"ID":"6119b58f-753f-4927-be23-bd4cb0793ac4","Type":"ContainerDied","Data":"a6f56857487c5271bbcbbc46dae7b9447ca2b2585ede2b8bb05cd70ef7efd3f5"} Mar 20 08:26:23 crc kubenswrapper[4903]: I0320 08:26:23.434774 4903 scope.go:117] "RemoveContainer" containerID="806c49bd86455cf1474bf0cb22770c3d64075cbcabb00380098f044ed07fad09" Mar 20 08:26:23 crc kubenswrapper[4903]: I0320 08:26:23.436538 4903 generic.go:334] "Generic (PLEG): container finished" podID="63a9dcfa-82bc-49a5-bf15-5f5bd88cebe4" containerID="e48a7eca996110a08a6911ef2216c0a52f6c1f3664433959e45713e9d7fdb29d" exitCode=0 Mar 20 08:26:23 crc kubenswrapper[4903]: I0320 08:26:23.436656 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6b46f85f64-bqnzm" Mar 20 08:26:23 crc kubenswrapper[4903]: I0320 08:26:23.436656 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6b46f85f64-bqnzm" event={"ID":"63a9dcfa-82bc-49a5-bf15-5f5bd88cebe4","Type":"ContainerDied","Data":"e48a7eca996110a08a6911ef2216c0a52f6c1f3664433959e45713e9d7fdb29d"} Mar 20 08:26:23 crc kubenswrapper[4903]: I0320 08:26:23.436736 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6b46f85f64-bqnzm" event={"ID":"63a9dcfa-82bc-49a5-bf15-5f5bd88cebe4","Type":"ContainerDied","Data":"5e4f658da78d21657a7ad8da3a09fa0002c2bfe07228842241630da3436ffc5a"} Mar 20 08:26:23 crc kubenswrapper[4903]: I0320 08:26:23.464608 4903 scope.go:117] "RemoveContainer" containerID="806c49bd86455cf1474bf0cb22770c3d64075cbcabb00380098f044ed07fad09" Mar 20 08:26:23 crc kubenswrapper[4903]: E0320 08:26:23.465821 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"806c49bd86455cf1474bf0cb22770c3d64075cbcabb00380098f044ed07fad09\": container with ID starting with 806c49bd86455cf1474bf0cb22770c3d64075cbcabb00380098f044ed07fad09 not found: ID does not exist" containerID="806c49bd86455cf1474bf0cb22770c3d64075cbcabb00380098f044ed07fad09" Mar 20 08:26:23 crc kubenswrapper[4903]: I0320 08:26:23.465887 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"806c49bd86455cf1474bf0cb22770c3d64075cbcabb00380098f044ed07fad09"} err="failed to get container status \"806c49bd86455cf1474bf0cb22770c3d64075cbcabb00380098f044ed07fad09\": rpc error: code = NotFound desc = could not find container \"806c49bd86455cf1474bf0cb22770c3d64075cbcabb00380098f044ed07fad09\": container with ID starting with 806c49bd86455cf1474bf0cb22770c3d64075cbcabb00380098f044ed07fad09 not found: ID does not exist" Mar 20 08:26:23 crc kubenswrapper[4903]: I0320 08:26:23.465923 4903 scope.go:117] "RemoveContainer" containerID="e48a7eca996110a08a6911ef2216c0a52f6c1f3664433959e45713e9d7fdb29d" Mar 
20 08:26:23 crc kubenswrapper[4903]: I0320 08:26:23.482026 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6b46f85f64-bqnzm"] Mar 20 08:26:23 crc kubenswrapper[4903]: I0320 08:26:23.488365 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-6b46f85f64-bqnzm"] Mar 20 08:26:23 crc kubenswrapper[4903]: I0320 08:26:23.513801 4903 scope.go:117] "RemoveContainer" containerID="e48a7eca996110a08a6911ef2216c0a52f6c1f3664433959e45713e9d7fdb29d" Mar 20 08:26:23 crc kubenswrapper[4903]: E0320 08:26:23.514344 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e48a7eca996110a08a6911ef2216c0a52f6c1f3664433959e45713e9d7fdb29d\": container with ID starting with e48a7eca996110a08a6911ef2216c0a52f6c1f3664433959e45713e9d7fdb29d not found: ID does not exist" containerID="e48a7eca996110a08a6911ef2216c0a52f6c1f3664433959e45713e9d7fdb29d" Mar 20 08:26:23 crc kubenswrapper[4903]: I0320 08:26:23.514401 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e48a7eca996110a08a6911ef2216c0a52f6c1f3664433959e45713e9d7fdb29d"} err="failed to get container status \"e48a7eca996110a08a6911ef2216c0a52f6c1f3664433959e45713e9d7fdb29d\": rpc error: code = NotFound desc = could not find container \"e48a7eca996110a08a6911ef2216c0a52f6c1f3664433959e45713e9d7fdb29d\": container with ID starting with e48a7eca996110a08a6911ef2216c0a52f6c1f3664433959e45713e9d7fdb29d not found: ID does not exist" Mar 20 08:26:23 crc kubenswrapper[4903]: I0320 08:26:23.515868 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="63a9dcfa-82bc-49a5-bf15-5f5bd88cebe4" path="/var/lib/kubelet/pods/63a9dcfa-82bc-49a5-bf15-5f5bd88cebe4/volumes" Mar 20 08:26:23 crc kubenswrapper[4903]: I0320 08:26:23.516702 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6bd78bd469-q522b"] Mar 20 08:26:23 crc kubenswrapper[4903]: I0320 08:26:23.516759 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6bd78bd469-q522b"] Mar 20 08:26:24 crc kubenswrapper[4903]: I0320 08:26:24.389685 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6bcf46f896-qp59n"] Mar 20 08:26:24 crc kubenswrapper[4903]: E0320 08:26:24.390752 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6a63c01-6cfe-4d24-835a-4fa810111888" containerName="extract-content" Mar 20 08:26:24 crc kubenswrapper[4903]: I0320 08:26:24.390786 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6a63c01-6cfe-4d24-835a-4fa810111888" containerName="extract-content" Mar 20 08:26:24 crc kubenswrapper[4903]: E0320 08:26:24.390823 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85039fed-a0e7-4cea-834c-930d1c9974a1" containerName="extract-content" Mar 20 08:26:24 crc kubenswrapper[4903]: I0320 08:26:24.390839 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="85039fed-a0e7-4cea-834c-930d1c9974a1" containerName="extract-content" Mar 20 08:26:24 crc kubenswrapper[4903]: E0320 08:26:24.390857 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85039fed-a0e7-4cea-834c-930d1c9974a1" containerName="registry-server" Mar 20 08:26:24 crc kubenswrapper[4903]: I0320 08:26:24.390872 4903 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="85039fed-a0e7-4cea-834c-930d1c9974a1" containerName="registry-server" Mar 20 08:26:24 crc kubenswrapper[4903]: E0320 08:26:24.390898 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d209b7b2-53ad-4780-a13e-65d2b0cb5189" containerName="extract-utilities" Mar 20 08:26:24 crc kubenswrapper[4903]: I0320 08:26:24.390915 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="d209b7b2-53ad-4780-a13e-65d2b0cb5189" containerName="extract-utilities" Mar 20 08:26:24 crc kubenswrapper[4903]: E0320 08:26:24.390939 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6a63c01-6cfe-4d24-835a-4fa810111888" containerName="extract-utilities" Mar 20 08:26:24 crc kubenswrapper[4903]: I0320 08:26:24.390956 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6a63c01-6cfe-4d24-835a-4fa810111888" containerName="extract-utilities" Mar 20 08:26:24 crc kubenswrapper[4903]: E0320 08:26:24.390978 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d209b7b2-53ad-4780-a13e-65d2b0cb5189" containerName="extract-content" Mar 20 08:26:24 crc kubenswrapper[4903]: I0320 08:26:24.390994 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="d209b7b2-53ad-4780-a13e-65d2b0cb5189" containerName="extract-content" Mar 20 08:26:24 crc kubenswrapper[4903]: E0320 08:26:24.391015 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85039fed-a0e7-4cea-834c-930d1c9974a1" containerName="extract-utilities" Mar 20 08:26:24 crc kubenswrapper[4903]: I0320 08:26:24.391064 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="85039fed-a0e7-4cea-834c-930d1c9974a1" containerName="extract-utilities" Mar 20 08:26:24 crc kubenswrapper[4903]: E0320 08:26:24.391090 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d209b7b2-53ad-4780-a13e-65d2b0cb5189" containerName="registry-server" Mar 20 08:26:24 crc kubenswrapper[4903]: I0320 08:26:24.391106 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="d209b7b2-53ad-4780-a13e-65d2b0cb5189" containerName="registry-server" Mar 20 08:26:24 crc kubenswrapper[4903]: E0320 08:26:24.391129 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63a9dcfa-82bc-49a5-bf15-5f5bd88cebe4" containerName="controller-manager" Mar 20 08:26:24 crc kubenswrapper[4903]: I0320 08:26:24.391142 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="63a9dcfa-82bc-49a5-bf15-5f5bd88cebe4" containerName="controller-manager" Mar 20 08:26:24 crc kubenswrapper[4903]: E0320 08:26:24.391163 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6119b58f-753f-4927-be23-bd4cb0793ac4" containerName="route-controller-manager" Mar 20 08:26:24 crc kubenswrapper[4903]: I0320 08:26:24.391183 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="6119b58f-753f-4927-be23-bd4cb0793ac4" containerName="route-controller-manager" Mar 20 08:26:24 crc kubenswrapper[4903]: E0320 08:26:24.391213 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6a63c01-6cfe-4d24-835a-4fa810111888" containerName="registry-server" Mar 20 08:26:24 crc kubenswrapper[4903]: I0320 08:26:24.391227 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6a63c01-6cfe-4d24-835a-4fa810111888" containerName="registry-server" Mar 20 08:26:24 crc kubenswrapper[4903]: E0320 08:26:24.391246 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5f2446d-2562-46b0-9bdd-6d5bf42d1a7f" containerName="oc" Mar 20 08:26:24 crc kubenswrapper[4903]: I0320 
08:26:24.391259 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5f2446d-2562-46b0-9bdd-6d5bf42d1a7f" containerName="oc" Mar 20 08:26:24 crc kubenswrapper[4903]: I0320 08:26:24.391506 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="63a9dcfa-82bc-49a5-bf15-5f5bd88cebe4" containerName="controller-manager" Mar 20 08:26:24 crc kubenswrapper[4903]: I0320 08:26:24.391531 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="85039fed-a0e7-4cea-834c-930d1c9974a1" containerName="registry-server" Mar 20 08:26:24 crc kubenswrapper[4903]: I0320 08:26:24.391557 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5f2446d-2562-46b0-9bdd-6d5bf42d1a7f" containerName="oc" Mar 20 08:26:24 crc kubenswrapper[4903]: I0320 08:26:24.391579 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6a63c01-6cfe-4d24-835a-4fa810111888" containerName="registry-server" Mar 20 08:26:24 crc kubenswrapper[4903]: I0320 08:26:24.391597 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="6119b58f-753f-4927-be23-bd4cb0793ac4" containerName="route-controller-manager" Mar 20 08:26:24 crc kubenswrapper[4903]: I0320 08:26:24.391623 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="d209b7b2-53ad-4780-a13e-65d2b0cb5189" containerName="registry-server" Mar 20 08:26:24 crc kubenswrapper[4903]: I0320 08:26:24.392314 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7fcb66cc6-2cb2n"] Mar 20 08:26:24 crc kubenswrapper[4903]: I0320 08:26:24.392520 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6bcf46f896-qp59n" Mar 20 08:26:24 crc kubenswrapper[4903]: I0320 08:26:24.393682 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7fcb66cc6-2cb2n" Mar 20 08:26:24 crc kubenswrapper[4903]: I0320 08:26:24.394582 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 20 08:26:24 crc kubenswrapper[4903]: I0320 08:26:24.395980 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 20 08:26:24 crc kubenswrapper[4903]: I0320 08:26:24.396226 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 20 08:26:24 crc kubenswrapper[4903]: I0320 08:26:24.396234 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 20 08:26:24 crc kubenswrapper[4903]: I0320 08:26:24.396696 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 20 08:26:24 crc kubenswrapper[4903]: I0320 08:26:24.396727 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 20 08:26:24 crc kubenswrapper[4903]: I0320 08:26:24.396902 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 20 08:26:24 crc kubenswrapper[4903]: I0320 08:26:24.396936 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 20 08:26:24 crc kubenswrapper[4903]: I0320 08:26:24.397090 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 20 08:26:24 crc kubenswrapper[4903]: I0320 08:26:24.397124 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 20 08:26:24 crc kubenswrapper[4903]: I0320 08:26:24.397308 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 20 08:26:24 crc kubenswrapper[4903]: I0320 08:26:24.398856 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 20 08:26:24 crc kubenswrapper[4903]: I0320 08:26:24.413282 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6bcf46f896-qp59n"] Mar 20 08:26:24 crc kubenswrapper[4903]: I0320 08:26:24.414432 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 20 08:26:24 crc kubenswrapper[4903]: I0320 08:26:24.430234 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7fcb66cc6-2cb2n"] Mar 20 08:26:24 crc kubenswrapper[4903]: I0320 08:26:24.500321 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fgb8b\" (UniqueName: \"kubernetes.io/projected/270e0b13-6f9e-4405-9728-d9bbfc02074b-kube-api-access-fgb8b\") pod \"route-controller-manager-7fcb66cc6-2cb2n\" (UID: \"270e0b13-6f9e-4405-9728-d9bbfc02074b\") " pod="openshift-route-controller-manager/route-controller-manager-7fcb66cc6-2cb2n" Mar 20 08:26:24 crc kubenswrapper[4903]: I0320 08:26:24.500396 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-gplh8\" (UniqueName: \"kubernetes.io/projected/418955f1-ca02-4a7a-b422-7cb0a69f72d6-kube-api-access-gplh8\") pod \"controller-manager-6bcf46f896-qp59n\" (UID: \"418955f1-ca02-4a7a-b422-7cb0a69f72d6\") " pod="openshift-controller-manager/controller-manager-6bcf46f896-qp59n" Mar 20 08:26:24 crc kubenswrapper[4903]: I0320 08:26:24.500438 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/418955f1-ca02-4a7a-b422-7cb0a69f72d6-config\") pod \"controller-manager-6bcf46f896-qp59n\" (UID: \"418955f1-ca02-4a7a-b422-7cb0a69f72d6\") " pod="openshift-controller-manager/controller-manager-6bcf46f896-qp59n" Mar 20 08:26:24 crc kubenswrapper[4903]: I0320 08:26:24.500480 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/270e0b13-6f9e-4405-9728-d9bbfc02074b-client-ca\") pod \"route-controller-manager-7fcb66cc6-2cb2n\" (UID: \"270e0b13-6f9e-4405-9728-d9bbfc02074b\") " pod="openshift-route-controller-manager/route-controller-manager-7fcb66cc6-2cb2n" Mar 20 08:26:24 crc kubenswrapper[4903]: I0320 08:26:24.500510 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/418955f1-ca02-4a7a-b422-7cb0a69f72d6-serving-cert\") pod \"controller-manager-6bcf46f896-qp59n\" (UID: \"418955f1-ca02-4a7a-b422-7cb0a69f72d6\") " pod="openshift-controller-manager/controller-manager-6bcf46f896-qp59n" Mar 20 08:26:24 crc kubenswrapper[4903]: I0320 08:26:24.500533 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/270e0b13-6f9e-4405-9728-d9bbfc02074b-serving-cert\") pod \"route-controller-manager-7fcb66cc6-2cb2n\" (UID: \"270e0b13-6f9e-4405-9728-d9bbfc02074b\") " pod="openshift-route-controller-manager/route-controller-manager-7fcb66cc6-2cb2n" Mar 20 08:26:24 crc kubenswrapper[4903]: I0320 08:26:24.500556 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/418955f1-ca02-4a7a-b422-7cb0a69f72d6-client-ca\") pod \"controller-manager-6bcf46f896-qp59n\" (UID: \"418955f1-ca02-4a7a-b422-7cb0a69f72d6\") " pod="openshift-controller-manager/controller-manager-6bcf46f896-qp59n" Mar 20 08:26:24 crc kubenswrapper[4903]: I0320 08:26:24.500630 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/418955f1-ca02-4a7a-b422-7cb0a69f72d6-proxy-ca-bundles\") pod \"controller-manager-6bcf46f896-qp59n\" (UID: \"418955f1-ca02-4a7a-b422-7cb0a69f72d6\") " pod="openshift-controller-manager/controller-manager-6bcf46f896-qp59n" Mar 20 08:26:24 crc kubenswrapper[4903]: I0320 08:26:24.500657 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/270e0b13-6f9e-4405-9728-d9bbfc02074b-config\") pod \"route-controller-manager-7fcb66cc6-2cb2n\" (UID: \"270e0b13-6f9e-4405-9728-d9bbfc02074b\") " pod="openshift-route-controller-manager/route-controller-manager-7fcb66cc6-2cb2n" Mar 20 08:26:24 crc kubenswrapper[4903]: I0320 08:26:24.602831 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/270e0b13-6f9e-4405-9728-d9bbfc02074b-client-ca\") pod \"route-controller-manager-7fcb66cc6-2cb2n\" (UID: \"270e0b13-6f9e-4405-9728-d9bbfc02074b\") " pod="openshift-route-controller-manager/route-controller-manager-7fcb66cc6-2cb2n" Mar 20 08:26:24 crc kubenswrapper[4903]: I0320 08:26:24.602914 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/418955f1-ca02-4a7a-b422-7cb0a69f72d6-serving-cert\") pod \"controller-manager-6bcf46f896-qp59n\" (UID: \"418955f1-ca02-4a7a-b422-7cb0a69f72d6\") " pod="openshift-controller-manager/controller-manager-6bcf46f896-qp59n" Mar 20 08:26:24 crc kubenswrapper[4903]: I0320 08:26:24.602958 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/270e0b13-6f9e-4405-9728-d9bbfc02074b-serving-cert\") pod \"route-controller-manager-7fcb66cc6-2cb2n\" (UID: \"270e0b13-6f9e-4405-9728-d9bbfc02074b\") " pod="openshift-route-controller-manager/route-controller-manager-7fcb66cc6-2cb2n" Mar 20 08:26:24 crc kubenswrapper[4903]: I0320 08:26:24.602998 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/418955f1-ca02-4a7a-b422-7cb0a69f72d6-client-ca\") pod \"controller-manager-6bcf46f896-qp59n\" (UID: \"418955f1-ca02-4a7a-b422-7cb0a69f72d6\") " pod="openshift-controller-manager/controller-manager-6bcf46f896-qp59n" Mar 20 08:26:24 crc kubenswrapper[4903]: I0320 08:26:24.603055 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/418955f1-ca02-4a7a-b422-7cb0a69f72d6-proxy-ca-bundles\") pod \"controller-manager-6bcf46f896-qp59n\" (UID: \"418955f1-ca02-4a7a-b422-7cb0a69f72d6\") " pod="openshift-controller-manager/controller-manager-6bcf46f896-qp59n" Mar 20 08:26:24 crc kubenswrapper[4903]: I0320 08:26:24.603102 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/270e0b13-6f9e-4405-9728-d9bbfc02074b-config\") pod \"route-controller-manager-7fcb66cc6-2cb2n\" (UID: \"270e0b13-6f9e-4405-9728-d9bbfc02074b\") " pod="openshift-route-controller-manager/route-controller-manager-7fcb66cc6-2cb2n" Mar 20 08:26:24 crc kubenswrapper[4903]: I0320 08:26:24.603384 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fgb8b\" (UniqueName: \"kubernetes.io/projected/270e0b13-6f9e-4405-9728-d9bbfc02074b-kube-api-access-fgb8b\") pod \"route-controller-manager-7fcb66cc6-2cb2n\" (UID: \"270e0b13-6f9e-4405-9728-d9bbfc02074b\") " pod="openshift-route-controller-manager/route-controller-manager-7fcb66cc6-2cb2n" Mar 20 08:26:24 crc kubenswrapper[4903]: I0320 08:26:24.603432 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gplh8\" (UniqueName: \"kubernetes.io/projected/418955f1-ca02-4a7a-b422-7cb0a69f72d6-kube-api-access-gplh8\") pod \"controller-manager-6bcf46f896-qp59n\" (UID: \"418955f1-ca02-4a7a-b422-7cb0a69f72d6\") " pod="openshift-controller-manager/controller-manager-6bcf46f896-qp59n" Mar 20 08:26:24 crc kubenswrapper[4903]: I0320 08:26:24.603491 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/418955f1-ca02-4a7a-b422-7cb0a69f72d6-config\") pod \"controller-manager-6bcf46f896-qp59n\" (UID: 
\"418955f1-ca02-4a7a-b422-7cb0a69f72d6\") " pod="openshift-controller-manager/controller-manager-6bcf46f896-qp59n" Mar 20 08:26:24 crc kubenswrapper[4903]: I0320 08:26:24.606658 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/270e0b13-6f9e-4405-9728-d9bbfc02074b-client-ca\") pod \"route-controller-manager-7fcb66cc6-2cb2n\" (UID: \"270e0b13-6f9e-4405-9728-d9bbfc02074b\") " pod="openshift-route-controller-manager/route-controller-manager-7fcb66cc6-2cb2n" Mar 20 08:26:24 crc kubenswrapper[4903]: I0320 08:26:24.607569 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/418955f1-ca02-4a7a-b422-7cb0a69f72d6-config\") pod \"controller-manager-6bcf46f896-qp59n\" (UID: \"418955f1-ca02-4a7a-b422-7cb0a69f72d6\") " pod="openshift-controller-manager/controller-manager-6bcf46f896-qp59n" Mar 20 08:26:24 crc kubenswrapper[4903]: I0320 08:26:24.607587 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/418955f1-ca02-4a7a-b422-7cb0a69f72d6-proxy-ca-bundles\") pod \"controller-manager-6bcf46f896-qp59n\" (UID: \"418955f1-ca02-4a7a-b422-7cb0a69f72d6\") " pod="openshift-controller-manager/controller-manager-6bcf46f896-qp59n" Mar 20 08:26:24 crc kubenswrapper[4903]: I0320 08:26:24.608585 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/270e0b13-6f9e-4405-9728-d9bbfc02074b-config\") pod \"route-controller-manager-7fcb66cc6-2cb2n\" (UID: \"270e0b13-6f9e-4405-9728-d9bbfc02074b\") " pod="openshift-route-controller-manager/route-controller-manager-7fcb66cc6-2cb2n" Mar 20 08:26:24 crc kubenswrapper[4903]: I0320 08:26:24.609382 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/418955f1-ca02-4a7a-b422-7cb0a69f72d6-client-ca\") pod \"controller-manager-6bcf46f896-qp59n\" (UID: \"418955f1-ca02-4a7a-b422-7cb0a69f72d6\") " pod="openshift-controller-manager/controller-manager-6bcf46f896-qp59n" Mar 20 08:26:24 crc kubenswrapper[4903]: I0320 08:26:24.611432 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/418955f1-ca02-4a7a-b422-7cb0a69f72d6-serving-cert\") pod \"controller-manager-6bcf46f896-qp59n\" (UID: \"418955f1-ca02-4a7a-b422-7cb0a69f72d6\") " pod="openshift-controller-manager/controller-manager-6bcf46f896-qp59n" Mar 20 08:26:24 crc kubenswrapper[4903]: I0320 08:26:24.623226 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gplh8\" (UniqueName: \"kubernetes.io/projected/418955f1-ca02-4a7a-b422-7cb0a69f72d6-kube-api-access-gplh8\") pod \"controller-manager-6bcf46f896-qp59n\" (UID: \"418955f1-ca02-4a7a-b422-7cb0a69f72d6\") " pod="openshift-controller-manager/controller-manager-6bcf46f896-qp59n" Mar 20 08:26:24 crc kubenswrapper[4903]: I0320 08:26:24.624661 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/270e0b13-6f9e-4405-9728-d9bbfc02074b-serving-cert\") pod \"route-controller-manager-7fcb66cc6-2cb2n\" (UID: \"270e0b13-6f9e-4405-9728-d9bbfc02074b\") " pod="openshift-route-controller-manager/route-controller-manager-7fcb66cc6-2cb2n" Mar 20 08:26:24 crc kubenswrapper[4903]: I0320 08:26:24.630469 4903 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"kube-api-access-fgb8b\" (UniqueName: \"kubernetes.io/projected/270e0b13-6f9e-4405-9728-d9bbfc02074b-kube-api-access-fgb8b\") pod \"route-controller-manager-7fcb66cc6-2cb2n\" (UID: \"270e0b13-6f9e-4405-9728-d9bbfc02074b\") " pod="openshift-route-controller-manager/route-controller-manager-7fcb66cc6-2cb2n" Mar 20 08:26:24 crc kubenswrapper[4903]: I0320 08:26:24.715800 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6bcf46f896-qp59n" Mar 20 08:26:24 crc kubenswrapper[4903]: I0320 08:26:24.727761 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7fcb66cc6-2cb2n" Mar 20 08:26:24 crc kubenswrapper[4903]: I0320 08:26:24.973465 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6bcf46f896-qp59n"] Mar 20 08:26:25 crc kubenswrapper[4903]: I0320 08:26:25.033204 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-nwkzx" Mar 20 08:26:25 crc kubenswrapper[4903]: I0320 08:26:25.033610 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-nwkzx" Mar 20 08:26:25 crc kubenswrapper[4903]: I0320 08:26:25.037134 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7fcb66cc6-2cb2n"] Mar 20 08:26:25 crc kubenswrapper[4903]: I0320 08:26:25.102977 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-nwkzx" Mar 20 08:26:25 crc kubenswrapper[4903]: I0320 08:26:25.471000 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7fcb66cc6-2cb2n" event={"ID":"270e0b13-6f9e-4405-9728-d9bbfc02074b","Type":"ContainerStarted","Data":"16c5daaca2bb7728481da4fc02621d27d0bbf5fc7af31587a055c6fd96ea1f35"} Mar 20 08:26:25 crc kubenswrapper[4903]: I0320 08:26:25.473120 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7fcb66cc6-2cb2n" event={"ID":"270e0b13-6f9e-4405-9728-d9bbfc02074b","Type":"ContainerStarted","Data":"e4524c90533875c3012962d7698396afc10f86b1c35e43e414d20e86d7ce1b19"} Mar 20 08:26:25 crc kubenswrapper[4903]: I0320 08:26:25.473317 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-6bcf46f896-qp59n" Mar 20 08:26:25 crc kubenswrapper[4903]: I0320 08:26:25.473434 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6bcf46f896-qp59n" event={"ID":"418955f1-ca02-4a7a-b422-7cb0a69f72d6","Type":"ContainerStarted","Data":"acda02e5f78dcbc327b05df74924014c67fedb66cf1b45f4ae61a8390b40aca2"} Mar 20 08:26:25 crc kubenswrapper[4903]: I0320 08:26:25.473499 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6bcf46f896-qp59n" event={"ID":"418955f1-ca02-4a7a-b422-7cb0a69f72d6","Type":"ContainerStarted","Data":"a4bd4d6ef1e0af4fdf656947e6a8aaac8cdcbd2dc1e376403708ce03b6caf54d"} Mar 20 08:26:25 crc kubenswrapper[4903]: I0320 08:26:25.496122 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-7fcb66cc6-2cb2n" podStartSLOduration=3.496097959 
podStartE2EDuration="3.496097959s" podCreationTimestamp="2026-03-20 08:26:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:26:25.494507757 +0000 UTC m=+210.711408072" watchObservedRunningTime="2026-03-20 08:26:25.496097959 +0000 UTC m=+210.712998294" Mar 20 08:26:25 crc kubenswrapper[4903]: I0320 08:26:25.501237 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6119b58f-753f-4927-be23-bd4cb0793ac4" path="/var/lib/kubelet/pods/6119b58f-753f-4927-be23-bd4cb0793ac4/volumes" Mar 20 08:26:25 crc kubenswrapper[4903]: I0320 08:26:25.502078 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-6bcf46f896-qp59n" Mar 20 08:26:25 crc kubenswrapper[4903]: I0320 08:26:25.523638 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-6bcf46f896-qp59n" podStartSLOduration=3.523616572 podStartE2EDuration="3.523616572s" podCreationTimestamp="2026-03-20 08:26:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:26:25.519956677 +0000 UTC m=+210.736856992" watchObservedRunningTime="2026-03-20 08:26:25.523616572 +0000 UTC m=+210.740516907" Mar 20 08:26:25 crc kubenswrapper[4903]: I0320 08:26:25.538559 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-nwkzx" Mar 20 08:26:25 crc kubenswrapper[4903]: I0320 08:26:25.657004 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-l45gc" Mar 20 08:26:25 crc kubenswrapper[4903]: I0320 08:26:25.699328 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-l45gc" Mar 20 08:26:26 crc kubenswrapper[4903]: I0320 08:26:26.483594 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-7fcb66cc6-2cb2n" Mar 20 08:26:26 crc kubenswrapper[4903]: I0320 08:26:26.494533 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-7fcb66cc6-2cb2n" Mar 20 08:26:26 crc kubenswrapper[4903]: I0320 08:26:26.888804 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-l45gc"] Mar 20 08:26:27 crc kubenswrapper[4903]: I0320 08:26:27.489705 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-l45gc" podUID="efb0ecbf-eb11-4834-8e12-668b3b9f64c8" containerName="registry-server" containerID="cri-o://b7dfbbffaa7943814566d257866f8482e119b7902ebb78c89d2303cc31e7378e" gracePeriod=2 Mar 20 08:26:27 crc kubenswrapper[4903]: I0320 08:26:27.967262 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-l45gc" Mar 20 08:26:28 crc kubenswrapper[4903]: I0320 08:26:28.062709 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/efb0ecbf-eb11-4834-8e12-668b3b9f64c8-utilities\") pod \"efb0ecbf-eb11-4834-8e12-668b3b9f64c8\" (UID: \"efb0ecbf-eb11-4834-8e12-668b3b9f64c8\") " Mar 20 08:26:28 crc kubenswrapper[4903]: I0320 08:26:28.062816 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8sdsc\" (UniqueName: \"kubernetes.io/projected/efb0ecbf-eb11-4834-8e12-668b3b9f64c8-kube-api-access-8sdsc\") pod \"efb0ecbf-eb11-4834-8e12-668b3b9f64c8\" (UID: \"efb0ecbf-eb11-4834-8e12-668b3b9f64c8\") " Mar 20 08:26:28 crc kubenswrapper[4903]: I0320 08:26:28.062924 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/efb0ecbf-eb11-4834-8e12-668b3b9f64c8-catalog-content\") pod \"efb0ecbf-eb11-4834-8e12-668b3b9f64c8\" (UID: \"efb0ecbf-eb11-4834-8e12-668b3b9f64c8\") " Mar 20 08:26:28 crc kubenswrapper[4903]: I0320 08:26:28.064351 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/efb0ecbf-eb11-4834-8e12-668b3b9f64c8-utilities" (OuterVolumeSpecName: "utilities") pod "efb0ecbf-eb11-4834-8e12-668b3b9f64c8" (UID: "efb0ecbf-eb11-4834-8e12-668b3b9f64c8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 08:26:28 crc kubenswrapper[4903]: I0320 08:26:28.070290 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efb0ecbf-eb11-4834-8e12-668b3b9f64c8-kube-api-access-8sdsc" (OuterVolumeSpecName: "kube-api-access-8sdsc") pod "efb0ecbf-eb11-4834-8e12-668b3b9f64c8" (UID: "efb0ecbf-eb11-4834-8e12-668b3b9f64c8"). InnerVolumeSpecName "kube-api-access-8sdsc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:26:28 crc kubenswrapper[4903]: I0320 08:26:28.121230 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/efb0ecbf-eb11-4834-8e12-668b3b9f64c8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "efb0ecbf-eb11-4834-8e12-668b3b9f64c8" (UID: "efb0ecbf-eb11-4834-8e12-668b3b9f64c8"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 08:26:28 crc kubenswrapper[4903]: I0320 08:26:28.164295 4903 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/efb0ecbf-eb11-4834-8e12-668b3b9f64c8-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 08:26:28 crc kubenswrapper[4903]: I0320 08:26:28.164334 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8sdsc\" (UniqueName: \"kubernetes.io/projected/efb0ecbf-eb11-4834-8e12-668b3b9f64c8-kube-api-access-8sdsc\") on node \"crc\" DevicePath \"\"" Mar 20 08:26:28 crc kubenswrapper[4903]: I0320 08:26:28.164346 4903 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/efb0ecbf-eb11-4834-8e12-668b3b9f64c8-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 08:26:28 crc kubenswrapper[4903]: I0320 08:26:28.498772 4903 generic.go:334] "Generic (PLEG): container finished" podID="efb0ecbf-eb11-4834-8e12-668b3b9f64c8" containerID="b7dfbbffaa7943814566d257866f8482e119b7902ebb78c89d2303cc31e7378e" exitCode=0 Mar 20 08:26:28 crc kubenswrapper[4903]: I0320 08:26:28.498856 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l45gc" event={"ID":"efb0ecbf-eb11-4834-8e12-668b3b9f64c8","Type":"ContainerDied","Data":"b7dfbbffaa7943814566d257866f8482e119b7902ebb78c89d2303cc31e7378e"} Mar 20 08:26:28 crc kubenswrapper[4903]: I0320 08:26:28.498935 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l45gc" event={"ID":"efb0ecbf-eb11-4834-8e12-668b3b9f64c8","Type":"ContainerDied","Data":"de3aa84a566c17025d39801f16bec43d719352f331a29e75419a16913103438e"} Mar 20 08:26:28 crc kubenswrapper[4903]: I0320 08:26:28.498946 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-l45gc" Mar 20 08:26:28 crc kubenswrapper[4903]: I0320 08:26:28.498975 4903 scope.go:117] "RemoveContainer" containerID="b7dfbbffaa7943814566d257866f8482e119b7902ebb78c89d2303cc31e7378e" Mar 20 08:26:28 crc kubenswrapper[4903]: I0320 08:26:28.517268 4903 scope.go:117] "RemoveContainer" containerID="cb440879ceb0866041e2746c980a3b337d8e90933832b0daf63a114c8bd26ac0" Mar 20 08:26:28 crc kubenswrapper[4903]: I0320 08:26:28.534131 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-l45gc"] Mar 20 08:26:28 crc kubenswrapper[4903]: I0320 08:26:28.536399 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-l45gc"] Mar 20 08:26:28 crc kubenswrapper[4903]: I0320 08:26:28.566100 4903 scope.go:117] "RemoveContainer" containerID="1748ccefdb91a18ac39fccc2cc07f9f6d40fb5adb63bfabd438362b01b8d6a9e" Mar 20 08:26:28 crc kubenswrapper[4903]: I0320 08:26:28.585391 4903 scope.go:117] "RemoveContainer" containerID="b7dfbbffaa7943814566d257866f8482e119b7902ebb78c89d2303cc31e7378e" Mar 20 08:26:28 crc kubenswrapper[4903]: E0320 08:26:28.585864 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b7dfbbffaa7943814566d257866f8482e119b7902ebb78c89d2303cc31e7378e\": container with ID starting with b7dfbbffaa7943814566d257866f8482e119b7902ebb78c89d2303cc31e7378e not found: ID does not exist" containerID="b7dfbbffaa7943814566d257866f8482e119b7902ebb78c89d2303cc31e7378e" Mar 20 08:26:28 crc kubenswrapper[4903]: I0320 08:26:28.585938 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b7dfbbffaa7943814566d257866f8482e119b7902ebb78c89d2303cc31e7378e"} err="failed to get container status \"b7dfbbffaa7943814566d257866f8482e119b7902ebb78c89d2303cc31e7378e\": rpc error: code = NotFound desc = could not find container \"b7dfbbffaa7943814566d257866f8482e119b7902ebb78c89d2303cc31e7378e\": container with ID starting with b7dfbbffaa7943814566d257866f8482e119b7902ebb78c89d2303cc31e7378e not found: ID does not exist" Mar 20 08:26:28 crc kubenswrapper[4903]: I0320 08:26:28.585984 4903 scope.go:117] "RemoveContainer" containerID="cb440879ceb0866041e2746c980a3b337d8e90933832b0daf63a114c8bd26ac0" Mar 20 08:26:28 crc kubenswrapper[4903]: E0320 08:26:28.586394 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb440879ceb0866041e2746c980a3b337d8e90933832b0daf63a114c8bd26ac0\": container with ID starting with cb440879ceb0866041e2746c980a3b337d8e90933832b0daf63a114c8bd26ac0 not found: ID does not exist" containerID="cb440879ceb0866041e2746c980a3b337d8e90933832b0daf63a114c8bd26ac0" Mar 20 08:26:28 crc kubenswrapper[4903]: I0320 08:26:28.586439 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb440879ceb0866041e2746c980a3b337d8e90933832b0daf63a114c8bd26ac0"} err="failed to get container status \"cb440879ceb0866041e2746c980a3b337d8e90933832b0daf63a114c8bd26ac0\": rpc error: code = NotFound desc = could not find container \"cb440879ceb0866041e2746c980a3b337d8e90933832b0daf63a114c8bd26ac0\": container with ID starting with cb440879ceb0866041e2746c980a3b337d8e90933832b0daf63a114c8bd26ac0 not found: ID does not exist" Mar 20 08:26:28 crc kubenswrapper[4903]: I0320 08:26:28.586465 4903 scope.go:117] "RemoveContainer" 
containerID="1748ccefdb91a18ac39fccc2cc07f9f6d40fb5adb63bfabd438362b01b8d6a9e" Mar 20 08:26:28 crc kubenswrapper[4903]: E0320 08:26:28.586882 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1748ccefdb91a18ac39fccc2cc07f9f6d40fb5adb63bfabd438362b01b8d6a9e\": container with ID starting with 1748ccefdb91a18ac39fccc2cc07f9f6d40fb5adb63bfabd438362b01b8d6a9e not found: ID does not exist" containerID="1748ccefdb91a18ac39fccc2cc07f9f6d40fb5adb63bfabd438362b01b8d6a9e" Mar 20 08:26:28 crc kubenswrapper[4903]: I0320 08:26:28.586957 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1748ccefdb91a18ac39fccc2cc07f9f6d40fb5adb63bfabd438362b01b8d6a9e"} err="failed to get container status \"1748ccefdb91a18ac39fccc2cc07f9f6d40fb5adb63bfabd438362b01b8d6a9e\": rpc error: code = NotFound desc = could not find container \"1748ccefdb91a18ac39fccc2cc07f9f6d40fb5adb63bfabd438362b01b8d6a9e\": container with ID starting with 1748ccefdb91a18ac39fccc2cc07f9f6d40fb5adb63bfabd438362b01b8d6a9e not found: ID does not exist" Mar 20 08:26:29 crc kubenswrapper[4903]: I0320 08:26:29.274143 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 08:26:29 crc kubenswrapper[4903]: I0320 08:26:29.509926 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efb0ecbf-eb11-4834-8e12-668b3b9f64c8" path="/var/lib/kubelet/pods/efb0ecbf-eb11-4834-8e12-668b3b9f64c8/volumes" Mar 20 08:26:29 crc kubenswrapper[4903]: I0320 08:26:29.655351 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-5dpx6" podUID="82c317fb-5d87-47b3-849c-58b0bab4d3ef" containerName="oauth-openshift" containerID="cri-o://2d41c5e73d234158ea80c0db27197196d24cefcb41790a3b594f76501debe905" gracePeriod=15 Mar 20 08:26:30 crc kubenswrapper[4903]: I0320 08:26:30.159494 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-5dpx6" Mar 20 08:26:30 crc kubenswrapper[4903]: I0320 08:26:30.299466 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/82c317fb-5d87-47b3-849c-58b0bab4d3ef-v4-0-config-user-template-login\") pod \"82c317fb-5d87-47b3-849c-58b0bab4d3ef\" (UID: \"82c317fb-5d87-47b3-849c-58b0bab4d3ef\") " Mar 20 08:26:30 crc kubenswrapper[4903]: I0320 08:26:30.300347 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/82c317fb-5d87-47b3-849c-58b0bab4d3ef-v4-0-config-user-template-error\") pod \"82c317fb-5d87-47b3-849c-58b0bab4d3ef\" (UID: \"82c317fb-5d87-47b3-849c-58b0bab4d3ef\") " Mar 20 08:26:30 crc kubenswrapper[4903]: I0320 08:26:30.301307 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/82c317fb-5d87-47b3-849c-58b0bab4d3ef-audit-dir\") pod \"82c317fb-5d87-47b3-849c-58b0bab4d3ef\" (UID: \"82c317fb-5d87-47b3-849c-58b0bab4d3ef\") " Mar 20 08:26:30 crc kubenswrapper[4903]: I0320 08:26:30.301379 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/82c317fb-5d87-47b3-849c-58b0bab4d3ef-v4-0-config-system-service-ca\") pod \"82c317fb-5d87-47b3-849c-58b0bab4d3ef\" (UID: \"82c317fb-5d87-47b3-849c-58b0bab4d3ef\") " Mar 20 08:26:30 crc kubenswrapper[4903]: I0320 08:26:30.301447 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/82c317fb-5d87-47b3-849c-58b0bab4d3ef-v4-0-config-user-template-provider-selection\") pod \"82c317fb-5d87-47b3-849c-58b0bab4d3ef\" (UID: \"82c317fb-5d87-47b3-849c-58b0bab4d3ef\") " Mar 20 08:26:30 crc kubenswrapper[4903]: I0320 08:26:30.302248 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/82c317fb-5d87-47b3-849c-58b0bab4d3ef-v4-0-config-system-cliconfig\") pod \"82c317fb-5d87-47b3-849c-58b0bab4d3ef\" (UID: \"82c317fb-5d87-47b3-849c-58b0bab4d3ef\") " Mar 20 08:26:30 crc kubenswrapper[4903]: I0320 08:26:30.302298 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/82c317fb-5d87-47b3-849c-58b0bab4d3ef-audit-policies\") pod \"82c317fb-5d87-47b3-849c-58b0bab4d3ef\" (UID: \"82c317fb-5d87-47b3-849c-58b0bab4d3ef\") " Mar 20 08:26:30 crc kubenswrapper[4903]: I0320 08:26:30.302311 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/82c317fb-5d87-47b3-849c-58b0bab4d3ef-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "82c317fb-5d87-47b3-849c-58b0bab4d3ef" (UID: "82c317fb-5d87-47b3-849c-58b0bab4d3ef"). InnerVolumeSpecName "v4-0-config-system-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:26:30 crc kubenswrapper[4903]: I0320 08:26:30.302374 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/82c317fb-5d87-47b3-849c-58b0bab4d3ef-v4-0-config-system-session\") pod \"82c317fb-5d87-47b3-849c-58b0bab4d3ef\" (UID: \"82c317fb-5d87-47b3-849c-58b0bab4d3ef\") " Mar 20 08:26:30 crc kubenswrapper[4903]: I0320 08:26:30.302426 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/82c317fb-5d87-47b3-849c-58b0bab4d3ef-v4-0-config-system-trusted-ca-bundle\") pod \"82c317fb-5d87-47b3-849c-58b0bab4d3ef\" (UID: \"82c317fb-5d87-47b3-849c-58b0bab4d3ef\") " Mar 20 08:26:30 crc kubenswrapper[4903]: I0320 08:26:30.302479 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/82c317fb-5d87-47b3-849c-58b0bab4d3ef-v4-0-config-system-serving-cert\") pod \"82c317fb-5d87-47b3-849c-58b0bab4d3ef\" (UID: \"82c317fb-5d87-47b3-849c-58b0bab4d3ef\") " Mar 20 08:26:30 crc kubenswrapper[4903]: I0320 08:26:30.302541 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/82c317fb-5d87-47b3-849c-58b0bab4d3ef-v4-0-config-user-idp-0-file-data\") pod \"82c317fb-5d87-47b3-849c-58b0bab4d3ef\" (UID: \"82c317fb-5d87-47b3-849c-58b0bab4d3ef\") " Mar 20 08:26:30 crc kubenswrapper[4903]: I0320 08:26:30.303233 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g78nn\" (UniqueName: \"kubernetes.io/projected/82c317fb-5d87-47b3-849c-58b0bab4d3ef-kube-api-access-g78nn\") pod \"82c317fb-5d87-47b3-849c-58b0bab4d3ef\" (UID: \"82c317fb-5d87-47b3-849c-58b0bab4d3ef\") " Mar 20 08:26:30 crc kubenswrapper[4903]: I0320 08:26:30.303279 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/82c317fb-5d87-47b3-849c-58b0bab4d3ef-v4-0-config-system-router-certs\") pod \"82c317fb-5d87-47b3-849c-58b0bab4d3ef\" (UID: \"82c317fb-5d87-47b3-849c-58b0bab4d3ef\") " Mar 20 08:26:30 crc kubenswrapper[4903]: I0320 08:26:30.303321 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/82c317fb-5d87-47b3-849c-58b0bab4d3ef-v4-0-config-system-ocp-branding-template\") pod \"82c317fb-5d87-47b3-849c-58b0bab4d3ef\" (UID: \"82c317fb-5d87-47b3-849c-58b0bab4d3ef\") " Mar 20 08:26:30 crc kubenswrapper[4903]: I0320 08:26:30.303744 4903 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/82c317fb-5d87-47b3-849c-58b0bab4d3ef-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 20 08:26:30 crc kubenswrapper[4903]: I0320 08:26:30.303233 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/82c317fb-5d87-47b3-849c-58b0bab4d3ef-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "82c317fb-5d87-47b3-849c-58b0bab4d3ef" (UID: "82c317fb-5d87-47b3-849c-58b0bab4d3ef"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:26:30 crc kubenswrapper[4903]: I0320 08:26:30.301414 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/82c317fb-5d87-47b3-849c-58b0bab4d3ef-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "82c317fb-5d87-47b3-849c-58b0bab4d3ef" (UID: "82c317fb-5d87-47b3-849c-58b0bab4d3ef"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 08:26:30 crc kubenswrapper[4903]: I0320 08:26:30.303920 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/82c317fb-5d87-47b3-849c-58b0bab4d3ef-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "82c317fb-5d87-47b3-849c-58b0bab4d3ef" (UID: "82c317fb-5d87-47b3-849c-58b0bab4d3ef"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:26:30 crc kubenswrapper[4903]: I0320 08:26:30.306060 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/82c317fb-5d87-47b3-849c-58b0bab4d3ef-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "82c317fb-5d87-47b3-849c-58b0bab4d3ef" (UID: "82c317fb-5d87-47b3-849c-58b0bab4d3ef"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:26:30 crc kubenswrapper[4903]: I0320 08:26:30.308096 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82c317fb-5d87-47b3-849c-58b0bab4d3ef-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "82c317fb-5d87-47b3-849c-58b0bab4d3ef" (UID: "82c317fb-5d87-47b3-849c-58b0bab4d3ef"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:26:30 crc kubenswrapper[4903]: I0320 08:26:30.311730 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/82c317fb-5d87-47b3-849c-58b0bab4d3ef-kube-api-access-g78nn" (OuterVolumeSpecName: "kube-api-access-g78nn") pod "82c317fb-5d87-47b3-849c-58b0bab4d3ef" (UID: "82c317fb-5d87-47b3-849c-58b0bab4d3ef"). InnerVolumeSpecName "kube-api-access-g78nn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:26:30 crc kubenswrapper[4903]: I0320 08:26:30.311872 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82c317fb-5d87-47b3-849c-58b0bab4d3ef-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "82c317fb-5d87-47b3-849c-58b0bab4d3ef" (UID: "82c317fb-5d87-47b3-849c-58b0bab4d3ef"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:26:30 crc kubenswrapper[4903]: I0320 08:26:30.312174 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82c317fb-5d87-47b3-849c-58b0bab4d3ef-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "82c317fb-5d87-47b3-849c-58b0bab4d3ef" (UID: "82c317fb-5d87-47b3-849c-58b0bab4d3ef"). InnerVolumeSpecName "v4-0-config-user-template-login". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:26:30 crc kubenswrapper[4903]: I0320 08:26:30.313784 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82c317fb-5d87-47b3-849c-58b0bab4d3ef-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "82c317fb-5d87-47b3-849c-58b0bab4d3ef" (UID: "82c317fb-5d87-47b3-849c-58b0bab4d3ef"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:26:30 crc kubenswrapper[4903]: I0320 08:26:30.313964 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82c317fb-5d87-47b3-849c-58b0bab4d3ef-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "82c317fb-5d87-47b3-849c-58b0bab4d3ef" (UID: "82c317fb-5d87-47b3-849c-58b0bab4d3ef"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:26:30 crc kubenswrapper[4903]: I0320 08:26:30.314263 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82c317fb-5d87-47b3-849c-58b0bab4d3ef-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "82c317fb-5d87-47b3-849c-58b0bab4d3ef" (UID: "82c317fb-5d87-47b3-849c-58b0bab4d3ef"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:26:30 crc kubenswrapper[4903]: I0320 08:26:30.314365 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82c317fb-5d87-47b3-849c-58b0bab4d3ef-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "82c317fb-5d87-47b3-849c-58b0bab4d3ef" (UID: "82c317fb-5d87-47b3-849c-58b0bab4d3ef"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:26:30 crc kubenswrapper[4903]: I0320 08:26:30.315127 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82c317fb-5d87-47b3-849c-58b0bab4d3ef-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "82c317fb-5d87-47b3-849c-58b0bab4d3ef" (UID: "82c317fb-5d87-47b3-849c-58b0bab4d3ef"). InnerVolumeSpecName "v4-0-config-system-router-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:26:30 crc kubenswrapper[4903]: I0320 08:26:30.405771 4903 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/82c317fb-5d87-47b3-849c-58b0bab4d3ef-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 20 08:26:30 crc kubenswrapper[4903]: I0320 08:26:30.405816 4903 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/82c317fb-5d87-47b3-849c-58b0bab4d3ef-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 08:26:30 crc kubenswrapper[4903]: I0320 08:26:30.405840 4903 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/82c317fb-5d87-47b3-849c-58b0bab4d3ef-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 08:26:30 crc kubenswrapper[4903]: I0320 08:26:30.405856 4903 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/82c317fb-5d87-47b3-849c-58b0bab4d3ef-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 20 08:26:30 crc kubenswrapper[4903]: I0320 08:26:30.405872 4903 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/82c317fb-5d87-47b3-849c-58b0bab4d3ef-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 20 08:26:30 crc kubenswrapper[4903]: I0320 08:26:30.405886 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g78nn\" (UniqueName: \"kubernetes.io/projected/82c317fb-5d87-47b3-849c-58b0bab4d3ef-kube-api-access-g78nn\") on node \"crc\" DevicePath \"\"" Mar 20 08:26:30 crc kubenswrapper[4903]: I0320 08:26:30.405900 4903 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/82c317fb-5d87-47b3-849c-58b0bab4d3ef-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 20 08:26:30 crc kubenswrapper[4903]: I0320 08:26:30.405916 4903 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/82c317fb-5d87-47b3-849c-58b0bab4d3ef-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 20 08:26:30 crc kubenswrapper[4903]: I0320 08:26:30.405926 4903 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/82c317fb-5d87-47b3-849c-58b0bab4d3ef-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 20 08:26:30 crc kubenswrapper[4903]: I0320 08:26:30.405938 4903 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/82c317fb-5d87-47b3-849c-58b0bab4d3ef-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 20 08:26:30 crc kubenswrapper[4903]: I0320 08:26:30.405950 4903 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/82c317fb-5d87-47b3-849c-58b0bab4d3ef-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 20 08:26:30 crc kubenswrapper[4903]: I0320 08:26:30.405960 4903 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: 
\"kubernetes.io/configmap/82c317fb-5d87-47b3-849c-58b0bab4d3ef-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 20 08:26:30 crc kubenswrapper[4903]: I0320 08:26:30.405971 4903 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/82c317fb-5d87-47b3-849c-58b0bab4d3ef-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 20 08:26:30 crc kubenswrapper[4903]: I0320 08:26:30.523461 4903 generic.go:334] "Generic (PLEG): container finished" podID="82c317fb-5d87-47b3-849c-58b0bab4d3ef" containerID="2d41c5e73d234158ea80c0db27197196d24cefcb41790a3b594f76501debe905" exitCode=0 Mar 20 08:26:30 crc kubenswrapper[4903]: I0320 08:26:30.523539 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-5dpx6" event={"ID":"82c317fb-5d87-47b3-849c-58b0bab4d3ef","Type":"ContainerDied","Data":"2d41c5e73d234158ea80c0db27197196d24cefcb41790a3b594f76501debe905"} Mar 20 08:26:30 crc kubenswrapper[4903]: I0320 08:26:30.523585 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-5dpx6" event={"ID":"82c317fb-5d87-47b3-849c-58b0bab4d3ef","Type":"ContainerDied","Data":"ff8ab62dfbbb5883b56be3fa4ade580d22532bfa4158fdd5493bf5701843757e"} Mar 20 08:26:30 crc kubenswrapper[4903]: I0320 08:26:30.523611 4903 scope.go:117] "RemoveContainer" containerID="2d41c5e73d234158ea80c0db27197196d24cefcb41790a3b594f76501debe905" Mar 20 08:26:30 crc kubenswrapper[4903]: I0320 08:26:30.523800 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-5dpx6" Mar 20 08:26:30 crc kubenswrapper[4903]: I0320 08:26:30.564917 4903 scope.go:117] "RemoveContainer" containerID="2d41c5e73d234158ea80c0db27197196d24cefcb41790a3b594f76501debe905" Mar 20 08:26:30 crc kubenswrapper[4903]: E0320 08:26:30.565772 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2d41c5e73d234158ea80c0db27197196d24cefcb41790a3b594f76501debe905\": container with ID starting with 2d41c5e73d234158ea80c0db27197196d24cefcb41790a3b594f76501debe905 not found: ID does not exist" containerID="2d41c5e73d234158ea80c0db27197196d24cefcb41790a3b594f76501debe905" Mar 20 08:26:30 crc kubenswrapper[4903]: I0320 08:26:30.565953 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d41c5e73d234158ea80c0db27197196d24cefcb41790a3b594f76501debe905"} err="failed to get container status \"2d41c5e73d234158ea80c0db27197196d24cefcb41790a3b594f76501debe905\": rpc error: code = NotFound desc = could not find container \"2d41c5e73d234158ea80c0db27197196d24cefcb41790a3b594f76501debe905\": container with ID starting with 2d41c5e73d234158ea80c0db27197196d24cefcb41790a3b594f76501debe905 not found: ID does not exist" Mar 20 08:26:30 crc kubenswrapper[4903]: I0320 08:26:30.573859 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-5dpx6"] Mar 20 08:26:30 crc kubenswrapper[4903]: I0320 08:26:30.577281 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-5dpx6"] Mar 20 08:26:31 crc kubenswrapper[4903]: I0320 08:26:31.499925 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="82c317fb-5d87-47b3-849c-58b0bab4d3ef" path="/var/lib/kubelet/pods/82c317fb-5d87-47b3-849c-58b0bab4d3ef/volumes" Mar 
20 08:26:35 crc kubenswrapper[4903]: I0320 08:26:35.399211 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-6499b46898-8b8zg"] Mar 20 08:26:35 crc kubenswrapper[4903]: E0320 08:26:35.399781 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="efb0ecbf-eb11-4834-8e12-668b3b9f64c8" containerName="registry-server" Mar 20 08:26:35 crc kubenswrapper[4903]: I0320 08:26:35.399797 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="efb0ecbf-eb11-4834-8e12-668b3b9f64c8" containerName="registry-server" Mar 20 08:26:35 crc kubenswrapper[4903]: E0320 08:26:35.399806 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82c317fb-5d87-47b3-849c-58b0bab4d3ef" containerName="oauth-openshift" Mar 20 08:26:35 crc kubenswrapper[4903]: I0320 08:26:35.399812 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="82c317fb-5d87-47b3-849c-58b0bab4d3ef" containerName="oauth-openshift" Mar 20 08:26:35 crc kubenswrapper[4903]: E0320 08:26:35.399829 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="efb0ecbf-eb11-4834-8e12-668b3b9f64c8" containerName="extract-utilities" Mar 20 08:26:35 crc kubenswrapper[4903]: I0320 08:26:35.399835 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="efb0ecbf-eb11-4834-8e12-668b3b9f64c8" containerName="extract-utilities" Mar 20 08:26:35 crc kubenswrapper[4903]: E0320 08:26:35.399846 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="efb0ecbf-eb11-4834-8e12-668b3b9f64c8" containerName="extract-content" Mar 20 08:26:35 crc kubenswrapper[4903]: I0320 08:26:35.399854 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="efb0ecbf-eb11-4834-8e12-668b3b9f64c8" containerName="extract-content" Mar 20 08:26:35 crc kubenswrapper[4903]: I0320 08:26:35.399966 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="82c317fb-5d87-47b3-849c-58b0bab4d3ef" containerName="oauth-openshift" Mar 20 08:26:35 crc kubenswrapper[4903]: I0320 08:26:35.399984 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="efb0ecbf-eb11-4834-8e12-668b3b9f64c8" containerName="registry-server" Mar 20 08:26:35 crc kubenswrapper[4903]: I0320 08:26:35.400457 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-6499b46898-8b8zg" Mar 20 08:26:35 crc kubenswrapper[4903]: I0320 08:26:35.403199 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 20 08:26:35 crc kubenswrapper[4903]: I0320 08:26:35.403535 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 20 08:26:35 crc kubenswrapper[4903]: I0320 08:26:35.403563 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 20 08:26:35 crc kubenswrapper[4903]: I0320 08:26:35.403570 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Mar 20 08:26:35 crc kubenswrapper[4903]: I0320 08:26:35.405250 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 20 08:26:35 crc kubenswrapper[4903]: I0320 08:26:35.405539 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 20 08:26:35 crc kubenswrapper[4903]: I0320 08:26:35.405571 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 20 08:26:35 crc kubenswrapper[4903]: I0320 08:26:35.405691 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 20 08:26:35 crc kubenswrapper[4903]: I0320 08:26:35.405726 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 20 08:26:35 crc kubenswrapper[4903]: I0320 08:26:35.406018 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 20 08:26:35 crc kubenswrapper[4903]: I0320 08:26:35.414433 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 20 08:26:35 crc kubenswrapper[4903]: I0320 08:26:35.414546 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Mar 20 08:26:35 crc kubenswrapper[4903]: I0320 08:26:35.420400 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 20 08:26:35 crc kubenswrapper[4903]: I0320 08:26:35.427603 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 20 08:26:35 crc kubenswrapper[4903]: I0320 08:26:35.429754 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-6499b46898-8b8zg"] Mar 20 08:26:35 crc kubenswrapper[4903]: I0320 08:26:35.434294 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 20 08:26:35 crc kubenswrapper[4903]: I0320 08:26:35.503228 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/91451dcb-28a3-47c5-bb7a-9e0d00e4a75c-v4-0-config-system-cliconfig\") pod \"oauth-openshift-6499b46898-8b8zg\" (UID: \"91451dcb-28a3-47c5-bb7a-9e0d00e4a75c\") " 
pod="openshift-authentication/oauth-openshift-6499b46898-8b8zg" Mar 20 08:26:35 crc kubenswrapper[4903]: I0320 08:26:35.503283 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/91451dcb-28a3-47c5-bb7a-9e0d00e4a75c-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-6499b46898-8b8zg\" (UID: \"91451dcb-28a3-47c5-bb7a-9e0d00e4a75c\") " pod="openshift-authentication/oauth-openshift-6499b46898-8b8zg" Mar 20 08:26:35 crc kubenswrapper[4903]: I0320 08:26:35.503330 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/91451dcb-28a3-47c5-bb7a-9e0d00e4a75c-audit-policies\") pod \"oauth-openshift-6499b46898-8b8zg\" (UID: \"91451dcb-28a3-47c5-bb7a-9e0d00e4a75c\") " pod="openshift-authentication/oauth-openshift-6499b46898-8b8zg" Mar 20 08:26:35 crc kubenswrapper[4903]: I0320 08:26:35.503427 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/91451dcb-28a3-47c5-bb7a-9e0d00e4a75c-audit-dir\") pod \"oauth-openshift-6499b46898-8b8zg\" (UID: \"91451dcb-28a3-47c5-bb7a-9e0d00e4a75c\") " pod="openshift-authentication/oauth-openshift-6499b46898-8b8zg" Mar 20 08:26:35 crc kubenswrapper[4903]: I0320 08:26:35.503551 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/91451dcb-28a3-47c5-bb7a-9e0d00e4a75c-v4-0-config-system-service-ca\") pod \"oauth-openshift-6499b46898-8b8zg\" (UID: \"91451dcb-28a3-47c5-bb7a-9e0d00e4a75c\") " pod="openshift-authentication/oauth-openshift-6499b46898-8b8zg" Mar 20 08:26:35 crc kubenswrapper[4903]: I0320 08:26:35.503585 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/91451dcb-28a3-47c5-bb7a-9e0d00e4a75c-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-6499b46898-8b8zg\" (UID: \"91451dcb-28a3-47c5-bb7a-9e0d00e4a75c\") " pod="openshift-authentication/oauth-openshift-6499b46898-8b8zg" Mar 20 08:26:35 crc kubenswrapper[4903]: I0320 08:26:35.503624 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/91451dcb-28a3-47c5-bb7a-9e0d00e4a75c-v4-0-config-user-template-login\") pod \"oauth-openshift-6499b46898-8b8zg\" (UID: \"91451dcb-28a3-47c5-bb7a-9e0d00e4a75c\") " pod="openshift-authentication/oauth-openshift-6499b46898-8b8zg" Mar 20 08:26:35 crc kubenswrapper[4903]: I0320 08:26:35.503672 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/91451dcb-28a3-47c5-bb7a-9e0d00e4a75c-v4-0-config-user-template-error\") pod \"oauth-openshift-6499b46898-8b8zg\" (UID: \"91451dcb-28a3-47c5-bb7a-9e0d00e4a75c\") " pod="openshift-authentication/oauth-openshift-6499b46898-8b8zg" Mar 20 08:26:35 crc kubenswrapper[4903]: I0320 08:26:35.503714 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/91451dcb-28a3-47c5-bb7a-9e0d00e4a75c-v4-0-config-system-serving-cert\") pod \"oauth-openshift-6499b46898-8b8zg\" (UID: \"91451dcb-28a3-47c5-bb7a-9e0d00e4a75c\") " pod="openshift-authentication/oauth-openshift-6499b46898-8b8zg" Mar 20 08:26:35 crc kubenswrapper[4903]: I0320 08:26:35.503752 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/91451dcb-28a3-47c5-bb7a-9e0d00e4a75c-v4-0-config-system-session\") pod \"oauth-openshift-6499b46898-8b8zg\" (UID: \"91451dcb-28a3-47c5-bb7a-9e0d00e4a75c\") " pod="openshift-authentication/oauth-openshift-6499b46898-8b8zg" Mar 20 08:26:35 crc kubenswrapper[4903]: I0320 08:26:35.503780 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/91451dcb-28a3-47c5-bb7a-9e0d00e4a75c-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-6499b46898-8b8zg\" (UID: \"91451dcb-28a3-47c5-bb7a-9e0d00e4a75c\") " pod="openshift-authentication/oauth-openshift-6499b46898-8b8zg" Mar 20 08:26:35 crc kubenswrapper[4903]: I0320 08:26:35.503811 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/91451dcb-28a3-47c5-bb7a-9e0d00e4a75c-v4-0-config-system-router-certs\") pod \"oauth-openshift-6499b46898-8b8zg\" (UID: \"91451dcb-28a3-47c5-bb7a-9e0d00e4a75c\") " pod="openshift-authentication/oauth-openshift-6499b46898-8b8zg" Mar 20 08:26:35 crc kubenswrapper[4903]: I0320 08:26:35.503844 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xcffs\" (UniqueName: \"kubernetes.io/projected/91451dcb-28a3-47c5-bb7a-9e0d00e4a75c-kube-api-access-xcffs\") pod \"oauth-openshift-6499b46898-8b8zg\" (UID: \"91451dcb-28a3-47c5-bb7a-9e0d00e4a75c\") " pod="openshift-authentication/oauth-openshift-6499b46898-8b8zg" Mar 20 08:26:35 crc kubenswrapper[4903]: I0320 08:26:35.503883 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/91451dcb-28a3-47c5-bb7a-9e0d00e4a75c-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-6499b46898-8b8zg\" (UID: \"91451dcb-28a3-47c5-bb7a-9e0d00e4a75c\") " pod="openshift-authentication/oauth-openshift-6499b46898-8b8zg" Mar 20 08:26:35 crc kubenswrapper[4903]: I0320 08:26:35.604803 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/91451dcb-28a3-47c5-bb7a-9e0d00e4a75c-v4-0-config-system-router-certs\") pod \"oauth-openshift-6499b46898-8b8zg\" (UID: \"91451dcb-28a3-47c5-bb7a-9e0d00e4a75c\") " pod="openshift-authentication/oauth-openshift-6499b46898-8b8zg" Mar 20 08:26:35 crc kubenswrapper[4903]: I0320 08:26:35.604859 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xcffs\" (UniqueName: \"kubernetes.io/projected/91451dcb-28a3-47c5-bb7a-9e0d00e4a75c-kube-api-access-xcffs\") pod \"oauth-openshift-6499b46898-8b8zg\" (UID: \"91451dcb-28a3-47c5-bb7a-9e0d00e4a75c\") " pod="openshift-authentication/oauth-openshift-6499b46898-8b8zg" Mar 20 08:26:35 crc kubenswrapper[4903]: I0320 08:26:35.604885 4903 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/91451dcb-28a3-47c5-bb7a-9e0d00e4a75c-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-6499b46898-8b8zg\" (UID: \"91451dcb-28a3-47c5-bb7a-9e0d00e4a75c\") " pod="openshift-authentication/oauth-openshift-6499b46898-8b8zg" Mar 20 08:26:35 crc kubenswrapper[4903]: I0320 08:26:35.604930 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/91451dcb-28a3-47c5-bb7a-9e0d00e4a75c-v4-0-config-system-cliconfig\") pod \"oauth-openshift-6499b46898-8b8zg\" (UID: \"91451dcb-28a3-47c5-bb7a-9e0d00e4a75c\") " pod="openshift-authentication/oauth-openshift-6499b46898-8b8zg" Mar 20 08:26:35 crc kubenswrapper[4903]: I0320 08:26:35.604960 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/91451dcb-28a3-47c5-bb7a-9e0d00e4a75c-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-6499b46898-8b8zg\" (UID: \"91451dcb-28a3-47c5-bb7a-9e0d00e4a75c\") " pod="openshift-authentication/oauth-openshift-6499b46898-8b8zg" Mar 20 08:26:35 crc kubenswrapper[4903]: I0320 08:26:35.605013 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/91451dcb-28a3-47c5-bb7a-9e0d00e4a75c-audit-policies\") pod \"oauth-openshift-6499b46898-8b8zg\" (UID: \"91451dcb-28a3-47c5-bb7a-9e0d00e4a75c\") " pod="openshift-authentication/oauth-openshift-6499b46898-8b8zg" Mar 20 08:26:35 crc kubenswrapper[4903]: I0320 08:26:35.605049 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/91451dcb-28a3-47c5-bb7a-9e0d00e4a75c-audit-dir\") pod \"oauth-openshift-6499b46898-8b8zg\" (UID: \"91451dcb-28a3-47c5-bb7a-9e0d00e4a75c\") " pod="openshift-authentication/oauth-openshift-6499b46898-8b8zg" Mar 20 08:26:35 crc kubenswrapper[4903]: I0320 08:26:35.605078 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/91451dcb-28a3-47c5-bb7a-9e0d00e4a75c-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-6499b46898-8b8zg\" (UID: \"91451dcb-28a3-47c5-bb7a-9e0d00e4a75c\") " pod="openshift-authentication/oauth-openshift-6499b46898-8b8zg" Mar 20 08:26:35 crc kubenswrapper[4903]: I0320 08:26:35.605097 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/91451dcb-28a3-47c5-bb7a-9e0d00e4a75c-v4-0-config-system-service-ca\") pod \"oauth-openshift-6499b46898-8b8zg\" (UID: \"91451dcb-28a3-47c5-bb7a-9e0d00e4a75c\") " pod="openshift-authentication/oauth-openshift-6499b46898-8b8zg" Mar 20 08:26:35 crc kubenswrapper[4903]: I0320 08:26:35.605115 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/91451dcb-28a3-47c5-bb7a-9e0d00e4a75c-v4-0-config-user-template-login\") pod \"oauth-openshift-6499b46898-8b8zg\" (UID: \"91451dcb-28a3-47c5-bb7a-9e0d00e4a75c\") " pod="openshift-authentication/oauth-openshift-6499b46898-8b8zg" Mar 20 08:26:35 crc kubenswrapper[4903]: I0320 08:26:35.605134 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/91451dcb-28a3-47c5-bb7a-9e0d00e4a75c-v4-0-config-user-template-error\") pod \"oauth-openshift-6499b46898-8b8zg\" (UID: \"91451dcb-28a3-47c5-bb7a-9e0d00e4a75c\") " pod="openshift-authentication/oauth-openshift-6499b46898-8b8zg" Mar 20 08:26:35 crc kubenswrapper[4903]: I0320 08:26:35.605184 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/91451dcb-28a3-47c5-bb7a-9e0d00e4a75c-v4-0-config-system-serving-cert\") pod \"oauth-openshift-6499b46898-8b8zg\" (UID: \"91451dcb-28a3-47c5-bb7a-9e0d00e4a75c\") " pod="openshift-authentication/oauth-openshift-6499b46898-8b8zg" Mar 20 08:26:35 crc kubenswrapper[4903]: I0320 08:26:35.605207 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/91451dcb-28a3-47c5-bb7a-9e0d00e4a75c-v4-0-config-system-session\") pod \"oauth-openshift-6499b46898-8b8zg\" (UID: \"91451dcb-28a3-47c5-bb7a-9e0d00e4a75c\") " pod="openshift-authentication/oauth-openshift-6499b46898-8b8zg" Mar 20 08:26:35 crc kubenswrapper[4903]: I0320 08:26:35.605226 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/91451dcb-28a3-47c5-bb7a-9e0d00e4a75c-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-6499b46898-8b8zg\" (UID: \"91451dcb-28a3-47c5-bb7a-9e0d00e4a75c\") " pod="openshift-authentication/oauth-openshift-6499b46898-8b8zg" Mar 20 08:26:35 crc kubenswrapper[4903]: I0320 08:26:35.605447 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/91451dcb-28a3-47c5-bb7a-9e0d00e4a75c-audit-dir\") pod \"oauth-openshift-6499b46898-8b8zg\" (UID: \"91451dcb-28a3-47c5-bb7a-9e0d00e4a75c\") " pod="openshift-authentication/oauth-openshift-6499b46898-8b8zg" Mar 20 08:26:35 crc kubenswrapper[4903]: I0320 08:26:35.606806 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/91451dcb-28a3-47c5-bb7a-9e0d00e4a75c-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-6499b46898-8b8zg\" (UID: \"91451dcb-28a3-47c5-bb7a-9e0d00e4a75c\") " pod="openshift-authentication/oauth-openshift-6499b46898-8b8zg" Mar 20 08:26:35 crc kubenswrapper[4903]: I0320 08:26:35.607245 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/91451dcb-28a3-47c5-bb7a-9e0d00e4a75c-v4-0-config-system-cliconfig\") pod \"oauth-openshift-6499b46898-8b8zg\" (UID: \"91451dcb-28a3-47c5-bb7a-9e0d00e4a75c\") " pod="openshift-authentication/oauth-openshift-6499b46898-8b8zg" Mar 20 08:26:35 crc kubenswrapper[4903]: I0320 08:26:35.607829 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/91451dcb-28a3-47c5-bb7a-9e0d00e4a75c-audit-policies\") pod \"oauth-openshift-6499b46898-8b8zg\" (UID: \"91451dcb-28a3-47c5-bb7a-9e0d00e4a75c\") " pod="openshift-authentication/oauth-openshift-6499b46898-8b8zg" Mar 20 08:26:35 crc kubenswrapper[4903]: I0320 08:26:35.609858 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/91451dcb-28a3-47c5-bb7a-9e0d00e4a75c-v4-0-config-system-service-ca\") pod \"oauth-openshift-6499b46898-8b8zg\" (UID: \"91451dcb-28a3-47c5-bb7a-9e0d00e4a75c\") " pod="openshift-authentication/oauth-openshift-6499b46898-8b8zg" Mar 20 08:26:35 crc kubenswrapper[4903]: I0320 08:26:35.611198 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/91451dcb-28a3-47c5-bb7a-9e0d00e4a75c-v4-0-config-system-router-certs\") pod \"oauth-openshift-6499b46898-8b8zg\" (UID: \"91451dcb-28a3-47c5-bb7a-9e0d00e4a75c\") " pod="openshift-authentication/oauth-openshift-6499b46898-8b8zg" Mar 20 08:26:35 crc kubenswrapper[4903]: I0320 08:26:35.612009 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/91451dcb-28a3-47c5-bb7a-9e0d00e4a75c-v4-0-config-system-session\") pod \"oauth-openshift-6499b46898-8b8zg\" (UID: \"91451dcb-28a3-47c5-bb7a-9e0d00e4a75c\") " pod="openshift-authentication/oauth-openshift-6499b46898-8b8zg" Mar 20 08:26:35 crc kubenswrapper[4903]: I0320 08:26:35.614330 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/91451dcb-28a3-47c5-bb7a-9e0d00e4a75c-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-6499b46898-8b8zg\" (UID: \"91451dcb-28a3-47c5-bb7a-9e0d00e4a75c\") " pod="openshift-authentication/oauth-openshift-6499b46898-8b8zg" Mar 20 08:26:35 crc kubenswrapper[4903]: I0320 08:26:35.615404 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/91451dcb-28a3-47c5-bb7a-9e0d00e4a75c-v4-0-config-user-template-error\") pod \"oauth-openshift-6499b46898-8b8zg\" (UID: \"91451dcb-28a3-47c5-bb7a-9e0d00e4a75c\") " pod="openshift-authentication/oauth-openshift-6499b46898-8b8zg" Mar 20 08:26:35 crc kubenswrapper[4903]: I0320 08:26:35.615745 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/91451dcb-28a3-47c5-bb7a-9e0d00e4a75c-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-6499b46898-8b8zg\" (UID: \"91451dcb-28a3-47c5-bb7a-9e0d00e4a75c\") " pod="openshift-authentication/oauth-openshift-6499b46898-8b8zg" Mar 20 08:26:35 crc kubenswrapper[4903]: I0320 08:26:35.615905 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/91451dcb-28a3-47c5-bb7a-9e0d00e4a75c-v4-0-config-system-serving-cert\") pod \"oauth-openshift-6499b46898-8b8zg\" (UID: \"91451dcb-28a3-47c5-bb7a-9e0d00e4a75c\") " pod="openshift-authentication/oauth-openshift-6499b46898-8b8zg" Mar 20 08:26:35 crc kubenswrapper[4903]: I0320 08:26:35.616531 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/91451dcb-28a3-47c5-bb7a-9e0d00e4a75c-v4-0-config-user-template-login\") pod \"oauth-openshift-6499b46898-8b8zg\" (UID: \"91451dcb-28a3-47c5-bb7a-9e0d00e4a75c\") " pod="openshift-authentication/oauth-openshift-6499b46898-8b8zg" Mar 20 08:26:35 crc kubenswrapper[4903]: I0320 08:26:35.623821 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: 
\"kubernetes.io/secret/91451dcb-28a3-47c5-bb7a-9e0d00e4a75c-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-6499b46898-8b8zg\" (UID: \"91451dcb-28a3-47c5-bb7a-9e0d00e4a75c\") " pod="openshift-authentication/oauth-openshift-6499b46898-8b8zg" Mar 20 08:26:35 crc kubenswrapper[4903]: I0320 08:26:35.628421 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xcffs\" (UniqueName: \"kubernetes.io/projected/91451dcb-28a3-47c5-bb7a-9e0d00e4a75c-kube-api-access-xcffs\") pod \"oauth-openshift-6499b46898-8b8zg\" (UID: \"91451dcb-28a3-47c5-bb7a-9e0d00e4a75c\") " pod="openshift-authentication/oauth-openshift-6499b46898-8b8zg" Mar 20 08:26:35 crc kubenswrapper[4903]: I0320 08:26:35.721912 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-6499b46898-8b8zg" Mar 20 08:26:36 crc kubenswrapper[4903]: I0320 08:26:36.292952 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-6499b46898-8b8zg"] Mar 20 08:26:36 crc kubenswrapper[4903]: I0320 08:26:36.569612 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-6499b46898-8b8zg" event={"ID":"91451dcb-28a3-47c5-bb7a-9e0d00e4a75c","Type":"ContainerStarted","Data":"826de20f24143b40ef278f1a208b8ac3b1143be462ea4c2ff84ebf78870ab712"} Mar 20 08:26:37 crc kubenswrapper[4903]: I0320 08:26:37.581962 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-6499b46898-8b8zg" event={"ID":"91451dcb-28a3-47c5-bb7a-9e0d00e4a75c","Type":"ContainerStarted","Data":"f728d058a6dc5ae3ed8d43673d1962272621fb4d81adfb2fb04f12757ebce208"} Mar 20 08:26:37 crc kubenswrapper[4903]: I0320 08:26:37.582575 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-6499b46898-8b8zg" Mar 20 08:26:37 crc kubenswrapper[4903]: I0320 08:26:37.591575 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-6499b46898-8b8zg" Mar 20 08:26:37 crc kubenswrapper[4903]: I0320 08:26:37.661340 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-6499b46898-8b8zg" podStartSLOduration=33.661307034000004 podStartE2EDuration="33.661307034s" podCreationTimestamp="2026-03-20 08:26:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:26:37.619326219 +0000 UTC m=+222.836226574" watchObservedRunningTime="2026-03-20 08:26:37.661307034 +0000 UTC m=+222.878207389" Mar 20 08:26:42 crc kubenswrapper[4903]: I0320 08:26:42.425890 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6bcf46f896-qp59n"] Mar 20 08:26:42 crc kubenswrapper[4903]: I0320 08:26:42.426753 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-6bcf46f896-qp59n" podUID="418955f1-ca02-4a7a-b422-7cb0a69f72d6" containerName="controller-manager" containerID="cri-o://acda02e5f78dcbc327b05df74924014c67fedb66cf1b45f4ae61a8390b40aca2" gracePeriod=30 Mar 20 08:26:42 crc kubenswrapper[4903]: I0320 08:26:42.516646 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7fcb66cc6-2cb2n"] Mar 20 08:26:42 crc kubenswrapper[4903]: I0320 
08:26:42.516884 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-7fcb66cc6-2cb2n" podUID="270e0b13-6f9e-4405-9728-d9bbfc02074b" containerName="route-controller-manager" containerID="cri-o://16c5daaca2bb7728481da4fc02621d27d0bbf5fc7af31587a055c6fd96ea1f35" gracePeriod=30 Mar 20 08:26:42 crc kubenswrapper[4903]: I0320 08:26:42.625461 4903 generic.go:334] "Generic (PLEG): container finished" podID="418955f1-ca02-4a7a-b422-7cb0a69f72d6" containerID="acda02e5f78dcbc327b05df74924014c67fedb66cf1b45f4ae61a8390b40aca2" exitCode=0 Mar 20 08:26:42 crc kubenswrapper[4903]: I0320 08:26:42.625556 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6bcf46f896-qp59n" event={"ID":"418955f1-ca02-4a7a-b422-7cb0a69f72d6","Type":"ContainerDied","Data":"acda02e5f78dcbc327b05df74924014c67fedb66cf1b45f4ae61a8390b40aca2"} Mar 20 08:26:42 crc kubenswrapper[4903]: I0320 08:26:42.998669 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6bcf46f896-qp59n" Mar 20 08:26:43 crc kubenswrapper[4903]: I0320 08:26:43.004408 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7fcb66cc6-2cb2n" Mar 20 08:26:43 crc kubenswrapper[4903]: I0320 08:26:43.138596 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/418955f1-ca02-4a7a-b422-7cb0a69f72d6-config\") pod \"418955f1-ca02-4a7a-b422-7cb0a69f72d6\" (UID: \"418955f1-ca02-4a7a-b422-7cb0a69f72d6\") " Mar 20 08:26:43 crc kubenswrapper[4903]: I0320 08:26:43.138667 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/270e0b13-6f9e-4405-9728-d9bbfc02074b-config\") pod \"270e0b13-6f9e-4405-9728-d9bbfc02074b\" (UID: \"270e0b13-6f9e-4405-9728-d9bbfc02074b\") " Mar 20 08:26:43 crc kubenswrapper[4903]: I0320 08:26:43.138772 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gplh8\" (UniqueName: \"kubernetes.io/projected/418955f1-ca02-4a7a-b422-7cb0a69f72d6-kube-api-access-gplh8\") pod \"418955f1-ca02-4a7a-b422-7cb0a69f72d6\" (UID: \"418955f1-ca02-4a7a-b422-7cb0a69f72d6\") " Mar 20 08:26:43 crc kubenswrapper[4903]: I0320 08:26:43.138817 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fgb8b\" (UniqueName: \"kubernetes.io/projected/270e0b13-6f9e-4405-9728-d9bbfc02074b-kube-api-access-fgb8b\") pod \"270e0b13-6f9e-4405-9728-d9bbfc02074b\" (UID: \"270e0b13-6f9e-4405-9728-d9bbfc02074b\") " Mar 20 08:26:43 crc kubenswrapper[4903]: I0320 08:26:43.138905 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/418955f1-ca02-4a7a-b422-7cb0a69f72d6-serving-cert\") pod \"418955f1-ca02-4a7a-b422-7cb0a69f72d6\" (UID: \"418955f1-ca02-4a7a-b422-7cb0a69f72d6\") " Mar 20 08:26:43 crc kubenswrapper[4903]: I0320 08:26:43.138951 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/270e0b13-6f9e-4405-9728-d9bbfc02074b-serving-cert\") pod \"270e0b13-6f9e-4405-9728-d9bbfc02074b\" (UID: \"270e0b13-6f9e-4405-9728-d9bbfc02074b\") " Mar 20 08:26:43 crc kubenswrapper[4903]: I0320 
08:26:43.138992 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/270e0b13-6f9e-4405-9728-d9bbfc02074b-client-ca\") pod \"270e0b13-6f9e-4405-9728-d9bbfc02074b\" (UID: \"270e0b13-6f9e-4405-9728-d9bbfc02074b\") " Mar 20 08:26:43 crc kubenswrapper[4903]: I0320 08:26:43.139098 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/418955f1-ca02-4a7a-b422-7cb0a69f72d6-proxy-ca-bundles\") pod \"418955f1-ca02-4a7a-b422-7cb0a69f72d6\" (UID: \"418955f1-ca02-4a7a-b422-7cb0a69f72d6\") " Mar 20 08:26:43 crc kubenswrapper[4903]: I0320 08:26:43.139133 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/418955f1-ca02-4a7a-b422-7cb0a69f72d6-client-ca\") pod \"418955f1-ca02-4a7a-b422-7cb0a69f72d6\" (UID: \"418955f1-ca02-4a7a-b422-7cb0a69f72d6\") " Mar 20 08:26:43 crc kubenswrapper[4903]: I0320 08:26:43.140176 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/418955f1-ca02-4a7a-b422-7cb0a69f72d6-client-ca" (OuterVolumeSpecName: "client-ca") pod "418955f1-ca02-4a7a-b422-7cb0a69f72d6" (UID: "418955f1-ca02-4a7a-b422-7cb0a69f72d6"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:26:43 crc kubenswrapper[4903]: I0320 08:26:43.140259 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/418955f1-ca02-4a7a-b422-7cb0a69f72d6-config" (OuterVolumeSpecName: "config") pod "418955f1-ca02-4a7a-b422-7cb0a69f72d6" (UID: "418955f1-ca02-4a7a-b422-7cb0a69f72d6"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:26:43 crc kubenswrapper[4903]: I0320 08:26:43.140482 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/270e0b13-6f9e-4405-9728-d9bbfc02074b-config" (OuterVolumeSpecName: "config") pod "270e0b13-6f9e-4405-9728-d9bbfc02074b" (UID: "270e0b13-6f9e-4405-9728-d9bbfc02074b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:26:43 crc kubenswrapper[4903]: I0320 08:26:43.141106 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/270e0b13-6f9e-4405-9728-d9bbfc02074b-client-ca" (OuterVolumeSpecName: "client-ca") pod "270e0b13-6f9e-4405-9728-d9bbfc02074b" (UID: "270e0b13-6f9e-4405-9728-d9bbfc02074b"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:26:43 crc kubenswrapper[4903]: I0320 08:26:43.141366 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/418955f1-ca02-4a7a-b422-7cb0a69f72d6-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "418955f1-ca02-4a7a-b422-7cb0a69f72d6" (UID: "418955f1-ca02-4a7a-b422-7cb0a69f72d6"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:26:43 crc kubenswrapper[4903]: I0320 08:26:43.145849 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/270e0b13-6f9e-4405-9728-d9bbfc02074b-kube-api-access-fgb8b" (OuterVolumeSpecName: "kube-api-access-fgb8b") pod "270e0b13-6f9e-4405-9728-d9bbfc02074b" (UID: "270e0b13-6f9e-4405-9728-d9bbfc02074b"). InnerVolumeSpecName "kube-api-access-fgb8b". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:26:43 crc kubenswrapper[4903]: I0320 08:26:43.146297 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/418955f1-ca02-4a7a-b422-7cb0a69f72d6-kube-api-access-gplh8" (OuterVolumeSpecName: "kube-api-access-gplh8") pod "418955f1-ca02-4a7a-b422-7cb0a69f72d6" (UID: "418955f1-ca02-4a7a-b422-7cb0a69f72d6"). InnerVolumeSpecName "kube-api-access-gplh8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:26:43 crc kubenswrapper[4903]: I0320 08:26:43.149262 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/418955f1-ca02-4a7a-b422-7cb0a69f72d6-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "418955f1-ca02-4a7a-b422-7cb0a69f72d6" (UID: "418955f1-ca02-4a7a-b422-7cb0a69f72d6"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:26:43 crc kubenswrapper[4903]: I0320 08:26:43.155772 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/270e0b13-6f9e-4405-9728-d9bbfc02074b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "270e0b13-6f9e-4405-9728-d9bbfc02074b" (UID: "270e0b13-6f9e-4405-9728-d9bbfc02074b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:26:43 crc kubenswrapper[4903]: I0320 08:26:43.240856 4903 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/418955f1-ca02-4a7a-b422-7cb0a69f72d6-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 08:26:43 crc kubenswrapper[4903]: I0320 08:26:43.240904 4903 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/270e0b13-6f9e-4405-9728-d9bbfc02074b-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 08:26:43 crc kubenswrapper[4903]: I0320 08:26:43.240917 4903 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/270e0b13-6f9e-4405-9728-d9bbfc02074b-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 08:26:43 crc kubenswrapper[4903]: I0320 08:26:43.240927 4903 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/418955f1-ca02-4a7a-b422-7cb0a69f72d6-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 20 08:26:43 crc kubenswrapper[4903]: I0320 08:26:43.240941 4903 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/418955f1-ca02-4a7a-b422-7cb0a69f72d6-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 08:26:43 crc kubenswrapper[4903]: I0320 08:26:43.240978 4903 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/418955f1-ca02-4a7a-b422-7cb0a69f72d6-config\") on node \"crc\" DevicePath \"\"" Mar 20 08:26:43 crc kubenswrapper[4903]: I0320 08:26:43.240990 4903 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/270e0b13-6f9e-4405-9728-d9bbfc02074b-config\") on node \"crc\" DevicePath \"\"" Mar 20 08:26:43 crc kubenswrapper[4903]: I0320 08:26:43.241003 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gplh8\" (UniqueName: \"kubernetes.io/projected/418955f1-ca02-4a7a-b422-7cb0a69f72d6-kube-api-access-gplh8\") on node \"crc\" DevicePath \"\"" Mar 20 08:26:43 crc kubenswrapper[4903]: I0320 08:26:43.241014 4903 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fgb8b\" (UniqueName: \"kubernetes.io/projected/270e0b13-6f9e-4405-9728-d9bbfc02074b-kube-api-access-fgb8b\") on node \"crc\" DevicePath \"\"" Mar 20 08:26:43 crc kubenswrapper[4903]: I0320 08:26:43.637187 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6bcf46f896-qp59n" Mar 20 08:26:43 crc kubenswrapper[4903]: I0320 08:26:43.637175 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6bcf46f896-qp59n" event={"ID":"418955f1-ca02-4a7a-b422-7cb0a69f72d6","Type":"ContainerDied","Data":"a4bd4d6ef1e0af4fdf656947e6a8aaac8cdcbd2dc1e376403708ce03b6caf54d"} Mar 20 08:26:43 crc kubenswrapper[4903]: I0320 08:26:43.637386 4903 scope.go:117] "RemoveContainer" containerID="acda02e5f78dcbc327b05df74924014c67fedb66cf1b45f4ae61a8390b40aca2" Mar 20 08:26:43 crc kubenswrapper[4903]: I0320 08:26:43.643148 4903 generic.go:334] "Generic (PLEG): container finished" podID="270e0b13-6f9e-4405-9728-d9bbfc02074b" containerID="16c5daaca2bb7728481da4fc02621d27d0bbf5fc7af31587a055c6fd96ea1f35" exitCode=0 Mar 20 08:26:43 crc kubenswrapper[4903]: I0320 08:26:43.643207 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7fcb66cc6-2cb2n" event={"ID":"270e0b13-6f9e-4405-9728-d9bbfc02074b","Type":"ContainerDied","Data":"16c5daaca2bb7728481da4fc02621d27d0bbf5fc7af31587a055c6fd96ea1f35"} Mar 20 08:26:43 crc kubenswrapper[4903]: I0320 08:26:43.643244 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7fcb66cc6-2cb2n" event={"ID":"270e0b13-6f9e-4405-9728-d9bbfc02074b","Type":"ContainerDied","Data":"e4524c90533875c3012962d7698396afc10f86b1c35e43e414d20e86d7ce1b19"} Mar 20 08:26:43 crc kubenswrapper[4903]: I0320 08:26:43.643366 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7fcb66cc6-2cb2n" Mar 20 08:26:43 crc kubenswrapper[4903]: I0320 08:26:43.675317 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6bcf46f896-qp59n"] Mar 20 08:26:43 crc kubenswrapper[4903]: I0320 08:26:43.681470 4903 scope.go:117] "RemoveContainer" containerID="16c5daaca2bb7728481da4fc02621d27d0bbf5fc7af31587a055c6fd96ea1f35" Mar 20 08:26:43 crc kubenswrapper[4903]: I0320 08:26:43.682475 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-6bcf46f896-qp59n"] Mar 20 08:26:43 crc kubenswrapper[4903]: I0320 08:26:43.695364 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7fcb66cc6-2cb2n"] Mar 20 08:26:43 crc kubenswrapper[4903]: I0320 08:26:43.701424 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7fcb66cc6-2cb2n"] Mar 20 08:26:43 crc kubenswrapper[4903]: I0320 08:26:43.720967 4903 scope.go:117] "RemoveContainer" containerID="16c5daaca2bb7728481da4fc02621d27d0bbf5fc7af31587a055c6fd96ea1f35" Mar 20 08:26:43 crc kubenswrapper[4903]: E0320 08:26:43.721846 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"16c5daaca2bb7728481da4fc02621d27d0bbf5fc7af31587a055c6fd96ea1f35\": container with ID starting with 16c5daaca2bb7728481da4fc02621d27d0bbf5fc7af31587a055c6fd96ea1f35 not found: ID does not exist" containerID="16c5daaca2bb7728481da4fc02621d27d0bbf5fc7af31587a055c6fd96ea1f35" Mar 20 08:26:43 crc kubenswrapper[4903]: I0320 08:26:43.721901 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"16c5daaca2bb7728481da4fc02621d27d0bbf5fc7af31587a055c6fd96ea1f35"} err="failed to get container status \"16c5daaca2bb7728481da4fc02621d27d0bbf5fc7af31587a055c6fd96ea1f35\": rpc error: code = NotFound desc = could not find container \"16c5daaca2bb7728481da4fc02621d27d0bbf5fc7af31587a055c6fd96ea1f35\": container with ID starting with 16c5daaca2bb7728481da4fc02621d27d0bbf5fc7af31587a055c6fd96ea1f35 not found: ID does not exist" Mar 20 08:26:43 crc kubenswrapper[4903]: I0320 08:26:43.830962 4903 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 20 08:26:43 crc kubenswrapper[4903]: I0320 08:26:43.831641 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://8ef0faf48a64d1f9ab296076561f444dac491f6a937100dc745062799ac14533" gracePeriod=15 Mar 20 08:26:43 crc kubenswrapper[4903]: I0320 08:26:43.831726 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://a9201ff62735663a1211ec23c141098838c105950d0ff00ae94d81e98de98b7b" gracePeriod=15 Mar 20 08:26:43 crc kubenswrapper[4903]: I0320 08:26:43.831872 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" 
containerID="cri-o://95ed5ac9613b849264d6577a5d37580c9b674adfe07c5d93b5a34251dab97a97" gracePeriod=15 Mar 20 08:26:43 crc kubenswrapper[4903]: I0320 08:26:43.831970 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://84dc5fbce1c40b3a5ff4df4082324127ad8c9fb05387581a62eb218551dfdcda" gracePeriod=15 Mar 20 08:26:43 crc kubenswrapper[4903]: I0320 08:26:43.831867 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://ff47d5e9db1398c42b35dce1fbcca05073c8e28b5c7187174de7f355065ec374" gracePeriod=15 Mar 20 08:26:43 crc kubenswrapper[4903]: I0320 08:26:43.835353 4903 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 20 08:26:43 crc kubenswrapper[4903]: E0320 08:26:43.835872 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 20 08:26:43 crc kubenswrapper[4903]: I0320 08:26:43.836015 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 20 08:26:43 crc kubenswrapper[4903]: E0320 08:26:43.836176 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="418955f1-ca02-4a7a-b422-7cb0a69f72d6" containerName="controller-manager" Mar 20 08:26:43 crc kubenswrapper[4903]: I0320 08:26:43.836289 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="418955f1-ca02-4a7a-b422-7cb0a69f72d6" containerName="controller-manager" Mar 20 08:26:43 crc kubenswrapper[4903]: E0320 08:26:43.836399 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 08:26:43 crc kubenswrapper[4903]: I0320 08:26:43.836508 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 08:26:43 crc kubenswrapper[4903]: E0320 08:26:43.836619 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 08:26:43 crc kubenswrapper[4903]: I0320 08:26:43.836724 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 08:26:43 crc kubenswrapper[4903]: E0320 08:26:43.836824 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 20 08:26:43 crc kubenswrapper[4903]: I0320 08:26:43.836920 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 20 08:26:43 crc kubenswrapper[4903]: E0320 08:26:43.837022 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="270e0b13-6f9e-4405-9728-d9bbfc02074b" containerName="route-controller-manager" Mar 20 08:26:43 crc kubenswrapper[4903]: I0320 08:26:43.837163 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="270e0b13-6f9e-4405-9728-d9bbfc02074b" containerName="route-controller-manager" Mar 20 08:26:43 crc 
kubenswrapper[4903]: E0320 08:26:43.837292 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 08:26:43 crc kubenswrapper[4903]: I0320 08:26:43.837439 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 08:26:43 crc kubenswrapper[4903]: E0320 08:26:43.837582 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 08:26:43 crc kubenswrapper[4903]: I0320 08:26:43.837696 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 08:26:43 crc kubenswrapper[4903]: E0320 08:26:43.837821 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 20 08:26:43 crc kubenswrapper[4903]: I0320 08:26:43.837949 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 20 08:26:43 crc kubenswrapper[4903]: E0320 08:26:43.838136 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 20 08:26:43 crc kubenswrapper[4903]: I0320 08:26:43.838297 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 20 08:26:43 crc kubenswrapper[4903]: E0320 08:26:43.838449 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 20 08:26:43 crc kubenswrapper[4903]: I0320 08:26:43.838598 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 20 08:26:43 crc kubenswrapper[4903]: E0320 08:26:43.838727 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 08:26:43 crc kubenswrapper[4903]: I0320 08:26:43.838867 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 08:26:43 crc kubenswrapper[4903]: I0320 08:26:43.839190 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 20 08:26:43 crc kubenswrapper[4903]: I0320 08:26:43.839335 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="418955f1-ca02-4a7a-b422-7cb0a69f72d6" containerName="controller-manager" Mar 20 08:26:43 crc kubenswrapper[4903]: I0320 08:26:43.839508 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 08:26:43 crc kubenswrapper[4903]: I0320 08:26:43.839657 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 20 08:26:43 crc kubenswrapper[4903]: I0320 08:26:43.839806 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="270e0b13-6f9e-4405-9728-d9bbfc02074b" containerName="route-controller-manager" Mar 20 08:26:43 crc kubenswrapper[4903]: I0320 
08:26:43.839957 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 08:26:43 crc kubenswrapper[4903]: I0320 08:26:43.840134 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 08:26:43 crc kubenswrapper[4903]: I0320 08:26:43.840279 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 08:26:43 crc kubenswrapper[4903]: I0320 08:26:43.840409 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 08:26:43 crc kubenswrapper[4903]: I0320 08:26:43.840511 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 20 08:26:43 crc kubenswrapper[4903]: I0320 08:26:43.840630 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 08:26:43 crc kubenswrapper[4903]: I0320 08:26:43.840739 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 20 08:26:43 crc kubenswrapper[4903]: E0320 08:26:43.840989 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 08:26:43 crc kubenswrapper[4903]: I0320 08:26:43.841116 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 08:26:43 crc kubenswrapper[4903]: I0320 08:26:43.842710 4903 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 20 08:26:43 crc kubenswrapper[4903]: I0320 08:26:43.848217 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 08:26:43 crc kubenswrapper[4903]: I0320 08:26:43.853625 4903 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="f4b27818a5e8e43d0dc095d08835c792" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" Mar 20 08:26:43 crc kubenswrapper[4903]: I0320 08:26:43.953333 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 08:26:43 crc kubenswrapper[4903]: I0320 08:26:43.953451 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 08:26:43 crc kubenswrapper[4903]: I0320 08:26:43.953493 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 08:26:43 crc kubenswrapper[4903]: I0320 08:26:43.953537 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 08:26:43 crc kubenswrapper[4903]: I0320 08:26:43.953585 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 08:26:43 crc kubenswrapper[4903]: I0320 08:26:43.953770 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 08:26:43 crc kubenswrapper[4903]: I0320 08:26:43.953979 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 08:26:43 crc kubenswrapper[4903]: I0320 08:26:43.954236 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: 
\"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 08:26:44 crc kubenswrapper[4903]: I0320 08:26:44.055764 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 08:26:44 crc kubenswrapper[4903]: I0320 08:26:44.055984 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 08:26:44 crc kubenswrapper[4903]: I0320 08:26:44.056222 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 08:26:44 crc kubenswrapper[4903]: I0320 08:26:44.056290 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 08:26:44 crc kubenswrapper[4903]: I0320 08:26:44.056418 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 08:26:44 crc kubenswrapper[4903]: I0320 08:26:44.056514 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 08:26:44 crc kubenswrapper[4903]: I0320 08:26:44.056601 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 08:26:44 crc kubenswrapper[4903]: I0320 08:26:44.056671 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 08:26:44 crc kubenswrapper[4903]: I0320 08:26:44.056601 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 08:26:44 crc kubenswrapper[4903]: I0320 08:26:44.056701 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 08:26:44 crc kubenswrapper[4903]: I0320 08:26:44.056912 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 08:26:44 crc kubenswrapper[4903]: I0320 08:26:44.056988 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 08:26:44 crc kubenswrapper[4903]: I0320 08:26:44.056559 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 08:26:44 crc kubenswrapper[4903]: I0320 08:26:44.057099 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 08:26:44 crc kubenswrapper[4903]: I0320 08:26:44.057133 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 08:26:44 crc kubenswrapper[4903]: I0320 08:26:44.057128 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 08:26:44 crc kubenswrapper[4903]: E0320 08:26:44.119291 4903 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 20 08:26:44 crc kubenswrapper[4903]: E0320 08:26:44.120094 4903 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 20 08:26:44 crc kubenswrapper[4903]: E0320 08:26:44.120676 4903 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 
38.102.83.164:6443: connect: connection refused" Mar 20 08:26:44 crc kubenswrapper[4903]: E0320 08:26:44.121344 4903 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 20 08:26:44 crc kubenswrapper[4903]: E0320 08:26:44.121728 4903 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 20 08:26:44 crc kubenswrapper[4903]: I0320 08:26:44.121781 4903 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Mar 20 08:26:44 crc kubenswrapper[4903]: E0320 08:26:44.122593 4903 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.164:6443: connect: connection refused" interval="200ms" Mar 20 08:26:44 crc kubenswrapper[4903]: E0320 08:26:44.324452 4903 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.164:6443: connect: connection refused" interval="400ms" Mar 20 08:26:44 crc kubenswrapper[4903]: I0320 08:26:44.659728 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/4.log" Mar 20 08:26:44 crc kubenswrapper[4903]: I0320 08:26:44.662025 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 20 08:26:44 crc kubenswrapper[4903]: I0320 08:26:44.663368 4903 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="a9201ff62735663a1211ec23c141098838c105950d0ff00ae94d81e98de98b7b" exitCode=0 Mar 20 08:26:44 crc kubenswrapper[4903]: I0320 08:26:44.663416 4903 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="ff47d5e9db1398c42b35dce1fbcca05073c8e28b5c7187174de7f355065ec374" exitCode=0 Mar 20 08:26:44 crc kubenswrapper[4903]: I0320 08:26:44.663433 4903 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="84dc5fbce1c40b3a5ff4df4082324127ad8c9fb05387581a62eb218551dfdcda" exitCode=0 Mar 20 08:26:44 crc kubenswrapper[4903]: I0320 08:26:44.663447 4903 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="95ed5ac9613b849264d6577a5d37580c9b674adfe07c5d93b5a34251dab97a97" exitCode=2 Mar 20 08:26:44 crc kubenswrapper[4903]: I0320 08:26:44.663560 4903 scope.go:117] "RemoveContainer" containerID="f756aa5d647538ea78b646d483b3c6e7943baac8d492132d04743f5f99f4ddf4" Mar 20 08:26:44 crc kubenswrapper[4903]: I0320 08:26:44.670075 4903 generic.go:334] "Generic (PLEG): container finished" podID="12081c8c-e4be-4b92-8e36-e39afc95015a" containerID="1406d6ad56c694136d708a44b3a0e75399cb8dcb92497f6d42d6e9c8ca47de44" exitCode=0 Mar 20 08:26:44 crc kubenswrapper[4903]: I0320 08:26:44.670103 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"12081c8c-e4be-4b92-8e36-e39afc95015a","Type":"ContainerDied","Data":"1406d6ad56c694136d708a44b3a0e75399cb8dcb92497f6d42d6e9c8ca47de44"} Mar 20 08:26:44 crc kubenswrapper[4903]: I0320 08:26:44.671900 4903 status_manager.go:851] "Failed to get status for pod" podUID="12081c8c-e4be-4b92-8e36-e39afc95015a" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 20 08:26:44 crc kubenswrapper[4903]: E0320 08:26:44.726100 4903 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.164:6443: connect: connection refused" interval="800ms" Mar 20 08:26:45 crc kubenswrapper[4903]: I0320 08:26:45.497635 4903 status_manager.go:851] "Failed to get status for pod" podUID="12081c8c-e4be-4b92-8e36-e39afc95015a" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 20 08:26:45 crc kubenswrapper[4903]: I0320 08:26:45.504591 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="270e0b13-6f9e-4405-9728-d9bbfc02074b" path="/var/lib/kubelet/pods/270e0b13-6f9e-4405-9728-d9bbfc02074b/volumes" Mar 20 08:26:45 crc kubenswrapper[4903]: I0320 08:26:45.505811 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="418955f1-ca02-4a7a-b422-7cb0a69f72d6" path="/var/lib/kubelet/pods/418955f1-ca02-4a7a-b422-7cb0a69f72d6/volumes" Mar 20 08:26:45 crc kubenswrapper[4903]: E0320 08:26:45.528076 4903 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.164:6443: connect: connection refused" interval="1.6s" Mar 20 08:26:45 crc kubenswrapper[4903]: I0320 08:26:45.682576 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 20 08:26:46 crc kubenswrapper[4903]: I0320 08:26:46.076794 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 20 08:26:46 crc kubenswrapper[4903]: I0320 08:26:46.078132 4903 status_manager.go:851] "Failed to get status for pod" podUID="12081c8c-e4be-4b92-8e36-e39afc95015a" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 20 08:26:46 crc kubenswrapper[4903]: I0320 08:26:46.200900 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/12081c8c-e4be-4b92-8e36-e39afc95015a-kube-api-access\") pod \"12081c8c-e4be-4b92-8e36-e39afc95015a\" (UID: \"12081c8c-e4be-4b92-8e36-e39afc95015a\") " Mar 20 08:26:46 crc kubenswrapper[4903]: I0320 08:26:46.202135 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/12081c8c-e4be-4b92-8e36-e39afc95015a-kubelet-dir\") pod \"12081c8c-e4be-4b92-8e36-e39afc95015a\" (UID: \"12081c8c-e4be-4b92-8e36-e39afc95015a\") " Mar 20 08:26:46 crc kubenswrapper[4903]: I0320 08:26:46.202188 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/12081c8c-e4be-4b92-8e36-e39afc95015a-var-lock\") pod \"12081c8c-e4be-4b92-8e36-e39afc95015a\" (UID: \"12081c8c-e4be-4b92-8e36-e39afc95015a\") " Mar 20 08:26:46 crc kubenswrapper[4903]: I0320 08:26:46.202232 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/12081c8c-e4be-4b92-8e36-e39afc95015a-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "12081c8c-e4be-4b92-8e36-e39afc95015a" (UID: "12081c8c-e4be-4b92-8e36-e39afc95015a"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 08:26:46 crc kubenswrapper[4903]: I0320 08:26:46.202361 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/12081c8c-e4be-4b92-8e36-e39afc95015a-var-lock" (OuterVolumeSpecName: "var-lock") pod "12081c8c-e4be-4b92-8e36-e39afc95015a" (UID: "12081c8c-e4be-4b92-8e36-e39afc95015a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 08:26:46 crc kubenswrapper[4903]: I0320 08:26:46.202814 4903 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/12081c8c-e4be-4b92-8e36-e39afc95015a-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 20 08:26:46 crc kubenswrapper[4903]: I0320 08:26:46.202851 4903 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/12081c8c-e4be-4b92-8e36-e39afc95015a-var-lock\") on node \"crc\" DevicePath \"\"" Mar 20 08:26:46 crc kubenswrapper[4903]: I0320 08:26:46.208491 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/12081c8c-e4be-4b92-8e36-e39afc95015a-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "12081c8c-e4be-4b92-8e36-e39afc95015a" (UID: "12081c8c-e4be-4b92-8e36-e39afc95015a"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:26:46 crc kubenswrapper[4903]: I0320 08:26:46.232266 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 20 08:26:46 crc kubenswrapper[4903]: I0320 08:26:46.233407 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 08:26:46 crc kubenswrapper[4903]: I0320 08:26:46.234368 4903 status_manager.go:851] "Failed to get status for pod" podUID="12081c8c-e4be-4b92-8e36-e39afc95015a" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 20 08:26:46 crc kubenswrapper[4903]: I0320 08:26:46.234875 4903 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 20 08:26:46 crc kubenswrapper[4903]: I0320 08:26:46.304632 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/12081c8c-e4be-4b92-8e36-e39afc95015a-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 20 08:26:46 crc kubenswrapper[4903]: I0320 08:26:46.405767 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 20 08:26:46 crc kubenswrapper[4903]: I0320 08:26:46.405902 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 20 08:26:46 crc kubenswrapper[4903]: I0320 08:26:46.405972 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 08:26:46 crc kubenswrapper[4903]: I0320 08:26:46.406007 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 20 08:26:46 crc kubenswrapper[4903]: I0320 08:26:46.406135 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 08:26:46 crc kubenswrapper[4903]: I0320 08:26:46.406188 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 08:26:46 crc kubenswrapper[4903]: I0320 08:26:46.406463 4903 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 20 08:26:46 crc kubenswrapper[4903]: I0320 08:26:46.406494 4903 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Mar 20 08:26:46 crc kubenswrapper[4903]: I0320 08:26:46.406521 4903 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 20 08:26:46 crc kubenswrapper[4903]: I0320 08:26:46.705699 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 20 08:26:46 crc kubenswrapper[4903]: I0320 08:26:46.706463 4903 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="8ef0faf48a64d1f9ab296076561f444dac491f6a937100dc745062799ac14533" exitCode=0 Mar 20 08:26:46 crc kubenswrapper[4903]: I0320 08:26:46.706595 4903 scope.go:117] "RemoveContainer" containerID="a9201ff62735663a1211ec23c141098838c105950d0ff00ae94d81e98de98b7b" Mar 20 08:26:46 crc kubenswrapper[4903]: I0320 08:26:46.706738 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 08:26:46 crc kubenswrapper[4903]: I0320 08:26:46.708655 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"12081c8c-e4be-4b92-8e36-e39afc95015a","Type":"ContainerDied","Data":"1da2af238f1f27766f04d15b81b9d78e04f3946e3f7f83616d1c38b5b448939f"} Mar 20 08:26:46 crc kubenswrapper[4903]: I0320 08:26:46.708710 4903 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1da2af238f1f27766f04d15b81b9d78e04f3946e3f7f83616d1c38b5b448939f" Mar 20 08:26:46 crc kubenswrapper[4903]: I0320 08:26:46.708758 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 20 08:26:46 crc kubenswrapper[4903]: I0320 08:26:46.736319 4903 scope.go:117] "RemoveContainer" containerID="ff47d5e9db1398c42b35dce1fbcca05073c8e28b5c7187174de7f355065ec374" Mar 20 08:26:46 crc kubenswrapper[4903]: I0320 08:26:46.739452 4903 status_manager.go:851] "Failed to get status for pod" podUID="12081c8c-e4be-4b92-8e36-e39afc95015a" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 20 08:26:46 crc kubenswrapper[4903]: I0320 08:26:46.740218 4903 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 20 08:26:46 crc kubenswrapper[4903]: I0320 08:26:46.740849 4903 status_manager.go:851] "Failed to get status for pod" podUID="12081c8c-e4be-4b92-8e36-e39afc95015a" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 20 08:26:46 crc kubenswrapper[4903]: I0320 08:26:46.741266 4903 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 20 08:26:46 crc kubenswrapper[4903]: I0320 08:26:46.764743 4903 scope.go:117] "RemoveContainer" containerID="84dc5fbce1c40b3a5ff4df4082324127ad8c9fb05387581a62eb218551dfdcda" Mar 20 08:26:46 crc kubenswrapper[4903]: I0320 08:26:46.786319 4903 scope.go:117] "RemoveContainer" containerID="95ed5ac9613b849264d6577a5d37580c9b674adfe07c5d93b5a34251dab97a97" Mar 20 08:26:46 crc kubenswrapper[4903]: I0320 08:26:46.809957 4903 scope.go:117] "RemoveContainer" containerID="8ef0faf48a64d1f9ab296076561f444dac491f6a937100dc745062799ac14533" Mar 20 08:26:46 crc kubenswrapper[4903]: I0320 08:26:46.838702 4903 scope.go:117] "RemoveContainer" containerID="17769cf4064c962bbfd92f2b8e377ba2acb97a93410e58e3e9c07f6aabd1ac41" Mar 20 08:26:46 crc kubenswrapper[4903]: I0320 08:26:46.877639 4903 scope.go:117] "RemoveContainer" containerID="a9201ff62735663a1211ec23c141098838c105950d0ff00ae94d81e98de98b7b" Mar 20 08:26:46 crc kubenswrapper[4903]: E0320 08:26:46.878451 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a9201ff62735663a1211ec23c141098838c105950d0ff00ae94d81e98de98b7b\": container with ID starting with a9201ff62735663a1211ec23c141098838c105950d0ff00ae94d81e98de98b7b not found: ID does not exist" containerID="a9201ff62735663a1211ec23c141098838c105950d0ff00ae94d81e98de98b7b" Mar 20 08:26:46 crc kubenswrapper[4903]: I0320 08:26:46.878524 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9201ff62735663a1211ec23c141098838c105950d0ff00ae94d81e98de98b7b"} err="failed to get container status \"a9201ff62735663a1211ec23c141098838c105950d0ff00ae94d81e98de98b7b\": rpc error: code = NotFound desc = could not find container 
\"a9201ff62735663a1211ec23c141098838c105950d0ff00ae94d81e98de98b7b\": container with ID starting with a9201ff62735663a1211ec23c141098838c105950d0ff00ae94d81e98de98b7b not found: ID does not exist" Mar 20 08:26:46 crc kubenswrapper[4903]: I0320 08:26:46.878575 4903 scope.go:117] "RemoveContainer" containerID="ff47d5e9db1398c42b35dce1fbcca05073c8e28b5c7187174de7f355065ec374" Mar 20 08:26:46 crc kubenswrapper[4903]: E0320 08:26:46.879254 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ff47d5e9db1398c42b35dce1fbcca05073c8e28b5c7187174de7f355065ec374\": container with ID starting with ff47d5e9db1398c42b35dce1fbcca05073c8e28b5c7187174de7f355065ec374 not found: ID does not exist" containerID="ff47d5e9db1398c42b35dce1fbcca05073c8e28b5c7187174de7f355065ec374" Mar 20 08:26:46 crc kubenswrapper[4903]: I0320 08:26:46.879318 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff47d5e9db1398c42b35dce1fbcca05073c8e28b5c7187174de7f355065ec374"} err="failed to get container status \"ff47d5e9db1398c42b35dce1fbcca05073c8e28b5c7187174de7f355065ec374\": rpc error: code = NotFound desc = could not find container \"ff47d5e9db1398c42b35dce1fbcca05073c8e28b5c7187174de7f355065ec374\": container with ID starting with ff47d5e9db1398c42b35dce1fbcca05073c8e28b5c7187174de7f355065ec374 not found: ID does not exist" Mar 20 08:26:46 crc kubenswrapper[4903]: I0320 08:26:46.879369 4903 scope.go:117] "RemoveContainer" containerID="84dc5fbce1c40b3a5ff4df4082324127ad8c9fb05387581a62eb218551dfdcda" Mar 20 08:26:46 crc kubenswrapper[4903]: E0320 08:26:46.880083 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"84dc5fbce1c40b3a5ff4df4082324127ad8c9fb05387581a62eb218551dfdcda\": container with ID starting with 84dc5fbce1c40b3a5ff4df4082324127ad8c9fb05387581a62eb218551dfdcda not found: ID does not exist" containerID="84dc5fbce1c40b3a5ff4df4082324127ad8c9fb05387581a62eb218551dfdcda" Mar 20 08:26:46 crc kubenswrapper[4903]: I0320 08:26:46.880133 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84dc5fbce1c40b3a5ff4df4082324127ad8c9fb05387581a62eb218551dfdcda"} err="failed to get container status \"84dc5fbce1c40b3a5ff4df4082324127ad8c9fb05387581a62eb218551dfdcda\": rpc error: code = NotFound desc = could not find container \"84dc5fbce1c40b3a5ff4df4082324127ad8c9fb05387581a62eb218551dfdcda\": container with ID starting with 84dc5fbce1c40b3a5ff4df4082324127ad8c9fb05387581a62eb218551dfdcda not found: ID does not exist" Mar 20 08:26:46 crc kubenswrapper[4903]: I0320 08:26:46.880162 4903 scope.go:117] "RemoveContainer" containerID="95ed5ac9613b849264d6577a5d37580c9b674adfe07c5d93b5a34251dab97a97" Mar 20 08:26:46 crc kubenswrapper[4903]: E0320 08:26:46.881186 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"95ed5ac9613b849264d6577a5d37580c9b674adfe07c5d93b5a34251dab97a97\": container with ID starting with 95ed5ac9613b849264d6577a5d37580c9b674adfe07c5d93b5a34251dab97a97 not found: ID does not exist" containerID="95ed5ac9613b849264d6577a5d37580c9b674adfe07c5d93b5a34251dab97a97" Mar 20 08:26:46 crc kubenswrapper[4903]: I0320 08:26:46.881228 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"95ed5ac9613b849264d6577a5d37580c9b674adfe07c5d93b5a34251dab97a97"} 
err="failed to get container status \"95ed5ac9613b849264d6577a5d37580c9b674adfe07c5d93b5a34251dab97a97\": rpc error: code = NotFound desc = could not find container \"95ed5ac9613b849264d6577a5d37580c9b674adfe07c5d93b5a34251dab97a97\": container with ID starting with 95ed5ac9613b849264d6577a5d37580c9b674adfe07c5d93b5a34251dab97a97 not found: ID does not exist" Mar 20 08:26:46 crc kubenswrapper[4903]: I0320 08:26:46.881253 4903 scope.go:117] "RemoveContainer" containerID="8ef0faf48a64d1f9ab296076561f444dac491f6a937100dc745062799ac14533" Mar 20 08:26:46 crc kubenswrapper[4903]: E0320 08:26:46.881782 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8ef0faf48a64d1f9ab296076561f444dac491f6a937100dc745062799ac14533\": container with ID starting with 8ef0faf48a64d1f9ab296076561f444dac491f6a937100dc745062799ac14533 not found: ID does not exist" containerID="8ef0faf48a64d1f9ab296076561f444dac491f6a937100dc745062799ac14533" Mar 20 08:26:46 crc kubenswrapper[4903]: I0320 08:26:46.881822 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ef0faf48a64d1f9ab296076561f444dac491f6a937100dc745062799ac14533"} err="failed to get container status \"8ef0faf48a64d1f9ab296076561f444dac491f6a937100dc745062799ac14533\": rpc error: code = NotFound desc = could not find container \"8ef0faf48a64d1f9ab296076561f444dac491f6a937100dc745062799ac14533\": container with ID starting with 8ef0faf48a64d1f9ab296076561f444dac491f6a937100dc745062799ac14533 not found: ID does not exist" Mar 20 08:26:46 crc kubenswrapper[4903]: I0320 08:26:46.881847 4903 scope.go:117] "RemoveContainer" containerID="17769cf4064c962bbfd92f2b8e377ba2acb97a93410e58e3e9c07f6aabd1ac41" Mar 20 08:26:46 crc kubenswrapper[4903]: E0320 08:26:46.882508 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"17769cf4064c962bbfd92f2b8e377ba2acb97a93410e58e3e9c07f6aabd1ac41\": container with ID starting with 17769cf4064c962bbfd92f2b8e377ba2acb97a93410e58e3e9c07f6aabd1ac41 not found: ID does not exist" containerID="17769cf4064c962bbfd92f2b8e377ba2acb97a93410e58e3e9c07f6aabd1ac41" Mar 20 08:26:46 crc kubenswrapper[4903]: I0320 08:26:46.882559 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"17769cf4064c962bbfd92f2b8e377ba2acb97a93410e58e3e9c07f6aabd1ac41"} err="failed to get container status \"17769cf4064c962bbfd92f2b8e377ba2acb97a93410e58e3e9c07f6aabd1ac41\": rpc error: code = NotFound desc = could not find container \"17769cf4064c962bbfd92f2b8e377ba2acb97a93410e58e3e9c07f6aabd1ac41\": container with ID starting with 17769cf4064c962bbfd92f2b8e377ba2acb97a93410e58e3e9c07f6aabd1ac41 not found: ID does not exist" Mar 20 08:26:47 crc kubenswrapper[4903]: E0320 08:26:47.129732 4903 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.164:6443: connect: connection refused" interval="3.2s" Mar 20 08:26:47 crc kubenswrapper[4903]: I0320 08:26:47.507475 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Mar 20 08:26:48 crc kubenswrapper[4903]: E0320 08:26:48.883632 4903 kubelet.go:1929] "Failed creating a mirror pod for" err="Post 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.164:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 08:26:48 crc kubenswrapper[4903]: I0320 08:26:48.884424 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 08:26:48 crc kubenswrapper[4903]: E0320 08:26:48.912323 4903 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.164:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189e7f3e545e53ac openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:26:48.910721964 +0000 UTC m=+234.127622279,LastTimestamp:2026-03-20 08:26:48.910721964 +0000 UTC m=+234.127622279,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 08:26:49 crc kubenswrapper[4903]: I0320 08:26:49.738792 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"f227b2c723bd602dcd070ef69f8007a220c4c1e1a3a90653fcfd645df24cc859"} Mar 20 08:26:49 crc kubenswrapper[4903]: I0320 08:26:49.739393 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"020f692cfbd90abd6fb1fa5ecd87c2a22d78d3cf3a59e6a3dd150ad6906829e9"} Mar 20 08:26:49 crc kubenswrapper[4903]: I0320 08:26:49.740928 4903 status_manager.go:851] "Failed to get status for pod" podUID="12081c8c-e4be-4b92-8e36-e39afc95015a" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 20 08:26:49 crc kubenswrapper[4903]: E0320 08:26:49.740924 4903 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.164:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 08:26:50 crc kubenswrapper[4903]: E0320 08:26:50.331244 4903 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.164:6443: connect: connection refused" interval="6.4s" Mar 20 08:26:55 crc kubenswrapper[4903]: I0320 08:26:55.497627 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 08:26:55 crc kubenswrapper[4903]: I0320 08:26:55.497735 4903 status_manager.go:851] "Failed to get status for pod" podUID="12081c8c-e4be-4b92-8e36-e39afc95015a" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 20 08:26:55 crc kubenswrapper[4903]: I0320 08:26:55.500111 4903 status_manager.go:851] "Failed to get status for pod" podUID="12081c8c-e4be-4b92-8e36-e39afc95015a" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 20 08:26:55 crc kubenswrapper[4903]: I0320 08:26:55.520740 4903 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e53791c9-7f9f-4ce5-8c13-29786721b9e7" Mar 20 08:26:55 crc kubenswrapper[4903]: I0320 08:26:55.520801 4903 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e53791c9-7f9f-4ce5-8c13-29786721b9e7" Mar 20 08:26:55 crc kubenswrapper[4903]: E0320 08:26:55.521667 4903 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.164:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 08:26:55 crc kubenswrapper[4903]: I0320 08:26:55.522932 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 08:26:55 crc kubenswrapper[4903]: W0320 08:26:55.561202 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71bb4a3aecc4ba5b26c4b7318770ce13.slice/crio-8877356a004518748544cbd3d26942dc0adea45726b1a4b6b671c4c6ca3f916a WatchSource:0}: Error finding container 8877356a004518748544cbd3d26942dc0adea45726b1a4b6b671c4c6ca3f916a: Status 404 returned error can't find the container with id 8877356a004518748544cbd3d26942dc0adea45726b1a4b6b671c4c6ca3f916a Mar 20 08:26:55 crc kubenswrapper[4903]: E0320 08:26:55.582308 4903 desired_state_of_world_populator.go:312] "Error processing volume" err="error processing PVC openshift-image-registry/crc-image-registry-storage: failed to fetch PVC from API server: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/persistentvolumeclaims/crc-image-registry-storage\": dial tcp 38.102.83.164:6443: connect: connection refused" pod="openshift-image-registry/image-registry-697d97f7c8-2q9m5" volumeName="registry-storage" Mar 20 08:26:55 crc kubenswrapper[4903]: I0320 08:26:55.789308 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"8877356a004518748544cbd3d26942dc0adea45726b1a4b6b671c4c6ca3f916a"} Mar 20 08:26:56 crc kubenswrapper[4903]: E0320 08:26:56.733329 4903 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.164:6443: connect: connection refused" interval="7s" Mar 20 08:26:56 crc kubenswrapper[4903]: I0320 
08:26:56.796276 4903 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="d2b8801783400e9e09580d2fcd3b2807fab67eb842eb15bbbf329d80636c7945" exitCode=0 Mar 20 08:26:56 crc kubenswrapper[4903]: I0320 08:26:56.796370 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"d2b8801783400e9e09580d2fcd3b2807fab67eb842eb15bbbf329d80636c7945"} Mar 20 08:26:56 crc kubenswrapper[4903]: I0320 08:26:56.797421 4903 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e53791c9-7f9f-4ce5-8c13-29786721b9e7" Mar 20 08:26:56 crc kubenswrapper[4903]: I0320 08:26:56.797452 4903 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e53791c9-7f9f-4ce5-8c13-29786721b9e7" Mar 20 08:26:56 crc kubenswrapper[4903]: I0320 08:26:56.797972 4903 status_manager.go:851] "Failed to get status for pod" podUID="12081c8c-e4be-4b92-8e36-e39afc95015a" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 20 08:26:56 crc kubenswrapper[4903]: E0320 08:26:56.798181 4903 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.164:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 08:26:56 crc kubenswrapper[4903]: I0320 08:26:56.802015 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log" Mar 20 08:26:56 crc kubenswrapper[4903]: I0320 08:26:56.803494 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 20 08:26:56 crc kubenswrapper[4903]: I0320 08:26:56.803543 4903 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="62d8b4b3660cba4aedff2babcee4801e661d002a87053e009eab4d967a7a8746" exitCode=1 Mar 20 08:26:56 crc kubenswrapper[4903]: I0320 08:26:56.803583 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"62d8b4b3660cba4aedff2babcee4801e661d002a87053e009eab4d967a7a8746"} Mar 20 08:26:56 crc kubenswrapper[4903]: I0320 08:26:56.803981 4903 scope.go:117] "RemoveContainer" containerID="62d8b4b3660cba4aedff2babcee4801e661d002a87053e009eab4d967a7a8746" Mar 20 08:26:56 crc kubenswrapper[4903]: I0320 08:26:56.804403 4903 status_manager.go:851] "Failed to get status for pod" podUID="12081c8c-e4be-4b92-8e36-e39afc95015a" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 20 08:26:56 crc kubenswrapper[4903]: I0320 08:26:56.804823 4903 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.164:6443: connect: connection refused" Mar 20 08:26:57 crc kubenswrapper[4903]: I0320 08:26:57.823646 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log" Mar 20 08:26:57 crc kubenswrapper[4903]: I0320 08:26:57.828169 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 20 08:26:57 crc kubenswrapper[4903]: I0320 08:26:57.828296 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"de6866c31811efb15f8ad4fe233554a02aaad9f849c7d3ab2ea50a258223e105"} Mar 20 08:26:57 crc kubenswrapper[4903]: I0320 08:26:57.831497 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"7c7dbdc1b91ffab0314d0ec9771837a0476ffbc00de4a0e2324e16c76fdbc927"} Mar 20 08:26:57 crc kubenswrapper[4903]: I0320 08:26:57.831526 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"256a2b1a56eb0422bfb8e477c3b97dc0d5fc5aabfc0c90ad7a4351aca027ffb1"} Mar 20 08:26:57 crc kubenswrapper[4903]: I0320 08:26:57.831538 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"a6699c50896ef50d6345615d27dd6c7263e62d26086c52278900e091ddbda512"} Mar 20 08:26:58 crc kubenswrapper[4903]: I0320 08:26:58.826960 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 08:26:58 crc kubenswrapper[4903]: I0320 08:26:58.848576 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"3471c0beb86b4664a6d27b74a29254e9b37aa9124f0c5d1e1c81e518054c70e4"} Mar 20 08:26:58 crc kubenswrapper[4903]: I0320 08:26:58.849154 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"fb793a2a2024b1488b93052af5fc43c988bb0f1db73527d8a4382ecaa1da2cc6"} Mar 20 08:26:58 crc kubenswrapper[4903]: I0320 08:26:58.849441 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 08:26:58 crc kubenswrapper[4903]: I0320 08:26:58.848680 4903 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e53791c9-7f9f-4ce5-8c13-29786721b9e7" Mar 20 08:26:58 crc kubenswrapper[4903]: I0320 08:26:58.849873 4903 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e53791c9-7f9f-4ce5-8c13-29786721b9e7" Mar 20 08:27:00 crc kubenswrapper[4903]: I0320 08:27:00.523121 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 08:27:00 crc kubenswrapper[4903]: I0320 08:27:00.524487 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 08:27:00 crc kubenswrapper[4903]: I0320 08:27:00.533865 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 08:27:03 crc kubenswrapper[4903]: I0320 08:27:03.693635 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 08:27:03 crc kubenswrapper[4903]: I0320 08:27:03.701080 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 08:27:03 crc kubenswrapper[4903]: I0320 08:27:03.863515 4903 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 08:27:03 crc kubenswrapper[4903]: I0320 08:27:03.908146 4903 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e53791c9-7f9f-4ce5-8c13-29786721b9e7" Mar 20 08:27:03 crc kubenswrapper[4903]: I0320 08:27:03.908205 4903 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e53791c9-7f9f-4ce5-8c13-29786721b9e7" Mar 20 08:27:03 crc kubenswrapper[4903]: I0320 08:27:03.916575 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 08:27:04 crc kubenswrapper[4903]: I0320 08:27:04.912902 4903 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e53791c9-7f9f-4ce5-8c13-29786721b9e7" Mar 20 08:27:04 crc kubenswrapper[4903]: I0320 08:27:04.913327 4903 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e53791c9-7f9f-4ce5-8c13-29786721b9e7" Mar 20 08:27:05 crc kubenswrapper[4903]: I0320 08:27:05.515273 4903 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="e52d3ee1-c7a2-4050-8cd1-567404f7e694" Mar 20 08:27:08 crc kubenswrapper[4903]: I0320 08:27:08.834357 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 08:27:14 crc kubenswrapper[4903]: I0320 08:27:14.490740 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Mar 20 08:27:14 crc kubenswrapper[4903]: I0320 08:27:14.784859 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Mar 20 08:27:15 crc kubenswrapper[4903]: I0320 08:27:15.002574 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Mar 20 08:27:15 crc kubenswrapper[4903]: I0320 08:27:15.179842 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 20 08:27:15 crc kubenswrapper[4903]: I0320 08:27:15.245327 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 20 08:27:15 crc kubenswrapper[4903]: I0320 08:27:15.250779 4903 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 20 08:27:15 crc kubenswrapper[4903]: I0320 08:27:15.488412 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 20 08:27:15 crc kubenswrapper[4903]: I0320 08:27:15.890413 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Mar 20 08:27:15 crc kubenswrapper[4903]: I0320 08:27:15.892706 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Mar 20 08:27:15 crc kubenswrapper[4903]: I0320 08:27:15.995711 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Mar 20 08:27:16 crc kubenswrapper[4903]: I0320 08:27:16.146445 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 20 08:27:16 crc kubenswrapper[4903]: I0320 08:27:16.226910 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 20 08:27:16 crc kubenswrapper[4903]: I0320 08:27:16.299134 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Mar 20 08:27:16 crc kubenswrapper[4903]: I0320 08:27:16.358591 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 20 08:27:16 crc kubenswrapper[4903]: I0320 08:27:16.471732 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 20 08:27:16 crc kubenswrapper[4903]: I0320 08:27:16.728382 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Mar 20 08:27:16 crc kubenswrapper[4903]: I0320 08:27:16.739786 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 20 08:27:16 crc kubenswrapper[4903]: I0320 08:27:16.775794 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 20 08:27:16 crc kubenswrapper[4903]: I0320 08:27:16.815595 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Mar 20 08:27:16 crc kubenswrapper[4903]: I0320 08:27:16.944831 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Mar 20 08:27:17 crc kubenswrapper[4903]: I0320 08:27:17.262601 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 20 08:27:17 crc kubenswrapper[4903]: I0320 08:27:17.381280 4903 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 20 08:27:17 crc kubenswrapper[4903]: I0320 08:27:17.391025 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 20 08:27:17 crc kubenswrapper[4903]: I0320 08:27:17.391174 4903 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-controller-manager/controller-manager-7c657cbc6d-8275s","openshift-route-controller-manager/route-controller-manager-56cccdf6d5-zr8sw","openshift-kube-apiserver/kube-apiserver-crc"] Mar 20 08:27:17 crc kubenswrapper[4903]: E0320 08:27:17.391559 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12081c8c-e4be-4b92-8e36-e39afc95015a" containerName="installer" Mar 20 08:27:17 crc kubenswrapper[4903]: I0320 08:27:17.391598 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="12081c8c-e4be-4b92-8e36-e39afc95015a" containerName="installer" Mar 20 08:27:17 crc kubenswrapper[4903]: I0320 08:27:17.391786 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="12081c8c-e4be-4b92-8e36-e39afc95015a" containerName="installer" Mar 20 08:27:17 crc kubenswrapper[4903]: I0320 08:27:17.392396 4903 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e53791c9-7f9f-4ce5-8c13-29786721b9e7" Mar 20 08:27:17 crc kubenswrapper[4903]: I0320 08:27:17.392579 4903 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e53791c9-7f9f-4ce5-8c13-29786721b9e7" Mar 20 08:27:17 crc kubenswrapper[4903]: I0320 08:27:17.393462 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7c657cbc6d-8275s" Mar 20 08:27:17 crc kubenswrapper[4903]: I0320 08:27:17.395146 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-56cccdf6d5-zr8sw" Mar 20 08:27:17 crc kubenswrapper[4903]: I0320 08:27:17.398727 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 08:27:17 crc kubenswrapper[4903]: I0320 08:27:17.399525 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 20 08:27:17 crc kubenswrapper[4903]: I0320 08:27:17.403948 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 20 08:27:17 crc kubenswrapper[4903]: I0320 08:27:17.404063 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 20 08:27:17 crc kubenswrapper[4903]: I0320 08:27:17.404507 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 20 08:27:17 crc kubenswrapper[4903]: I0320 08:27:17.404705 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 20 08:27:17 crc kubenswrapper[4903]: I0320 08:27:17.408221 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 20 08:27:17 crc kubenswrapper[4903]: I0320 08:27:17.408541 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 20 08:27:17 crc kubenswrapper[4903]: I0320 08:27:17.408834 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 20 08:27:17 crc kubenswrapper[4903]: I0320 08:27:17.409132 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 20 08:27:17 crc kubenswrapper[4903]: I0320 08:27:17.410263 
4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 20 08:27:17 crc kubenswrapper[4903]: I0320 08:27:17.410507 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 20 08:27:17 crc kubenswrapper[4903]: I0320 08:27:17.410840 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 20 08:27:17 crc kubenswrapper[4903]: I0320 08:27:17.420602 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 20 08:27:17 crc kubenswrapper[4903]: I0320 08:27:17.448202 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 20 08:27:17 crc kubenswrapper[4903]: I0320 08:27:17.474263 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=14.474175761 podStartE2EDuration="14.474175761s" podCreationTimestamp="2026-03-20 08:27:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:27:17.472770854 +0000 UTC m=+262.689671209" watchObservedRunningTime="2026-03-20 08:27:17.474175761 +0000 UTC m=+262.691076106" Mar 20 08:27:17 crc kubenswrapper[4903]: I0320 08:27:17.534210 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6gl26\" (UniqueName: \"kubernetes.io/projected/c9c27545-75ba-4bce-b870-96b3c4050114-kube-api-access-6gl26\") pod \"route-controller-manager-56cccdf6d5-zr8sw\" (UID: \"c9c27545-75ba-4bce-b870-96b3c4050114\") " pod="openshift-route-controller-manager/route-controller-manager-56cccdf6d5-zr8sw" Mar 20 08:27:17 crc kubenswrapper[4903]: I0320 08:27:17.534804 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q486d\" (UniqueName: \"kubernetes.io/projected/cf764085-fbaa-42f8-b8ab-960398ad5c25-kube-api-access-q486d\") pod \"controller-manager-7c657cbc6d-8275s\" (UID: \"cf764085-fbaa-42f8-b8ab-960398ad5c25\") " pod="openshift-controller-manager/controller-manager-7c657cbc6d-8275s" Mar 20 08:27:17 crc kubenswrapper[4903]: I0320 08:27:17.535121 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cf764085-fbaa-42f8-b8ab-960398ad5c25-client-ca\") pod \"controller-manager-7c657cbc6d-8275s\" (UID: \"cf764085-fbaa-42f8-b8ab-960398ad5c25\") " pod="openshift-controller-manager/controller-manager-7c657cbc6d-8275s" Mar 20 08:27:17 crc kubenswrapper[4903]: I0320 08:27:17.535308 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/cf764085-fbaa-42f8-b8ab-960398ad5c25-proxy-ca-bundles\") pod \"controller-manager-7c657cbc6d-8275s\" (UID: \"cf764085-fbaa-42f8-b8ab-960398ad5c25\") " pod="openshift-controller-manager/controller-manager-7c657cbc6d-8275s" Mar 20 08:27:17 crc kubenswrapper[4903]: I0320 08:27:17.535499 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c9c27545-75ba-4bce-b870-96b3c4050114-config\") pod 
\"route-controller-manager-56cccdf6d5-zr8sw\" (UID: \"c9c27545-75ba-4bce-b870-96b3c4050114\") " pod="openshift-route-controller-manager/route-controller-manager-56cccdf6d5-zr8sw" Mar 20 08:27:17 crc kubenswrapper[4903]: I0320 08:27:17.536873 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c9c27545-75ba-4bce-b870-96b3c4050114-client-ca\") pod \"route-controller-manager-56cccdf6d5-zr8sw\" (UID: \"c9c27545-75ba-4bce-b870-96b3c4050114\") " pod="openshift-route-controller-manager/route-controller-manager-56cccdf6d5-zr8sw" Mar 20 08:27:17 crc kubenswrapper[4903]: I0320 08:27:17.537009 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c9c27545-75ba-4bce-b870-96b3c4050114-serving-cert\") pod \"route-controller-manager-56cccdf6d5-zr8sw\" (UID: \"c9c27545-75ba-4bce-b870-96b3c4050114\") " pod="openshift-route-controller-manager/route-controller-manager-56cccdf6d5-zr8sw" Mar 20 08:27:17 crc kubenswrapper[4903]: I0320 08:27:17.537179 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cf764085-fbaa-42f8-b8ab-960398ad5c25-serving-cert\") pod \"controller-manager-7c657cbc6d-8275s\" (UID: \"cf764085-fbaa-42f8-b8ab-960398ad5c25\") " pod="openshift-controller-manager/controller-manager-7c657cbc6d-8275s" Mar 20 08:27:17 crc kubenswrapper[4903]: I0320 08:27:17.537229 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf764085-fbaa-42f8-b8ab-960398ad5c25-config\") pod \"controller-manager-7c657cbc6d-8275s\" (UID: \"cf764085-fbaa-42f8-b8ab-960398ad5c25\") " pod="openshift-controller-manager/controller-manager-7c657cbc6d-8275s" Mar 20 08:27:17 crc kubenswrapper[4903]: I0320 08:27:17.589500 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Mar 20 08:27:17 crc kubenswrapper[4903]: I0320 08:27:17.619581 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 20 08:27:17 crc kubenswrapper[4903]: I0320 08:27:17.638395 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cf764085-fbaa-42f8-b8ab-960398ad5c25-client-ca\") pod \"controller-manager-7c657cbc6d-8275s\" (UID: \"cf764085-fbaa-42f8-b8ab-960398ad5c25\") " pod="openshift-controller-manager/controller-manager-7c657cbc6d-8275s" Mar 20 08:27:17 crc kubenswrapper[4903]: I0320 08:27:17.638448 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/cf764085-fbaa-42f8-b8ab-960398ad5c25-proxy-ca-bundles\") pod \"controller-manager-7c657cbc6d-8275s\" (UID: \"cf764085-fbaa-42f8-b8ab-960398ad5c25\") " pod="openshift-controller-manager/controller-manager-7c657cbc6d-8275s" Mar 20 08:27:17 crc kubenswrapper[4903]: I0320 08:27:17.638476 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c9c27545-75ba-4bce-b870-96b3c4050114-config\") pod \"route-controller-manager-56cccdf6d5-zr8sw\" (UID: \"c9c27545-75ba-4bce-b870-96b3c4050114\") " 
pod="openshift-route-controller-manager/route-controller-manager-56cccdf6d5-zr8sw" Mar 20 08:27:17 crc kubenswrapper[4903]: I0320 08:27:17.638509 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c9c27545-75ba-4bce-b870-96b3c4050114-client-ca\") pod \"route-controller-manager-56cccdf6d5-zr8sw\" (UID: \"c9c27545-75ba-4bce-b870-96b3c4050114\") " pod="openshift-route-controller-manager/route-controller-manager-56cccdf6d5-zr8sw" Mar 20 08:27:17 crc kubenswrapper[4903]: I0320 08:27:17.638558 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c9c27545-75ba-4bce-b870-96b3c4050114-serving-cert\") pod \"route-controller-manager-56cccdf6d5-zr8sw\" (UID: \"c9c27545-75ba-4bce-b870-96b3c4050114\") " pod="openshift-route-controller-manager/route-controller-manager-56cccdf6d5-zr8sw" Mar 20 08:27:17 crc kubenswrapper[4903]: I0320 08:27:17.638582 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cf764085-fbaa-42f8-b8ab-960398ad5c25-serving-cert\") pod \"controller-manager-7c657cbc6d-8275s\" (UID: \"cf764085-fbaa-42f8-b8ab-960398ad5c25\") " pod="openshift-controller-manager/controller-manager-7c657cbc6d-8275s" Mar 20 08:27:17 crc kubenswrapper[4903]: I0320 08:27:17.638607 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf764085-fbaa-42f8-b8ab-960398ad5c25-config\") pod \"controller-manager-7c657cbc6d-8275s\" (UID: \"cf764085-fbaa-42f8-b8ab-960398ad5c25\") " pod="openshift-controller-manager/controller-manager-7c657cbc6d-8275s" Mar 20 08:27:17 crc kubenswrapper[4903]: I0320 08:27:17.638648 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6gl26\" (UniqueName: \"kubernetes.io/projected/c9c27545-75ba-4bce-b870-96b3c4050114-kube-api-access-6gl26\") pod \"route-controller-manager-56cccdf6d5-zr8sw\" (UID: \"c9c27545-75ba-4bce-b870-96b3c4050114\") " pod="openshift-route-controller-manager/route-controller-manager-56cccdf6d5-zr8sw" Mar 20 08:27:17 crc kubenswrapper[4903]: I0320 08:27:17.638674 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q486d\" (UniqueName: \"kubernetes.io/projected/cf764085-fbaa-42f8-b8ab-960398ad5c25-kube-api-access-q486d\") pod \"controller-manager-7c657cbc6d-8275s\" (UID: \"cf764085-fbaa-42f8-b8ab-960398ad5c25\") " pod="openshift-controller-manager/controller-manager-7c657cbc6d-8275s" Mar 20 08:27:17 crc kubenswrapper[4903]: I0320 08:27:17.640582 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cf764085-fbaa-42f8-b8ab-960398ad5c25-client-ca\") pod \"controller-manager-7c657cbc6d-8275s\" (UID: \"cf764085-fbaa-42f8-b8ab-960398ad5c25\") " pod="openshift-controller-manager/controller-manager-7c657cbc6d-8275s" Mar 20 08:27:17 crc kubenswrapper[4903]: I0320 08:27:17.640956 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c9c27545-75ba-4bce-b870-96b3c4050114-client-ca\") pod \"route-controller-manager-56cccdf6d5-zr8sw\" (UID: \"c9c27545-75ba-4bce-b870-96b3c4050114\") " pod="openshift-route-controller-manager/route-controller-manager-56cccdf6d5-zr8sw" Mar 20 08:27:17 crc kubenswrapper[4903]: I0320 08:27:17.641277 4903 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/cf764085-fbaa-42f8-b8ab-960398ad5c25-proxy-ca-bundles\") pod \"controller-manager-7c657cbc6d-8275s\" (UID: \"cf764085-fbaa-42f8-b8ab-960398ad5c25\") " pod="openshift-controller-manager/controller-manager-7c657cbc6d-8275s" Mar 20 08:27:17 crc kubenswrapper[4903]: I0320 08:27:17.641891 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf764085-fbaa-42f8-b8ab-960398ad5c25-config\") pod \"controller-manager-7c657cbc6d-8275s\" (UID: \"cf764085-fbaa-42f8-b8ab-960398ad5c25\") " pod="openshift-controller-manager/controller-manager-7c657cbc6d-8275s" Mar 20 08:27:17 crc kubenswrapper[4903]: I0320 08:27:17.643306 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c9c27545-75ba-4bce-b870-96b3c4050114-config\") pod \"route-controller-manager-56cccdf6d5-zr8sw\" (UID: \"c9c27545-75ba-4bce-b870-96b3c4050114\") " pod="openshift-route-controller-manager/route-controller-manager-56cccdf6d5-zr8sw" Mar 20 08:27:17 crc kubenswrapper[4903]: I0320 08:27:17.650062 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c9c27545-75ba-4bce-b870-96b3c4050114-serving-cert\") pod \"route-controller-manager-56cccdf6d5-zr8sw\" (UID: \"c9c27545-75ba-4bce-b870-96b3c4050114\") " pod="openshift-route-controller-manager/route-controller-manager-56cccdf6d5-zr8sw" Mar 20 08:27:17 crc kubenswrapper[4903]: I0320 08:27:17.654390 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cf764085-fbaa-42f8-b8ab-960398ad5c25-serving-cert\") pod \"controller-manager-7c657cbc6d-8275s\" (UID: \"cf764085-fbaa-42f8-b8ab-960398ad5c25\") " pod="openshift-controller-manager/controller-manager-7c657cbc6d-8275s" Mar 20 08:27:17 crc kubenswrapper[4903]: I0320 08:27:17.675657 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q486d\" (UniqueName: \"kubernetes.io/projected/cf764085-fbaa-42f8-b8ab-960398ad5c25-kube-api-access-q486d\") pod \"controller-manager-7c657cbc6d-8275s\" (UID: \"cf764085-fbaa-42f8-b8ab-960398ad5c25\") " pod="openshift-controller-manager/controller-manager-7c657cbc6d-8275s" Mar 20 08:27:17 crc kubenswrapper[4903]: I0320 08:27:17.677330 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6gl26\" (UniqueName: \"kubernetes.io/projected/c9c27545-75ba-4bce-b870-96b3c4050114-kube-api-access-6gl26\") pod \"route-controller-manager-56cccdf6d5-zr8sw\" (UID: \"c9c27545-75ba-4bce-b870-96b3c4050114\") " pod="openshift-route-controller-manager/route-controller-manager-56cccdf6d5-zr8sw" Mar 20 08:27:17 crc kubenswrapper[4903]: I0320 08:27:17.724941 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 20 08:27:17 crc kubenswrapper[4903]: I0320 08:27:17.734256 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7c657cbc6d-8275s" Mar 20 08:27:17 crc kubenswrapper[4903]: I0320 08:27:17.751485 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-56cccdf6d5-zr8sw" Mar 20 08:27:17 crc kubenswrapper[4903]: I0320 08:27:17.811815 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Mar 20 08:27:18 crc kubenswrapper[4903]: I0320 08:27:18.074372 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 20 08:27:18 crc kubenswrapper[4903]: I0320 08:27:18.097701 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 20 08:27:18 crc kubenswrapper[4903]: I0320 08:27:18.183094 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 20 08:27:18 crc kubenswrapper[4903]: I0320 08:27:18.263306 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 20 08:27:18 crc kubenswrapper[4903]: I0320 08:27:18.287940 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Mar 20 08:27:18 crc kubenswrapper[4903]: I0320 08:27:18.397411 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 20 08:27:18 crc kubenswrapper[4903]: I0320 08:27:18.635649 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 20 08:27:18 crc kubenswrapper[4903]: I0320 08:27:18.676775 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 20 08:27:18 crc kubenswrapper[4903]: I0320 08:27:18.685258 4903 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Mar 20 08:27:18 crc kubenswrapper[4903]: I0320 08:27:18.757108 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 20 08:27:18 crc kubenswrapper[4903]: I0320 08:27:18.760067 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 20 08:27:18 crc kubenswrapper[4903]: I0320 08:27:18.798888 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Mar 20 08:27:18 crc kubenswrapper[4903]: I0320 08:27:18.860479 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 20 08:27:19 crc kubenswrapper[4903]: I0320 08:27:19.031397 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Mar 20 08:27:19 crc kubenswrapper[4903]: I0320 08:27:19.098102 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Mar 20 08:27:19 crc kubenswrapper[4903]: I0320 08:27:19.165863 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Mar 20 08:27:19 crc kubenswrapper[4903]: I0320 08:27:19.197199 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Mar 20 08:27:19 crc 
kubenswrapper[4903]: I0320 08:27:19.226749 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 20 08:27:19 crc kubenswrapper[4903]: I0320 08:27:19.288188 4903 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 20 08:27:19 crc kubenswrapper[4903]: I0320 08:27:19.368898 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Mar 20 08:27:19 crc kubenswrapper[4903]: I0320 08:27:19.410290 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Mar 20 08:27:19 crc kubenswrapper[4903]: I0320 08:27:19.489952 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 20 08:27:19 crc kubenswrapper[4903]: I0320 08:27:19.519840 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Mar 20 08:27:19 crc kubenswrapper[4903]: I0320 08:27:19.534295 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Mar 20 08:27:19 crc kubenswrapper[4903]: I0320 08:27:19.554346 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 20 08:27:19 crc kubenswrapper[4903]: I0320 08:27:19.597840 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 20 08:27:19 crc kubenswrapper[4903]: I0320 08:27:19.601658 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 20 08:27:19 crc kubenswrapper[4903]: I0320 08:27:19.728895 4903 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 20 08:27:19 crc kubenswrapper[4903]: I0320 08:27:19.755153 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 20 08:27:19 crc kubenswrapper[4903]: I0320 08:27:19.814950 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Mar 20 08:27:19 crc kubenswrapper[4903]: I0320 08:27:19.833754 4903 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 20 08:27:19 crc kubenswrapper[4903]: I0320 08:27:19.998193 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Mar 20 08:27:20 crc kubenswrapper[4903]: I0320 08:27:20.019601 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Mar 20 08:27:20 crc kubenswrapper[4903]: I0320 08:27:20.088177 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Mar 20 08:27:20 crc kubenswrapper[4903]: I0320 08:27:20.130273 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Mar 20 08:27:20 crc kubenswrapper[4903]: I0320 08:27:20.299420 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Mar 20 08:27:20 crc kubenswrapper[4903]: I0320 08:27:20.321537 4903 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 20 08:27:20 crc kubenswrapper[4903]: I0320 08:27:20.368216 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 20 08:27:20 crc kubenswrapper[4903]: I0320 08:27:20.470513 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 20 08:27:20 crc kubenswrapper[4903]: I0320 08:27:20.494235 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 20 08:27:20 crc kubenswrapper[4903]: I0320 08:27:20.511786 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 20 08:27:20 crc kubenswrapper[4903]: I0320 08:27:20.519976 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 20 08:27:20 crc kubenswrapper[4903]: I0320 08:27:20.609887 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 20 08:27:20 crc kubenswrapper[4903]: I0320 08:27:20.704683 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Mar 20 08:27:20 crc kubenswrapper[4903]: I0320 08:27:20.706869 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Mar 20 08:27:20 crc kubenswrapper[4903]: I0320 08:27:20.785416 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 20 08:27:20 crc kubenswrapper[4903]: I0320 08:27:20.795375 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 20 08:27:20 crc kubenswrapper[4903]: E0320 08:27:20.818446 4903 log.go:32] "RunPodSandbox from runtime service failed" err=< Mar 20 08:27:20 crc kubenswrapper[4903]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_route-controller-manager-56cccdf6d5-zr8sw_openshift-route-controller-manager_c9c27545-75ba-4bce-b870-96b3c4050114_0(146d9819264ffcf22bdac9ce176dbfc22965756c7316d187d5652f05adc9b997): error adding pod openshift-route-controller-manager_route-controller-manager-56cccdf6d5-zr8sw to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"146d9819264ffcf22bdac9ce176dbfc22965756c7316d187d5652f05adc9b997" Netns:"/var/run/netns/2501da54-af57-41ba-86d0-5d9e48aaa2b7" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-route-controller-manager;K8S_POD_NAME=route-controller-manager-56cccdf6d5-zr8sw;K8S_POD_INFRA_CONTAINER_ID=146d9819264ffcf22bdac9ce176dbfc22965756c7316d187d5652f05adc9b997;K8S_POD_UID=c9c27545-75ba-4bce-b870-96b3c4050114" Path:"" ERRORED: error configuring pod [openshift-route-controller-manager/route-controller-manager-56cccdf6d5-zr8sw] networking: Multus: [openshift-route-controller-manager/route-controller-manager-56cccdf6d5-zr8sw/c9c27545-75ba-4bce-b870-96b3c4050114]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod route-controller-manager-56cccdf6d5-zr8sw 
in out of cluster comm: pod "route-controller-manager-56cccdf6d5-zr8sw" not found Mar 20 08:27:20 crc kubenswrapper[4903]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 20 08:27:20 crc kubenswrapper[4903]: > Mar 20 08:27:20 crc kubenswrapper[4903]: E0320 08:27:20.818552 4903 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Mar 20 08:27:20 crc kubenswrapper[4903]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_route-controller-manager-56cccdf6d5-zr8sw_openshift-route-controller-manager_c9c27545-75ba-4bce-b870-96b3c4050114_0(146d9819264ffcf22bdac9ce176dbfc22965756c7316d187d5652f05adc9b997): error adding pod openshift-route-controller-manager_route-controller-manager-56cccdf6d5-zr8sw to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"146d9819264ffcf22bdac9ce176dbfc22965756c7316d187d5652f05adc9b997" Netns:"/var/run/netns/2501da54-af57-41ba-86d0-5d9e48aaa2b7" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-route-controller-manager;K8S_POD_NAME=route-controller-manager-56cccdf6d5-zr8sw;K8S_POD_INFRA_CONTAINER_ID=146d9819264ffcf22bdac9ce176dbfc22965756c7316d187d5652f05adc9b997;K8S_POD_UID=c9c27545-75ba-4bce-b870-96b3c4050114" Path:"" ERRORED: error configuring pod [openshift-route-controller-manager/route-controller-manager-56cccdf6d5-zr8sw] networking: Multus: [openshift-route-controller-manager/route-controller-manager-56cccdf6d5-zr8sw/c9c27545-75ba-4bce-b870-96b3c4050114]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod route-controller-manager-56cccdf6d5-zr8sw in out of cluster comm: pod "route-controller-manager-56cccdf6d5-zr8sw" not found Mar 20 08:27:20 crc kubenswrapper[4903]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 20 08:27:20 crc kubenswrapper[4903]: > pod="openshift-route-controller-manager/route-controller-manager-56cccdf6d5-zr8sw" Mar 20 08:27:20 crc kubenswrapper[4903]: E0320 08:27:20.818577 4903 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err=< Mar 20 08:27:20 crc kubenswrapper[4903]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_route-controller-manager-56cccdf6d5-zr8sw_openshift-route-controller-manager_c9c27545-75ba-4bce-b870-96b3c4050114_0(146d9819264ffcf22bdac9ce176dbfc22965756c7316d187d5652f05adc9b997): error adding pod openshift-route-controller-manager_route-controller-manager-56cccdf6d5-zr8sw to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"146d9819264ffcf22bdac9ce176dbfc22965756c7316d187d5652f05adc9b997" Netns:"/var/run/netns/2501da54-af57-41ba-86d0-5d9e48aaa2b7" IfName:"eth0" 
Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-route-controller-manager;K8S_POD_NAME=route-controller-manager-56cccdf6d5-zr8sw;K8S_POD_INFRA_CONTAINER_ID=146d9819264ffcf22bdac9ce176dbfc22965756c7316d187d5652f05adc9b997;K8S_POD_UID=c9c27545-75ba-4bce-b870-96b3c4050114" Path:"" ERRORED: error configuring pod [openshift-route-controller-manager/route-controller-manager-56cccdf6d5-zr8sw] networking: Multus: [openshift-route-controller-manager/route-controller-manager-56cccdf6d5-zr8sw/c9c27545-75ba-4bce-b870-96b3c4050114]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod route-controller-manager-56cccdf6d5-zr8sw in out of cluster comm: pod "route-controller-manager-56cccdf6d5-zr8sw" not found Mar 20 08:27:20 crc kubenswrapper[4903]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 20 08:27:20 crc kubenswrapper[4903]: > pod="openshift-route-controller-manager/route-controller-manager-56cccdf6d5-zr8sw" Mar 20 08:27:20 crc kubenswrapper[4903]: E0320 08:27:20.818652 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"route-controller-manager-56cccdf6d5-zr8sw_openshift-route-controller-manager(c9c27545-75ba-4bce-b870-96b3c4050114)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"route-controller-manager-56cccdf6d5-zr8sw_openshift-route-controller-manager(c9c27545-75ba-4bce-b870-96b3c4050114)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_route-controller-manager-56cccdf6d5-zr8sw_openshift-route-controller-manager_c9c27545-75ba-4bce-b870-96b3c4050114_0(146d9819264ffcf22bdac9ce176dbfc22965756c7316d187d5652f05adc9b997): error adding pod openshift-route-controller-manager_route-controller-manager-56cccdf6d5-zr8sw to CNI network \\\"multus-cni-network\\\": plugin type=\\\"multus-shim\\\" name=\\\"multus-cni-network\\\" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:\\\"146d9819264ffcf22bdac9ce176dbfc22965756c7316d187d5652f05adc9b997\\\" Netns:\\\"/var/run/netns/2501da54-af57-41ba-86d0-5d9e48aaa2b7\\\" IfName:\\\"eth0\\\" Args:\\\"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-route-controller-manager;K8S_POD_NAME=route-controller-manager-56cccdf6d5-zr8sw;K8S_POD_INFRA_CONTAINER_ID=146d9819264ffcf22bdac9ce176dbfc22965756c7316d187d5652f05adc9b997;K8S_POD_UID=c9c27545-75ba-4bce-b870-96b3c4050114\\\" Path:\\\"\\\" ERRORED: error configuring pod [openshift-route-controller-manager/route-controller-manager-56cccdf6d5-zr8sw] networking: Multus: [openshift-route-controller-manager/route-controller-manager-56cccdf6d5-zr8sw/c9c27545-75ba-4bce-b870-96b3c4050114]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod route-controller-manager-56cccdf6d5-zr8sw in out of cluster comm: pod \\\"route-controller-manager-56cccdf6d5-zr8sw\\\" not found\\n': StdinData: 
{\\\"binDir\\\":\\\"/var/lib/cni/bin\\\",\\\"clusterNetwork\\\":\\\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\\\",\\\"cniVersion\\\":\\\"0.3.1\\\",\\\"daemonSocketDir\\\":\\\"/run/multus/socket\\\",\\\"globalNamespaces\\\":\\\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\\\",\\\"logLevel\\\":\\\"verbose\\\",\\\"logToStderr\\\":true,\\\"name\\\":\\\"multus-cni-network\\\",\\\"namespaceIsolation\\\":true,\\\"type\\\":\\\"multus-shim\\\"}\"" pod="openshift-route-controller-manager/route-controller-manager-56cccdf6d5-zr8sw" podUID="c9c27545-75ba-4bce-b870-96b3c4050114" Mar 20 08:27:20 crc kubenswrapper[4903]: I0320 08:27:20.833795 4903 patch_prober.go:28] interesting pod/machine-config-daemon-2ndsj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 08:27:20 crc kubenswrapper[4903]: I0320 08:27:20.833869 4903 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2ndsj" podUID="0e67af70-4211-4077-8f2b-0a00b8069e5a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 08:27:20 crc kubenswrapper[4903]: I0320 08:27:20.835019 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Mar 20 08:27:20 crc kubenswrapper[4903]: I0320 08:27:20.866940 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Mar 20 08:27:20 crc kubenswrapper[4903]: I0320 08:27:20.892006 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 20 08:27:20 crc kubenswrapper[4903]: E0320 08:27:20.897559 4903 log.go:32] "RunPodSandbox from runtime service failed" err=< Mar 20 08:27:20 crc kubenswrapper[4903]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_controller-manager-7c657cbc6d-8275s_openshift-controller-manager_cf764085-fbaa-42f8-b8ab-960398ad5c25_0(6ff5ad7bd5829a11b1a1d879eb1b7780f0b392d439ecd49eec795a10834c3c8e): error adding pod openshift-controller-manager_controller-manager-7c657cbc6d-8275s to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"6ff5ad7bd5829a11b1a1d879eb1b7780f0b392d439ecd49eec795a10834c3c8e" Netns:"/var/run/netns/8a5dec49-6ad8-4107-824e-85b4cf7f12db" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-controller-manager;K8S_POD_NAME=controller-manager-7c657cbc6d-8275s;K8S_POD_INFRA_CONTAINER_ID=6ff5ad7bd5829a11b1a1d879eb1b7780f0b392d439ecd49eec795a10834c3c8e;K8S_POD_UID=cf764085-fbaa-42f8-b8ab-960398ad5c25" Path:"" ERRORED: error configuring pod [openshift-controller-manager/controller-manager-7c657cbc6d-8275s] networking: Multus: [openshift-controller-manager/controller-manager-7c657cbc6d-8275s/cf764085-fbaa-42f8-b8ab-960398ad5c25]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod controller-manager-7c657cbc6d-8275s in out of cluster comm: pod "controller-manager-7c657cbc6d-8275s" not found Mar 20 08:27:20 crc kubenswrapper[4903]: ': StdinData: 
{"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 20 08:27:20 crc kubenswrapper[4903]: > Mar 20 08:27:20 crc kubenswrapper[4903]: E0320 08:27:20.897641 4903 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Mar 20 08:27:20 crc kubenswrapper[4903]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_controller-manager-7c657cbc6d-8275s_openshift-controller-manager_cf764085-fbaa-42f8-b8ab-960398ad5c25_0(6ff5ad7bd5829a11b1a1d879eb1b7780f0b392d439ecd49eec795a10834c3c8e): error adding pod openshift-controller-manager_controller-manager-7c657cbc6d-8275s to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"6ff5ad7bd5829a11b1a1d879eb1b7780f0b392d439ecd49eec795a10834c3c8e" Netns:"/var/run/netns/8a5dec49-6ad8-4107-824e-85b4cf7f12db" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-controller-manager;K8S_POD_NAME=controller-manager-7c657cbc6d-8275s;K8S_POD_INFRA_CONTAINER_ID=6ff5ad7bd5829a11b1a1d879eb1b7780f0b392d439ecd49eec795a10834c3c8e;K8S_POD_UID=cf764085-fbaa-42f8-b8ab-960398ad5c25" Path:"" ERRORED: error configuring pod [openshift-controller-manager/controller-manager-7c657cbc6d-8275s] networking: Multus: [openshift-controller-manager/controller-manager-7c657cbc6d-8275s/cf764085-fbaa-42f8-b8ab-960398ad5c25]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod controller-manager-7c657cbc6d-8275s in out of cluster comm: pod "controller-manager-7c657cbc6d-8275s" not found Mar 20 08:27:20 crc kubenswrapper[4903]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 20 08:27:20 crc kubenswrapper[4903]: > pod="openshift-controller-manager/controller-manager-7c657cbc6d-8275s" Mar 20 08:27:20 crc kubenswrapper[4903]: I0320 08:27:20.897659 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 20 08:27:20 crc kubenswrapper[4903]: E0320 08:27:20.897671 4903 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err=< Mar 20 08:27:20 crc kubenswrapper[4903]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_controller-manager-7c657cbc6d-8275s_openshift-controller-manager_cf764085-fbaa-42f8-b8ab-960398ad5c25_0(6ff5ad7bd5829a11b1a1d879eb1b7780f0b392d439ecd49eec795a10834c3c8e): error adding pod openshift-controller-manager_controller-manager-7c657cbc6d-8275s to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"6ff5ad7bd5829a11b1a1d879eb1b7780f0b392d439ecd49eec795a10834c3c8e" Netns:"/var/run/netns/8a5dec49-6ad8-4107-824e-85b4cf7f12db" IfName:"eth0" 
Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-controller-manager;K8S_POD_NAME=controller-manager-7c657cbc6d-8275s;K8S_POD_INFRA_CONTAINER_ID=6ff5ad7bd5829a11b1a1d879eb1b7780f0b392d439ecd49eec795a10834c3c8e;K8S_POD_UID=cf764085-fbaa-42f8-b8ab-960398ad5c25" Path:"" ERRORED: error configuring pod [openshift-controller-manager/controller-manager-7c657cbc6d-8275s] networking: Multus: [openshift-controller-manager/controller-manager-7c657cbc6d-8275s/cf764085-fbaa-42f8-b8ab-960398ad5c25]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod controller-manager-7c657cbc6d-8275s in out of cluster comm: pod "controller-manager-7c657cbc6d-8275s" not found Mar 20 08:27:20 crc kubenswrapper[4903]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 20 08:27:20 crc kubenswrapper[4903]: > pod="openshift-controller-manager/controller-manager-7c657cbc6d-8275s" Mar 20 08:27:20 crc kubenswrapper[4903]: E0320 08:27:20.897795 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"controller-manager-7c657cbc6d-8275s_openshift-controller-manager(cf764085-fbaa-42f8-b8ab-960398ad5c25)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"controller-manager-7c657cbc6d-8275s_openshift-controller-manager(cf764085-fbaa-42f8-b8ab-960398ad5c25)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_controller-manager-7c657cbc6d-8275s_openshift-controller-manager_cf764085-fbaa-42f8-b8ab-960398ad5c25_0(6ff5ad7bd5829a11b1a1d879eb1b7780f0b392d439ecd49eec795a10834c3c8e): error adding pod openshift-controller-manager_controller-manager-7c657cbc6d-8275s to CNI network \\\"multus-cni-network\\\": plugin type=\\\"multus-shim\\\" name=\\\"multus-cni-network\\\" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:\\\"6ff5ad7bd5829a11b1a1d879eb1b7780f0b392d439ecd49eec795a10834c3c8e\\\" Netns:\\\"/var/run/netns/8a5dec49-6ad8-4107-824e-85b4cf7f12db\\\" IfName:\\\"eth0\\\" Args:\\\"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-controller-manager;K8S_POD_NAME=controller-manager-7c657cbc6d-8275s;K8S_POD_INFRA_CONTAINER_ID=6ff5ad7bd5829a11b1a1d879eb1b7780f0b392d439ecd49eec795a10834c3c8e;K8S_POD_UID=cf764085-fbaa-42f8-b8ab-960398ad5c25\\\" Path:\\\"\\\" ERRORED: error configuring pod [openshift-controller-manager/controller-manager-7c657cbc6d-8275s] networking: Multus: [openshift-controller-manager/controller-manager-7c657cbc6d-8275s/cf764085-fbaa-42f8-b8ab-960398ad5c25]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod controller-manager-7c657cbc6d-8275s in out of cluster comm: pod \\\"controller-manager-7c657cbc6d-8275s\\\" not found\\n': StdinData: 
{\\\"binDir\\\":\\\"/var/lib/cni/bin\\\",\\\"clusterNetwork\\\":\\\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\\\",\\\"cniVersion\\\":\\\"0.3.1\\\",\\\"daemonSocketDir\\\":\\\"/run/multus/socket\\\",\\\"globalNamespaces\\\":\\\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\\\",\\\"logLevel\\\":\\\"verbose\\\",\\\"logToStderr\\\":true,\\\"name\\\":\\\"multus-cni-network\\\",\\\"namespaceIsolation\\\":true,\\\"type\\\":\\\"multus-shim\\\"}\"" pod="openshift-controller-manager/controller-manager-7c657cbc6d-8275s" podUID="cf764085-fbaa-42f8-b8ab-960398ad5c25" Mar 20 08:27:20 crc kubenswrapper[4903]: I0320 08:27:20.911581 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Mar 20 08:27:21 crc kubenswrapper[4903]: I0320 08:27:21.052248 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 20 08:27:21 crc kubenswrapper[4903]: I0320 08:27:21.056900 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Mar 20 08:27:21 crc kubenswrapper[4903]: I0320 08:27:21.298342 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Mar 20 08:27:21 crc kubenswrapper[4903]: I0320 08:27:21.407260 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 20 08:27:21 crc kubenswrapper[4903]: I0320 08:27:21.422975 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 20 08:27:21 crc kubenswrapper[4903]: I0320 08:27:21.505760 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Mar 20 08:27:21 crc kubenswrapper[4903]: I0320 08:27:21.534460 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 20 08:27:21 crc kubenswrapper[4903]: I0320 08:27:21.592411 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Mar 20 08:27:21 crc kubenswrapper[4903]: I0320 08:27:21.728606 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Mar 20 08:27:21 crc kubenswrapper[4903]: I0320 08:27:21.794336 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Mar 20 08:27:21 crc kubenswrapper[4903]: I0320 08:27:21.847110 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 20 08:27:21 crc kubenswrapper[4903]: I0320 08:27:21.862133 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Mar 20 08:27:22 crc kubenswrapper[4903]: I0320 08:27:22.005328 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Mar 20 08:27:22 crc kubenswrapper[4903]: I0320 08:27:22.052517 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Mar 20 08:27:22 crc kubenswrapper[4903]: I0320 08:27:22.084384 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" 
Mar 20 08:27:22 crc kubenswrapper[4903]: I0320 08:27:22.094880 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 20 08:27:22 crc kubenswrapper[4903]: I0320 08:27:22.132693 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 20 08:27:22 crc kubenswrapper[4903]: I0320 08:27:22.133365 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 20 08:27:22 crc kubenswrapper[4903]: I0320 08:27:22.261462 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 20 08:27:22 crc kubenswrapper[4903]: I0320 08:27:22.270891 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 20 08:27:22 crc kubenswrapper[4903]: I0320 08:27:22.301422 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Mar 20 08:27:22 crc kubenswrapper[4903]: I0320 08:27:22.340233 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Mar 20 08:27:22 crc kubenswrapper[4903]: I0320 08:27:22.471632 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Mar 20 08:27:22 crc kubenswrapper[4903]: I0320 08:27:22.572237 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 20 08:27:22 crc kubenswrapper[4903]: I0320 08:27:22.603712 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Mar 20 08:27:22 crc kubenswrapper[4903]: I0320 08:27:22.657496 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 20 08:27:22 crc kubenswrapper[4903]: I0320 08:27:22.804443 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Mar 20 08:27:22 crc kubenswrapper[4903]: I0320 08:27:22.852580 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 20 08:27:22 crc kubenswrapper[4903]: I0320 08:27:22.923900 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 20 08:27:22 crc kubenswrapper[4903]: I0320 08:27:22.969216 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Mar 20 08:27:23 crc kubenswrapper[4903]: I0320 08:27:23.178336 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 20 08:27:23 crc kubenswrapper[4903]: I0320 08:27:23.244992 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 20 08:27:23 crc kubenswrapper[4903]: I0320 08:27:23.277774 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Mar 20 08:27:23 crc kubenswrapper[4903]: I0320 08:27:23.294845 4903 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 20 08:27:23 crc kubenswrapper[4903]: I0320 08:27:23.336259 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 20 08:27:23 crc kubenswrapper[4903]: I0320 08:27:23.386233 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Mar 20 08:27:23 crc kubenswrapper[4903]: I0320 08:27:23.426674 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 20 08:27:23 crc kubenswrapper[4903]: I0320 08:27:23.517434 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 20 08:27:23 crc kubenswrapper[4903]: I0320 08:27:23.541996 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 20 08:27:23 crc kubenswrapper[4903]: I0320 08:27:23.636276 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 20 08:27:23 crc kubenswrapper[4903]: I0320 08:27:23.664241 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Mar 20 08:27:23 crc kubenswrapper[4903]: I0320 08:27:23.690342 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Mar 20 08:27:23 crc kubenswrapper[4903]: I0320 08:27:23.708151 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 20 08:27:23 crc kubenswrapper[4903]: I0320 08:27:23.769473 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 20 08:27:23 crc kubenswrapper[4903]: I0320 08:27:23.892293 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Mar 20 08:27:23 crc kubenswrapper[4903]: I0320 08:27:23.973722 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 20 08:27:23 crc kubenswrapper[4903]: I0320 08:27:23.982831 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 20 08:27:24 crc kubenswrapper[4903]: I0320 08:27:24.034967 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Mar 20 08:27:24 crc kubenswrapper[4903]: I0320 08:27:24.044451 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Mar 20 08:27:24 crc kubenswrapper[4903]: I0320 08:27:24.044632 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Mar 20 08:27:24 crc kubenswrapper[4903]: I0320 08:27:24.166801 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 20 08:27:24 crc kubenswrapper[4903]: I0320 08:27:24.176063 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Mar 20 08:27:24 crc kubenswrapper[4903]: I0320 08:27:24.264840 4903 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Mar 20 08:27:24 crc kubenswrapper[4903]: I0320 08:27:24.400019 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 20 08:27:24 crc kubenswrapper[4903]: I0320 08:27:24.442099 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 20 08:27:24 crc kubenswrapper[4903]: I0320 08:27:24.484101 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Mar 20 08:27:24 crc kubenswrapper[4903]: I0320 08:27:24.515438 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Mar 20 08:27:24 crc kubenswrapper[4903]: I0320 08:27:24.526508 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 20 08:27:24 crc kubenswrapper[4903]: I0320 08:27:24.544884 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 20 08:27:24 crc kubenswrapper[4903]: I0320 08:27:24.598463 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 20 08:27:24 crc kubenswrapper[4903]: I0320 08:27:24.621790 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Mar 20 08:27:24 crc kubenswrapper[4903]: I0320 08:27:24.625354 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 20 08:27:24 crc kubenswrapper[4903]: I0320 08:27:24.640143 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 20 08:27:24 crc kubenswrapper[4903]: I0320 08:27:24.645069 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Mar 20 08:27:24 crc kubenswrapper[4903]: I0320 08:27:24.715331 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 20 08:27:24 crc kubenswrapper[4903]: I0320 08:27:24.735001 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 20 08:27:24 crc kubenswrapper[4903]: I0320 08:27:24.850875 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 20 08:27:24 crc kubenswrapper[4903]: I0320 08:27:24.862847 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 20 08:27:24 crc kubenswrapper[4903]: I0320 08:27:24.880852 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 20 08:27:24 crc kubenswrapper[4903]: I0320 08:27:24.885482 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 20 08:27:24 crc kubenswrapper[4903]: I0320 08:27:24.953386 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 20 08:27:24 crc kubenswrapper[4903]: I0320 08:27:24.954328 4903 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 20 08:27:24 crc kubenswrapper[4903]: I0320 08:27:24.990372 4903 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 20 08:27:25 crc kubenswrapper[4903]: I0320 08:27:25.008545 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 20 08:27:25 crc kubenswrapper[4903]: I0320 08:27:25.130119 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Mar 20 08:27:25 crc kubenswrapper[4903]: I0320 08:27:25.131614 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Mar 20 08:27:25 crc kubenswrapper[4903]: I0320 08:27:25.191302 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Mar 20 08:27:25 crc kubenswrapper[4903]: I0320 08:27:25.254764 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 20 08:27:25 crc kubenswrapper[4903]: I0320 08:27:25.302501 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Mar 20 08:27:25 crc kubenswrapper[4903]: I0320 08:27:25.342578 4903 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 20 08:27:25 crc kubenswrapper[4903]: I0320 08:27:25.343372 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://f227b2c723bd602dcd070ef69f8007a220c4c1e1a3a90653fcfd645df24cc859" gracePeriod=5 Mar 20 08:27:25 crc kubenswrapper[4903]: I0320 08:27:25.398683 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Mar 20 08:27:25 crc kubenswrapper[4903]: I0320 08:27:25.478000 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Mar 20 08:27:25 crc kubenswrapper[4903]: I0320 08:27:25.507517 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Mar 20 08:27:25 crc kubenswrapper[4903]: I0320 08:27:25.536469 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Mar 20 08:27:25 crc kubenswrapper[4903]: I0320 08:27:25.574791 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Mar 20 08:27:25 crc kubenswrapper[4903]: I0320 08:27:25.680078 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 20 08:27:25 crc kubenswrapper[4903]: I0320 08:27:25.689826 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Mar 20 08:27:25 crc kubenswrapper[4903]: I0320 08:27:25.737390 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 20 08:27:25 crc kubenswrapper[4903]: I0320 08:27:25.767804 4903 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 20 08:27:25 crc kubenswrapper[4903]: I0320 08:27:25.791899 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Mar 20 08:27:25 crc kubenswrapper[4903]: I0320 08:27:25.850803 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 20 08:27:25 crc kubenswrapper[4903]: I0320 08:27:25.979526 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 20 08:27:25 crc kubenswrapper[4903]: I0320 08:27:25.980229 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Mar 20 08:27:26 crc kubenswrapper[4903]: I0320 08:27:26.034064 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 20 08:27:26 crc kubenswrapper[4903]: I0320 08:27:26.243962 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 20 08:27:26 crc kubenswrapper[4903]: I0320 08:27:26.312852 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Mar 20 08:27:26 crc kubenswrapper[4903]: I0320 08:27:26.326447 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 20 08:27:26 crc kubenswrapper[4903]: I0320 08:27:26.334368 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 20 08:27:26 crc kubenswrapper[4903]: I0320 08:27:26.398651 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 20 08:27:26 crc kubenswrapper[4903]: I0320 08:27:26.482251 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 20 08:27:26 crc kubenswrapper[4903]: I0320 08:27:26.557020 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Mar 20 08:27:26 crc kubenswrapper[4903]: I0320 08:27:26.728633 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 20 08:27:26 crc kubenswrapper[4903]: I0320 08:27:26.834295 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Mar 20 08:27:26 crc kubenswrapper[4903]: I0320 08:27:26.846657 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Mar 20 08:27:26 crc kubenswrapper[4903]: I0320 08:27:26.972148 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 20 08:27:26 crc kubenswrapper[4903]: I0320 08:27:26.987704 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 20 08:27:27 crc kubenswrapper[4903]: I0320 08:27:27.006356 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Mar 20 08:27:27 crc kubenswrapper[4903]: I0320 08:27:27.027840 4903 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Mar 20 08:27:27 crc kubenswrapper[4903]: I0320 08:27:27.030849 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 20 08:27:27 crc kubenswrapper[4903]: I0320 08:27:27.062346 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Mar 20 08:27:27 crc kubenswrapper[4903]: I0320 08:27:27.065244 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Mar 20 08:27:27 crc kubenswrapper[4903]: I0320 08:27:27.109674 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 20 08:27:27 crc kubenswrapper[4903]: I0320 08:27:27.109805 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 20 08:27:27 crc kubenswrapper[4903]: I0320 08:27:27.112148 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Mar 20 08:27:27 crc kubenswrapper[4903]: I0320 08:27:27.125159 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 20 08:27:27 crc kubenswrapper[4903]: I0320 08:27:27.135386 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 20 08:27:27 crc kubenswrapper[4903]: I0320 08:27:27.136401 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 20 08:27:27 crc kubenswrapper[4903]: I0320 08:27:27.153788 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Mar 20 08:27:27 crc kubenswrapper[4903]: I0320 08:27:27.238598 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 20 08:27:27 crc kubenswrapper[4903]: I0320 08:27:27.253553 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Mar 20 08:27:27 crc kubenswrapper[4903]: I0320 08:27:27.284633 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Mar 20 08:27:27 crc kubenswrapper[4903]: I0320 08:27:27.502862 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 20 08:27:27 crc kubenswrapper[4903]: I0320 08:27:27.665765 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Mar 20 08:27:27 crc kubenswrapper[4903]: I0320 08:27:27.709110 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Mar 20 08:27:27 crc kubenswrapper[4903]: I0320 08:27:27.744425 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 20 08:27:27 crc kubenswrapper[4903]: I0320 08:27:27.751094 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Mar 20 08:27:27 crc 
kubenswrapper[4903]: I0320 08:27:27.776074 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 20 08:27:27 crc kubenswrapper[4903]: I0320 08:27:27.918782 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 20 08:27:28 crc kubenswrapper[4903]: I0320 08:27:28.139136 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Mar 20 08:27:28 crc kubenswrapper[4903]: I0320 08:27:28.436660 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 20 08:27:28 crc kubenswrapper[4903]: I0320 08:27:28.516185 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 20 08:27:28 crc kubenswrapper[4903]: I0320 08:27:28.648735 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 20 08:27:28 crc kubenswrapper[4903]: I0320 08:27:28.651649 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 20 08:27:28 crc kubenswrapper[4903]: I0320 08:27:28.743533 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 20 08:27:28 crc kubenswrapper[4903]: I0320 08:27:28.750231 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Mar 20 08:27:28 crc kubenswrapper[4903]: I0320 08:27:28.876528 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 20 08:27:28 crc kubenswrapper[4903]: I0320 08:27:28.933242 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 20 08:27:28 crc kubenswrapper[4903]: I0320 08:27:28.959966 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Mar 20 08:27:29 crc kubenswrapper[4903]: I0320 08:27:29.010905 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 20 08:27:29 crc kubenswrapper[4903]: I0320 08:27:29.086944 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Mar 20 08:27:29 crc kubenswrapper[4903]: I0320 08:27:29.153613 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Mar 20 08:27:29 crc kubenswrapper[4903]: I0320 08:27:29.367209 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 20 08:27:29 crc kubenswrapper[4903]: I0320 08:27:29.582637 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Mar 20 08:27:29 crc kubenswrapper[4903]: I0320 08:27:29.964049 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 20 08:27:30 crc kubenswrapper[4903]: I0320 08:27:30.034502 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Mar 20 08:27:30 crc kubenswrapper[4903]: I0320 08:27:30.198814 
4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Mar 20 08:27:30 crc kubenswrapper[4903]: I0320 08:27:30.376398 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Mar 20 08:27:30 crc kubenswrapper[4903]: I0320 08:27:30.404769 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 20 08:27:30 crc kubenswrapper[4903]: I0320 08:27:30.571674 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Mar 20 08:27:30 crc kubenswrapper[4903]: I0320 08:27:30.666516 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 20 08:27:30 crc kubenswrapper[4903]: I0320 08:27:30.856000 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 20 08:27:30 crc kubenswrapper[4903]: I0320 08:27:30.928616 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 20 08:27:30 crc kubenswrapper[4903]: I0320 08:27:30.928776 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 08:27:31 crc kubenswrapper[4903]: I0320 08:27:31.054801 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 20 08:27:31 crc kubenswrapper[4903]: I0320 08:27:31.054919 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 20 08:27:31 crc kubenswrapper[4903]: I0320 08:27:31.055025 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 20 08:27:31 crc kubenswrapper[4903]: I0320 08:27:31.055108 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 20 08:27:31 crc kubenswrapper[4903]: I0320 08:27:31.055101 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 08:27:31 crc kubenswrapper[4903]: I0320 08:27:31.055245 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). 
InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 08:27:31 crc kubenswrapper[4903]: I0320 08:27:31.055189 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 20 08:27:31 crc kubenswrapper[4903]: I0320 08:27:31.055171 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 08:27:31 crc kubenswrapper[4903]: I0320 08:27:31.055207 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 08:27:31 crc kubenswrapper[4903]: I0320 08:27:31.055684 4903 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Mar 20 08:27:31 crc kubenswrapper[4903]: I0320 08:27:31.055719 4903 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Mar 20 08:27:31 crc kubenswrapper[4903]: I0320 08:27:31.055738 4903 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 20 08:27:31 crc kubenswrapper[4903]: I0320 08:27:31.055761 4903 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Mar 20 08:27:31 crc kubenswrapper[4903]: I0320 08:27:31.067272 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 08:27:31 crc kubenswrapper[4903]: I0320 08:27:31.094398 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 20 08:27:31 crc kubenswrapper[4903]: I0320 08:27:31.124314 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 20 08:27:31 crc kubenswrapper[4903]: I0320 08:27:31.124392 4903 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="f227b2c723bd602dcd070ef69f8007a220c4c1e1a3a90653fcfd645df24cc859" exitCode=137 Mar 20 08:27:31 crc kubenswrapper[4903]: I0320 08:27:31.124465 4903 scope.go:117] "RemoveContainer" containerID="f227b2c723bd602dcd070ef69f8007a220c4c1e1a3a90653fcfd645df24cc859" Mar 20 08:27:31 crc kubenswrapper[4903]: I0320 08:27:31.124571 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 08:27:31 crc kubenswrapper[4903]: I0320 08:27:31.152833 4903 scope.go:117] "RemoveContainer" containerID="f227b2c723bd602dcd070ef69f8007a220c4c1e1a3a90653fcfd645df24cc859" Mar 20 08:27:31 crc kubenswrapper[4903]: E0320 08:27:31.153402 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f227b2c723bd602dcd070ef69f8007a220c4c1e1a3a90653fcfd645df24cc859\": container with ID starting with f227b2c723bd602dcd070ef69f8007a220c4c1e1a3a90653fcfd645df24cc859 not found: ID does not exist" containerID="f227b2c723bd602dcd070ef69f8007a220c4c1e1a3a90653fcfd645df24cc859" Mar 20 08:27:31 crc kubenswrapper[4903]: I0320 08:27:31.153442 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f227b2c723bd602dcd070ef69f8007a220c4c1e1a3a90653fcfd645df24cc859"} err="failed to get container status \"f227b2c723bd602dcd070ef69f8007a220c4c1e1a3a90653fcfd645df24cc859\": rpc error: code = NotFound desc = could not find container \"f227b2c723bd602dcd070ef69f8007a220c4c1e1a3a90653fcfd645df24cc859\": container with ID starting with f227b2c723bd602dcd070ef69f8007a220c4c1e1a3a90653fcfd645df24cc859 not found: ID does not exist" Mar 20 08:27:31 crc kubenswrapper[4903]: I0320 08:27:31.157946 4903 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 20 08:27:31 crc kubenswrapper[4903]: I0320 08:27:31.359307 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Mar 20 08:27:31 crc kubenswrapper[4903]: I0320 08:27:31.498131 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Mar 20 08:27:31 crc kubenswrapper[4903]: I0320 08:27:31.514343 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 20 08:27:31 crc kubenswrapper[4903]: I0320 08:27:31.517224 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 20 08:27:31 crc kubenswrapper[4903]: I0320 08:27:31.929999 4903 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Mar 20 08:27:32 crc kubenswrapper[4903]: I0320 08:27:32.344556 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 20 08:27:32 crc kubenswrapper[4903]: I0320 08:27:32.413707 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Mar 20 08:27:33 crc kubenswrapper[4903]: I0320 08:27:33.131699 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 20 08:27:33 crc kubenswrapper[4903]: I0320 08:27:33.133986 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 20 08:27:33 crc kubenswrapper[4903]: I0320 08:27:33.913312 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Mar 20 08:27:34 crc kubenswrapper[4903]: I0320 08:27:34.490809 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-56cccdf6d5-zr8sw" Mar 20 08:27:34 crc kubenswrapper[4903]: I0320 08:27:34.491497 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-56cccdf6d5-zr8sw" Mar 20 08:27:35 crc kubenswrapper[4903]: W0320 08:27:35.038156 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc9c27545_75ba_4bce_b870_96b3c4050114.slice/crio-94588d0acf83a700e3f2f332919b085020ad7fd0e5cebcc6f1f421ef8c2ac366 WatchSource:0}: Error finding container 94588d0acf83a700e3f2f332919b085020ad7fd0e5cebcc6f1f421ef8c2ac366: Status 404 returned error can't find the container with id 94588d0acf83a700e3f2f332919b085020ad7fd0e5cebcc6f1f421ef8c2ac366 Mar 20 08:27:35 crc kubenswrapper[4903]: I0320 08:27:35.039126 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-56cccdf6d5-zr8sw"] Mar 20 08:27:35 crc kubenswrapper[4903]: I0320 08:27:35.160970 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-56cccdf6d5-zr8sw" event={"ID":"c9c27545-75ba-4bce-b870-96b3c4050114","Type":"ContainerStarted","Data":"94588d0acf83a700e3f2f332919b085020ad7fd0e5cebcc6f1f421ef8c2ac366"} Mar 20 08:27:36 crc kubenswrapper[4903]: I0320 08:27:36.170910 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-56cccdf6d5-zr8sw" event={"ID":"c9c27545-75ba-4bce-b870-96b3c4050114","Type":"ContainerStarted","Data":"f028652f8d7ea8ce7919fb38c9416bbc10200d76c6db2c7c0671972acced83fa"} Mar 20 08:27:36 crc kubenswrapper[4903]: I0320 08:27:36.171600 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-56cccdf6d5-zr8sw" Mar 20 08:27:36 crc kubenswrapper[4903]: I0320 08:27:36.179731 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-56cccdf6d5-zr8sw" Mar 20 08:27:36 crc kubenswrapper[4903]: I0320 08:27:36.197841 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-56cccdf6d5-zr8sw" podStartSLOduration=54.197821838 
podStartE2EDuration="54.197821838s" podCreationTimestamp="2026-03-20 08:26:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:27:36.195548579 +0000 UTC m=+281.412448914" watchObservedRunningTime="2026-03-20 08:27:36.197821838 +0000 UTC m=+281.414722183" Mar 20 08:27:36 crc kubenswrapper[4903]: I0320 08:27:36.490081 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7c657cbc6d-8275s" Mar 20 08:27:36 crc kubenswrapper[4903]: I0320 08:27:36.491311 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7c657cbc6d-8275s" Mar 20 08:27:36 crc kubenswrapper[4903]: I0320 08:27:36.968476 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7c657cbc6d-8275s"] Mar 20 08:27:36 crc kubenswrapper[4903]: W0320 08:27:36.982339 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcf764085_fbaa_42f8_b8ab_960398ad5c25.slice/crio-ef25c81c2c958183d0b6659277f7305f19904d2061cf824ec6a2ab583fb03814 WatchSource:0}: Error finding container ef25c81c2c958183d0b6659277f7305f19904d2061cf824ec6a2ab583fb03814: Status 404 returned error can't find the container with id ef25c81c2c958183d0b6659277f7305f19904d2061cf824ec6a2ab583fb03814 Mar 20 08:27:37 crc kubenswrapper[4903]: I0320 08:27:37.180751 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7c657cbc6d-8275s" event={"ID":"cf764085-fbaa-42f8-b8ab-960398ad5c25","Type":"ContainerStarted","Data":"61f9b18d6059417b50c91db7c6b454f9b9143100de3d810b43e30115467c49c6"} Mar 20 08:27:37 crc kubenswrapper[4903]: I0320 08:27:37.183130 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7c657cbc6d-8275s" event={"ID":"cf764085-fbaa-42f8-b8ab-960398ad5c25","Type":"ContainerStarted","Data":"ef25c81c2c958183d0b6659277f7305f19904d2061cf824ec6a2ab583fb03814"} Mar 20 08:27:37 crc kubenswrapper[4903]: I0320 08:27:37.211697 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-7c657cbc6d-8275s" podStartSLOduration=55.211667431 podStartE2EDuration="55.211667431s" podCreationTimestamp="2026-03-20 08:26:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:27:37.206705613 +0000 UTC m=+282.423605928" watchObservedRunningTime="2026-03-20 08:27:37.211667431 +0000 UTC m=+282.428567756" Mar 20 08:27:37 crc kubenswrapper[4903]: I0320 08:27:37.734687 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-7c657cbc6d-8275s" Mar 20 08:27:37 crc kubenswrapper[4903]: I0320 08:27:37.739998 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-7c657cbc6d-8275s" Mar 20 08:27:50 crc kubenswrapper[4903]: I0320 08:27:50.834253 4903 patch_prober.go:28] interesting pod/machine-config-daemon-2ndsj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 
08:27:50 crc kubenswrapper[4903]: I0320 08:27:50.835304 4903 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2ndsj" podUID="0e67af70-4211-4077-8f2b-0a00b8069e5a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 08:27:55 crc kubenswrapper[4903]: I0320 08:27:55.318011 4903 generic.go:334] "Generic (PLEG): container finished" podID="4742c0a9-7786-4b7e-823e-e70630e72495" containerID="3d7ff983c32a5f57ea7b4740d32404bc2b4e3d6298ebfc659831ceec3f9edc73" exitCode=0 Mar 20 08:27:55 crc kubenswrapper[4903]: I0320 08:27:55.318308 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-zbxgq" event={"ID":"4742c0a9-7786-4b7e-823e-e70630e72495","Type":"ContainerDied","Data":"3d7ff983c32a5f57ea7b4740d32404bc2b4e3d6298ebfc659831ceec3f9edc73"} Mar 20 08:27:55 crc kubenswrapper[4903]: I0320 08:27:55.319482 4903 scope.go:117] "RemoveContainer" containerID="3d7ff983c32a5f57ea7b4740d32404bc2b4e3d6298ebfc659831ceec3f9edc73" Mar 20 08:27:56 crc kubenswrapper[4903]: I0320 08:27:56.341905 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-zbxgq" event={"ID":"4742c0a9-7786-4b7e-823e-e70630e72495","Type":"ContainerStarted","Data":"21fda3bbc925646ccb732ad6dbf75a4a20898df73a41874081d7b1d7b70e73aa"} Mar 20 08:27:56 crc kubenswrapper[4903]: I0320 08:27:56.344204 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-zbxgq" Mar 20 08:27:56 crc kubenswrapper[4903]: I0320 08:27:56.348145 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-zbxgq" Mar 20 08:28:00 crc kubenswrapper[4903]: I0320 08:28:00.215213 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566588-4mjc5"] Mar 20 08:28:00 crc kubenswrapper[4903]: E0320 08:28:00.216601 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 20 08:28:00 crc kubenswrapper[4903]: I0320 08:28:00.216628 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 20 08:28:00 crc kubenswrapper[4903]: I0320 08:28:00.216816 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 20 08:28:00 crc kubenswrapper[4903]: I0320 08:28:00.217527 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566588-4mjc5" Mar 20 08:28:00 crc kubenswrapper[4903]: I0320 08:28:00.220741 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 08:28:00 crc kubenswrapper[4903]: I0320 08:28:00.220746 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 08:28:00 crc kubenswrapper[4903]: I0320 08:28:00.223440 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-smc2q" Mar 20 08:28:00 crc kubenswrapper[4903]: I0320 08:28:00.238798 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566588-4mjc5"] Mar 20 08:28:00 crc kubenswrapper[4903]: I0320 08:28:00.355591 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-clqss\" (UniqueName: \"kubernetes.io/projected/a63c6845-cfdb-46dc-8ab0-39c7d1a366d2-kube-api-access-clqss\") pod \"auto-csr-approver-29566588-4mjc5\" (UID: \"a63c6845-cfdb-46dc-8ab0-39c7d1a366d2\") " pod="openshift-infra/auto-csr-approver-29566588-4mjc5" Mar 20 08:28:00 crc kubenswrapper[4903]: I0320 08:28:00.457374 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-clqss\" (UniqueName: \"kubernetes.io/projected/a63c6845-cfdb-46dc-8ab0-39c7d1a366d2-kube-api-access-clqss\") pod \"auto-csr-approver-29566588-4mjc5\" (UID: \"a63c6845-cfdb-46dc-8ab0-39c7d1a366d2\") " pod="openshift-infra/auto-csr-approver-29566588-4mjc5" Mar 20 08:28:00 crc kubenswrapper[4903]: I0320 08:28:00.482289 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-clqss\" (UniqueName: \"kubernetes.io/projected/a63c6845-cfdb-46dc-8ab0-39c7d1a366d2-kube-api-access-clqss\") pod \"auto-csr-approver-29566588-4mjc5\" (UID: \"a63c6845-cfdb-46dc-8ab0-39c7d1a366d2\") " pod="openshift-infra/auto-csr-approver-29566588-4mjc5" Mar 20 08:28:00 crc kubenswrapper[4903]: I0320 08:28:00.557606 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566588-4mjc5" Mar 20 08:28:01 crc kubenswrapper[4903]: I0320 08:28:01.112491 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566588-4mjc5"] Mar 20 08:28:01 crc kubenswrapper[4903]: W0320 08:28:01.120840 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda63c6845_cfdb_46dc_8ab0_39c7d1a366d2.slice/crio-45f75b904e8eafc6ddddfcd972e835d5e9178d1f3dee9b0482dcd51d3babbd89 WatchSource:0}: Error finding container 45f75b904e8eafc6ddddfcd972e835d5e9178d1f3dee9b0482dcd51d3babbd89: Status 404 returned error can't find the container with id 45f75b904e8eafc6ddddfcd972e835d5e9178d1f3dee9b0482dcd51d3babbd89 Mar 20 08:28:01 crc kubenswrapper[4903]: I0320 08:28:01.387962 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566588-4mjc5" event={"ID":"a63c6845-cfdb-46dc-8ab0-39c7d1a366d2","Type":"ContainerStarted","Data":"45f75b904e8eafc6ddddfcd972e835d5e9178d1f3dee9b0482dcd51d3babbd89"} Mar 20 08:28:03 crc kubenswrapper[4903]: I0320 08:28:03.416776 4903 generic.go:334] "Generic (PLEG): container finished" podID="a63c6845-cfdb-46dc-8ab0-39c7d1a366d2" containerID="3a6045dca4e996436358344c9adc794f5c5a7b271aa6124aca1f8627ecba66d4" exitCode=0 Mar 20 08:28:03 crc kubenswrapper[4903]: I0320 08:28:03.416877 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566588-4mjc5" event={"ID":"a63c6845-cfdb-46dc-8ab0-39c7d1a366d2","Type":"ContainerDied","Data":"3a6045dca4e996436358344c9adc794f5c5a7b271aa6124aca1f8627ecba66d4"} Mar 20 08:28:04 crc kubenswrapper[4903]: I0320 08:28:04.884774 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566588-4mjc5" Mar 20 08:28:04 crc kubenswrapper[4903]: I0320 08:28:04.951678 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-clqss\" (UniqueName: \"kubernetes.io/projected/a63c6845-cfdb-46dc-8ab0-39c7d1a366d2-kube-api-access-clqss\") pod \"a63c6845-cfdb-46dc-8ab0-39c7d1a366d2\" (UID: \"a63c6845-cfdb-46dc-8ab0-39c7d1a366d2\") " Mar 20 08:28:04 crc kubenswrapper[4903]: I0320 08:28:04.959378 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a63c6845-cfdb-46dc-8ab0-39c7d1a366d2-kube-api-access-clqss" (OuterVolumeSpecName: "kube-api-access-clqss") pod "a63c6845-cfdb-46dc-8ab0-39c7d1a366d2" (UID: "a63c6845-cfdb-46dc-8ab0-39c7d1a366d2"). InnerVolumeSpecName "kube-api-access-clqss". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:28:05 crc kubenswrapper[4903]: I0320 08:28:05.052840 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-clqss\" (UniqueName: \"kubernetes.io/projected/a63c6845-cfdb-46dc-8ab0-39c7d1a366d2-kube-api-access-clqss\") on node \"crc\" DevicePath \"\"" Mar 20 08:28:05 crc kubenswrapper[4903]: I0320 08:28:05.435385 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566588-4mjc5" event={"ID":"a63c6845-cfdb-46dc-8ab0-39c7d1a366d2","Type":"ContainerDied","Data":"45f75b904e8eafc6ddddfcd972e835d5e9178d1f3dee9b0482dcd51d3babbd89"} Mar 20 08:28:05 crc kubenswrapper[4903]: I0320 08:28:05.435473 4903 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="45f75b904e8eafc6ddddfcd972e835d5e9178d1f3dee9b0482dcd51d3babbd89" Mar 20 08:28:05 crc kubenswrapper[4903]: I0320 08:28:05.435491 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566588-4mjc5" Mar 20 08:28:20 crc kubenswrapper[4903]: I0320 08:28:20.834268 4903 patch_prober.go:28] interesting pod/machine-config-daemon-2ndsj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 08:28:20 crc kubenswrapper[4903]: I0320 08:28:20.835508 4903 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2ndsj" podUID="0e67af70-4211-4077-8f2b-0a00b8069e5a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 08:28:20 crc kubenswrapper[4903]: I0320 08:28:20.835607 4903 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2ndsj" Mar 20 08:28:20 crc kubenswrapper[4903]: I0320 08:28:20.836868 4903 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"97230b86d8abf05de23db14ac7a3f5d775800a1072bcb8f41fc0bb22c84b0942"} pod="openshift-machine-config-operator/machine-config-daemon-2ndsj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 08:28:20 crc kubenswrapper[4903]: I0320 08:28:20.836986 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2ndsj" podUID="0e67af70-4211-4077-8f2b-0a00b8069e5a" containerName="machine-config-daemon" containerID="cri-o://97230b86d8abf05de23db14ac7a3f5d775800a1072bcb8f41fc0bb22c84b0942" gracePeriod=600 Mar 20 08:28:21 crc kubenswrapper[4903]: I0320 08:28:21.559737 4903 generic.go:334] "Generic (PLEG): container finished" podID="0e67af70-4211-4077-8f2b-0a00b8069e5a" containerID="97230b86d8abf05de23db14ac7a3f5d775800a1072bcb8f41fc0bb22c84b0942" exitCode=0 Mar 20 08:28:21 crc kubenswrapper[4903]: I0320 08:28:21.560680 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2ndsj" event={"ID":"0e67af70-4211-4077-8f2b-0a00b8069e5a","Type":"ContainerDied","Data":"97230b86d8abf05de23db14ac7a3f5d775800a1072bcb8f41fc0bb22c84b0942"} Mar 20 08:28:21 crc kubenswrapper[4903]: I0320 08:28:21.560720 4903 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2ndsj" event={"ID":"0e67af70-4211-4077-8f2b-0a00b8069e5a","Type":"ContainerStarted","Data":"6fdfb30d5b87cf452b7cc070a07a38b8d79152d6ba5ebf1ea0c265e7f4d3d787"} Mar 20 08:28:50 crc kubenswrapper[4903]: I0320 08:28:50.058502 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-nwkzx"] Mar 20 08:28:50 crc kubenswrapper[4903]: I0320 08:28:50.059607 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-nwkzx" podUID="e74b75cf-cad8-4b67-91b7-3926096e09f8" containerName="registry-server" containerID="cri-o://3168467e8e7ef14dbe452c36fbcfc5a9dcbc06bc8b694faa685567d008ff52ee" gracePeriod=30 Mar 20 08:28:50 crc kubenswrapper[4903]: I0320 08:28:50.064857 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-fzcbm"] Mar 20 08:28:50 crc kubenswrapper[4903]: I0320 08:28:50.065739 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-fzcbm" podUID="738071fe-1a7b-403b-ab94-8e88d5d79ab4" containerName="registry-server" containerID="cri-o://8887e05f19d404ccfc3eec77cd78b6a7cd83d89a3a211cad26ace360c2a7235a" gracePeriod=30 Mar 20 08:28:50 crc kubenswrapper[4903]: I0320 08:28:50.075812 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-zbxgq"] Mar 20 08:28:50 crc kubenswrapper[4903]: I0320 08:28:50.076100 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-zbxgq" podUID="4742c0a9-7786-4b7e-823e-e70630e72495" containerName="marketplace-operator" containerID="cri-o://21fda3bbc925646ccb732ad6dbf75a4a20898df73a41874081d7b1d7b70e73aa" gracePeriod=30 Mar 20 08:28:50 crc kubenswrapper[4903]: I0320 08:28:50.096059 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-cxn5d"] Mar 20 08:28:50 crc kubenswrapper[4903]: I0320 08:28:50.096426 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-cxn5d" podUID="48da3e1a-ed3d-4048-8f10-39f1cc56d9af" containerName="registry-server" containerID="cri-o://71f92575d13f24b62794159747a281add9f29e61ffc595387b66c08df22bcbbc" gracePeriod=30 Mar 20 08:28:50 crc kubenswrapper[4903]: I0320 08:28:50.105435 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-4mktd"] Mar 20 08:28:50 crc kubenswrapper[4903]: E0320 08:28:50.105700 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a63c6845-cfdb-46dc-8ab0-39c7d1a366d2" containerName="oc" Mar 20 08:28:50 crc kubenswrapper[4903]: I0320 08:28:50.105713 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="a63c6845-cfdb-46dc-8ab0-39c7d1a366d2" containerName="oc" Mar 20 08:28:50 crc kubenswrapper[4903]: I0320 08:28:50.105806 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="a63c6845-cfdb-46dc-8ab0-39c7d1a366d2" containerName="oc" Mar 20 08:28:50 crc kubenswrapper[4903]: I0320 08:28:50.106235 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-4mktd" Mar 20 08:28:50 crc kubenswrapper[4903]: I0320 08:28:50.117054 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-pbz9z"] Mar 20 08:28:50 crc kubenswrapper[4903]: I0320 08:28:50.117729 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-pbz9z" podUID="37ce866b-65c1-454a-b346-43c2ebe9a2e0" containerName="registry-server" containerID="cri-o://34589ee220193f734c20277cc438dbc9d5ccc148cb8745411dfade83ac1d0112" gracePeriod=30 Mar 20 08:28:50 crc kubenswrapper[4903]: I0320 08:28:50.123679 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-4mktd"] Mar 20 08:28:50 crc kubenswrapper[4903]: I0320 08:28:50.210021 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2g2g8\" (UniqueName: \"kubernetes.io/projected/60e4db67-4f24-43be-a77c-bbf913fa9f4a-kube-api-access-2g2g8\") pod \"marketplace-operator-79b997595-4mktd\" (UID: \"60e4db67-4f24-43be-a77c-bbf913fa9f4a\") " pod="openshift-marketplace/marketplace-operator-79b997595-4mktd" Mar 20 08:28:50 crc kubenswrapper[4903]: I0320 08:28:50.210088 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/60e4db67-4f24-43be-a77c-bbf913fa9f4a-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-4mktd\" (UID: \"60e4db67-4f24-43be-a77c-bbf913fa9f4a\") " pod="openshift-marketplace/marketplace-operator-79b997595-4mktd" Mar 20 08:28:50 crc kubenswrapper[4903]: I0320 08:28:50.210229 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/60e4db67-4f24-43be-a77c-bbf913fa9f4a-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-4mktd\" (UID: \"60e4db67-4f24-43be-a77c-bbf913fa9f4a\") " pod="openshift-marketplace/marketplace-operator-79b997595-4mktd" Mar 20 08:28:50 crc kubenswrapper[4903]: I0320 08:28:50.312117 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2g2g8\" (UniqueName: \"kubernetes.io/projected/60e4db67-4f24-43be-a77c-bbf913fa9f4a-kube-api-access-2g2g8\") pod \"marketplace-operator-79b997595-4mktd\" (UID: \"60e4db67-4f24-43be-a77c-bbf913fa9f4a\") " pod="openshift-marketplace/marketplace-operator-79b997595-4mktd" Mar 20 08:28:50 crc kubenswrapper[4903]: I0320 08:28:50.312169 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/60e4db67-4f24-43be-a77c-bbf913fa9f4a-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-4mktd\" (UID: \"60e4db67-4f24-43be-a77c-bbf913fa9f4a\") " pod="openshift-marketplace/marketplace-operator-79b997595-4mktd" Mar 20 08:28:50 crc kubenswrapper[4903]: I0320 08:28:50.312215 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/60e4db67-4f24-43be-a77c-bbf913fa9f4a-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-4mktd\" (UID: \"60e4db67-4f24-43be-a77c-bbf913fa9f4a\") " pod="openshift-marketplace/marketplace-operator-79b997595-4mktd" Mar 20 08:28:50 crc kubenswrapper[4903]: I0320 08:28:50.314611 4903 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/60e4db67-4f24-43be-a77c-bbf913fa9f4a-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-4mktd\" (UID: \"60e4db67-4f24-43be-a77c-bbf913fa9f4a\") " pod="openshift-marketplace/marketplace-operator-79b997595-4mktd" Mar 20 08:28:50 crc kubenswrapper[4903]: I0320 08:28:50.322092 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/60e4db67-4f24-43be-a77c-bbf913fa9f4a-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-4mktd\" (UID: \"60e4db67-4f24-43be-a77c-bbf913fa9f4a\") " pod="openshift-marketplace/marketplace-operator-79b997595-4mktd" Mar 20 08:28:50 crc kubenswrapper[4903]: I0320 08:28:50.332688 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2g2g8\" (UniqueName: \"kubernetes.io/projected/60e4db67-4f24-43be-a77c-bbf913fa9f4a-kube-api-access-2g2g8\") pod \"marketplace-operator-79b997595-4mktd\" (UID: \"60e4db67-4f24-43be-a77c-bbf913fa9f4a\") " pod="openshift-marketplace/marketplace-operator-79b997595-4mktd" Mar 20 08:28:50 crc kubenswrapper[4903]: I0320 08:28:50.494624 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-4mktd" Mar 20 08:28:50 crc kubenswrapper[4903]: I0320 08:28:50.506979 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-nwkzx" Mar 20 08:28:50 crc kubenswrapper[4903]: I0320 08:28:50.574729 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fzcbm" Mar 20 08:28:50 crc kubenswrapper[4903]: I0320 08:28:50.615681 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6svbl\" (UniqueName: \"kubernetes.io/projected/738071fe-1a7b-403b-ab94-8e88d5d79ab4-kube-api-access-6svbl\") pod \"738071fe-1a7b-403b-ab94-8e88d5d79ab4\" (UID: \"738071fe-1a7b-403b-ab94-8e88d5d79ab4\") " Mar 20 08:28:50 crc kubenswrapper[4903]: I0320 08:28:50.615732 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g98zj\" (UniqueName: \"kubernetes.io/projected/e74b75cf-cad8-4b67-91b7-3926096e09f8-kube-api-access-g98zj\") pod \"e74b75cf-cad8-4b67-91b7-3926096e09f8\" (UID: \"e74b75cf-cad8-4b67-91b7-3926096e09f8\") " Mar 20 08:28:50 crc kubenswrapper[4903]: I0320 08:28:50.615766 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e74b75cf-cad8-4b67-91b7-3926096e09f8-catalog-content\") pod \"e74b75cf-cad8-4b67-91b7-3926096e09f8\" (UID: \"e74b75cf-cad8-4b67-91b7-3926096e09f8\") " Mar 20 08:28:50 crc kubenswrapper[4903]: I0320 08:28:50.615798 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/738071fe-1a7b-403b-ab94-8e88d5d79ab4-catalog-content\") pod \"738071fe-1a7b-403b-ab94-8e88d5d79ab4\" (UID: \"738071fe-1a7b-403b-ab94-8e88d5d79ab4\") " Mar 20 08:28:50 crc kubenswrapper[4903]: I0320 08:28:50.615854 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/738071fe-1a7b-403b-ab94-8e88d5d79ab4-utilities\") pod 
\"738071fe-1a7b-403b-ab94-8e88d5d79ab4\" (UID: \"738071fe-1a7b-403b-ab94-8e88d5d79ab4\") " Mar 20 08:28:50 crc kubenswrapper[4903]: I0320 08:28:50.615876 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e74b75cf-cad8-4b67-91b7-3926096e09f8-utilities\") pod \"e74b75cf-cad8-4b67-91b7-3926096e09f8\" (UID: \"e74b75cf-cad8-4b67-91b7-3926096e09f8\") " Mar 20 08:28:50 crc kubenswrapper[4903]: I0320 08:28:50.618376 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e74b75cf-cad8-4b67-91b7-3926096e09f8-utilities" (OuterVolumeSpecName: "utilities") pod "e74b75cf-cad8-4b67-91b7-3926096e09f8" (UID: "e74b75cf-cad8-4b67-91b7-3926096e09f8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 08:28:50 crc kubenswrapper[4903]: I0320 08:28:50.619253 4903 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e74b75cf-cad8-4b67-91b7-3926096e09f8-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 08:28:50 crc kubenswrapper[4903]: I0320 08:28:50.619706 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-pbz9z" Mar 20 08:28:50 crc kubenswrapper[4903]: I0320 08:28:50.621202 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/738071fe-1a7b-403b-ab94-8e88d5d79ab4-kube-api-access-6svbl" (OuterVolumeSpecName: "kube-api-access-6svbl") pod "738071fe-1a7b-403b-ab94-8e88d5d79ab4" (UID: "738071fe-1a7b-403b-ab94-8e88d5d79ab4"). InnerVolumeSpecName "kube-api-access-6svbl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:28:50 crc kubenswrapper[4903]: I0320 08:28:50.623594 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-zbxgq" Mar 20 08:28:50 crc kubenswrapper[4903]: I0320 08:28:50.625161 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/738071fe-1a7b-403b-ab94-8e88d5d79ab4-utilities" (OuterVolumeSpecName: "utilities") pod "738071fe-1a7b-403b-ab94-8e88d5d79ab4" (UID: "738071fe-1a7b-403b-ab94-8e88d5d79ab4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 08:28:50 crc kubenswrapper[4903]: I0320 08:28:50.625600 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e74b75cf-cad8-4b67-91b7-3926096e09f8-kube-api-access-g98zj" (OuterVolumeSpecName: "kube-api-access-g98zj") pod "e74b75cf-cad8-4b67-91b7-3926096e09f8" (UID: "e74b75cf-cad8-4b67-91b7-3926096e09f8"). InnerVolumeSpecName "kube-api-access-g98zj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:28:50 crc kubenswrapper[4903]: I0320 08:28:50.669889 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cxn5d" Mar 20 08:28:50 crc kubenswrapper[4903]: I0320 08:28:50.685410 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/738071fe-1a7b-403b-ab94-8e88d5d79ab4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "738071fe-1a7b-403b-ab94-8e88d5d79ab4" (UID: "738071fe-1a7b-403b-ab94-8e88d5d79ab4"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 08:28:50 crc kubenswrapper[4903]: I0320 08:28:50.701816 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e74b75cf-cad8-4b67-91b7-3926096e09f8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e74b75cf-cad8-4b67-91b7-3926096e09f8" (UID: "e74b75cf-cad8-4b67-91b7-3926096e09f8"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 08:28:50 crc kubenswrapper[4903]: I0320 08:28:50.720247 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/4742c0a9-7786-4b7e-823e-e70630e72495-marketplace-operator-metrics\") pod \"4742c0a9-7786-4b7e-823e-e70630e72495\" (UID: \"4742c0a9-7786-4b7e-823e-e70630e72495\") " Mar 20 08:28:50 crc kubenswrapper[4903]: I0320 08:28:50.720296 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/37ce866b-65c1-454a-b346-43c2ebe9a2e0-catalog-content\") pod \"37ce866b-65c1-454a-b346-43c2ebe9a2e0\" (UID: \"37ce866b-65c1-454a-b346-43c2ebe9a2e0\") " Mar 20 08:28:50 crc kubenswrapper[4903]: I0320 08:28:50.720326 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-42szb\" (UniqueName: \"kubernetes.io/projected/48da3e1a-ed3d-4048-8f10-39f1cc56d9af-kube-api-access-42szb\") pod \"48da3e1a-ed3d-4048-8f10-39f1cc56d9af\" (UID: \"48da3e1a-ed3d-4048-8f10-39f1cc56d9af\") " Mar 20 08:28:50 crc kubenswrapper[4903]: I0320 08:28:50.720355 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/48da3e1a-ed3d-4048-8f10-39f1cc56d9af-catalog-content\") pod \"48da3e1a-ed3d-4048-8f10-39f1cc56d9af\" (UID: \"48da3e1a-ed3d-4048-8f10-39f1cc56d9af\") " Mar 20 08:28:50 crc kubenswrapper[4903]: I0320 08:28:50.720385 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zsp9w\" (UniqueName: \"kubernetes.io/projected/37ce866b-65c1-454a-b346-43c2ebe9a2e0-kube-api-access-zsp9w\") pod \"37ce866b-65c1-454a-b346-43c2ebe9a2e0\" (UID: \"37ce866b-65c1-454a-b346-43c2ebe9a2e0\") " Mar 20 08:28:50 crc kubenswrapper[4903]: I0320 08:28:50.720417 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4742c0a9-7786-4b7e-823e-e70630e72495-marketplace-trusted-ca\") pod \"4742c0a9-7786-4b7e-823e-e70630e72495\" (UID: \"4742c0a9-7786-4b7e-823e-e70630e72495\") " Mar 20 08:28:50 crc kubenswrapper[4903]: I0320 08:28:50.720449 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/48da3e1a-ed3d-4048-8f10-39f1cc56d9af-utilities\") pod \"48da3e1a-ed3d-4048-8f10-39f1cc56d9af\" (UID: \"48da3e1a-ed3d-4048-8f10-39f1cc56d9af\") " Mar 20 08:28:50 crc kubenswrapper[4903]: I0320 08:28:50.720475 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w2z4h\" (UniqueName: \"kubernetes.io/projected/4742c0a9-7786-4b7e-823e-e70630e72495-kube-api-access-w2z4h\") pod \"4742c0a9-7786-4b7e-823e-e70630e72495\" (UID: \"4742c0a9-7786-4b7e-823e-e70630e72495\") " Mar 20 08:28:50 crc kubenswrapper[4903]: I0320 08:28:50.720510 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/37ce866b-65c1-454a-b346-43c2ebe9a2e0-utilities\") pod \"37ce866b-65c1-454a-b346-43c2ebe9a2e0\" (UID: \"37ce866b-65c1-454a-b346-43c2ebe9a2e0\") " Mar 20 08:28:50 crc kubenswrapper[4903]: I0320 08:28:50.720727 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6svbl\" (UniqueName: \"kubernetes.io/projected/738071fe-1a7b-403b-ab94-8e88d5d79ab4-kube-api-access-6svbl\") on node \"crc\" DevicePath \"\"" Mar 20 08:28:50 crc kubenswrapper[4903]: I0320 08:28:50.720740 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g98zj\" (UniqueName: \"kubernetes.io/projected/e74b75cf-cad8-4b67-91b7-3926096e09f8-kube-api-access-g98zj\") on node \"crc\" DevicePath \"\"" Mar 20 08:28:50 crc kubenswrapper[4903]: I0320 08:28:50.720749 4903 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e74b75cf-cad8-4b67-91b7-3926096e09f8-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 08:28:50 crc kubenswrapper[4903]: I0320 08:28:50.720756 4903 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/738071fe-1a7b-403b-ab94-8e88d5d79ab4-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 08:28:50 crc kubenswrapper[4903]: I0320 08:28:50.720765 4903 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/738071fe-1a7b-403b-ab94-8e88d5d79ab4-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 08:28:50 crc kubenswrapper[4903]: I0320 08:28:50.721647 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/37ce866b-65c1-454a-b346-43c2ebe9a2e0-utilities" (OuterVolumeSpecName: "utilities") pod "37ce866b-65c1-454a-b346-43c2ebe9a2e0" (UID: "37ce866b-65c1-454a-b346-43c2ebe9a2e0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 08:28:50 crc kubenswrapper[4903]: I0320 08:28:50.721632 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/48da3e1a-ed3d-4048-8f10-39f1cc56d9af-utilities" (OuterVolumeSpecName: "utilities") pod "48da3e1a-ed3d-4048-8f10-39f1cc56d9af" (UID: "48da3e1a-ed3d-4048-8f10-39f1cc56d9af"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 08:28:50 crc kubenswrapper[4903]: I0320 08:28:50.722713 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4742c0a9-7786-4b7e-823e-e70630e72495-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "4742c0a9-7786-4b7e-823e-e70630e72495" (UID: "4742c0a9-7786-4b7e-823e-e70630e72495"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:28:50 crc kubenswrapper[4903]: I0320 08:28:50.725236 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48da3e1a-ed3d-4048-8f10-39f1cc56d9af-kube-api-access-42szb" (OuterVolumeSpecName: "kube-api-access-42szb") pod "48da3e1a-ed3d-4048-8f10-39f1cc56d9af" (UID: "48da3e1a-ed3d-4048-8f10-39f1cc56d9af"). InnerVolumeSpecName "kube-api-access-42szb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:28:50 crc kubenswrapper[4903]: I0320 08:28:50.725871 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37ce866b-65c1-454a-b346-43c2ebe9a2e0-kube-api-access-zsp9w" (OuterVolumeSpecName: "kube-api-access-zsp9w") pod "37ce866b-65c1-454a-b346-43c2ebe9a2e0" (UID: "37ce866b-65c1-454a-b346-43c2ebe9a2e0"). InnerVolumeSpecName "kube-api-access-zsp9w". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:28:50 crc kubenswrapper[4903]: I0320 08:28:50.725908 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4742c0a9-7786-4b7e-823e-e70630e72495-kube-api-access-w2z4h" (OuterVolumeSpecName: "kube-api-access-w2z4h") pod "4742c0a9-7786-4b7e-823e-e70630e72495" (UID: "4742c0a9-7786-4b7e-823e-e70630e72495"). InnerVolumeSpecName "kube-api-access-w2z4h". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:28:50 crc kubenswrapper[4903]: I0320 08:28:50.730727 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4742c0a9-7786-4b7e-823e-e70630e72495-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "4742c0a9-7786-4b7e-823e-e70630e72495" (UID: "4742c0a9-7786-4b7e-823e-e70630e72495"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:28:50 crc kubenswrapper[4903]: I0320 08:28:50.755929 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/48da3e1a-ed3d-4048-8f10-39f1cc56d9af-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "48da3e1a-ed3d-4048-8f10-39f1cc56d9af" (UID: "48da3e1a-ed3d-4048-8f10-39f1cc56d9af"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 08:28:50 crc kubenswrapper[4903]: I0320 08:28:50.787942 4903 generic.go:334] "Generic (PLEG): container finished" podID="48da3e1a-ed3d-4048-8f10-39f1cc56d9af" containerID="71f92575d13f24b62794159747a281add9f29e61ffc595387b66c08df22bcbbc" exitCode=0 Mar 20 08:28:50 crc kubenswrapper[4903]: I0320 08:28:50.788079 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cxn5d" Mar 20 08:28:50 crc kubenswrapper[4903]: I0320 08:28:50.788094 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cxn5d" event={"ID":"48da3e1a-ed3d-4048-8f10-39f1cc56d9af","Type":"ContainerDied","Data":"71f92575d13f24b62794159747a281add9f29e61ffc595387b66c08df22bcbbc"} Mar 20 08:28:50 crc kubenswrapper[4903]: I0320 08:28:50.788127 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cxn5d" event={"ID":"48da3e1a-ed3d-4048-8f10-39f1cc56d9af","Type":"ContainerDied","Data":"3346aa5a19efac517e2753f55fdb4fad1a983faed5b094620f977aba39c61c7d"} Mar 20 08:28:50 crc kubenswrapper[4903]: I0320 08:28:50.788181 4903 scope.go:117] "RemoveContainer" containerID="71f92575d13f24b62794159747a281add9f29e61ffc595387b66c08df22bcbbc" Mar 20 08:28:50 crc kubenswrapper[4903]: I0320 08:28:50.796459 4903 generic.go:334] "Generic (PLEG): container finished" podID="e74b75cf-cad8-4b67-91b7-3926096e09f8" containerID="3168467e8e7ef14dbe452c36fbcfc5a9dcbc06bc8b694faa685567d008ff52ee" exitCode=0 Mar 20 08:28:50 crc kubenswrapper[4903]: I0320 08:28:50.796527 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nwkzx" event={"ID":"e74b75cf-cad8-4b67-91b7-3926096e09f8","Type":"ContainerDied","Data":"3168467e8e7ef14dbe452c36fbcfc5a9dcbc06bc8b694faa685567d008ff52ee"} Mar 20 08:28:50 crc kubenswrapper[4903]: I0320 08:28:50.796559 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nwkzx" event={"ID":"e74b75cf-cad8-4b67-91b7-3926096e09f8","Type":"ContainerDied","Data":"68332149b84360bcd17d633c1dff27018b738465a9211ebce7d6c3a39420e93f"} Mar 20 08:28:50 crc kubenswrapper[4903]: I0320 08:28:50.796628 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-nwkzx" Mar 20 08:28:50 crc kubenswrapper[4903]: I0320 08:28:50.807511 4903 scope.go:117] "RemoveContainer" containerID="3794c660e6384e78a401f187b22e15251d1f31b133b3e9616e1649b816f21609" Mar 20 08:28:50 crc kubenswrapper[4903]: I0320 08:28:50.812406 4903 generic.go:334] "Generic (PLEG): container finished" podID="738071fe-1a7b-403b-ab94-8e88d5d79ab4" containerID="8887e05f19d404ccfc3eec77cd78b6a7cd83d89a3a211cad26ace360c2a7235a" exitCode=0 Mar 20 08:28:50 crc kubenswrapper[4903]: I0320 08:28:50.812486 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fzcbm" event={"ID":"738071fe-1a7b-403b-ab94-8e88d5d79ab4","Type":"ContainerDied","Data":"8887e05f19d404ccfc3eec77cd78b6a7cd83d89a3a211cad26ace360c2a7235a"} Mar 20 08:28:50 crc kubenswrapper[4903]: I0320 08:28:50.812531 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fzcbm" event={"ID":"738071fe-1a7b-403b-ab94-8e88d5d79ab4","Type":"ContainerDied","Data":"37b0e18be858109c70513f4906f3c29dd57c8fdce0e75cec9785f4a9a5cf31b5"} Mar 20 08:28:50 crc kubenswrapper[4903]: I0320 08:28:50.812532 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fzcbm" Mar 20 08:28:50 crc kubenswrapper[4903]: I0320 08:28:50.818888 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-zbxgq" Mar 20 08:28:50 crc kubenswrapper[4903]: I0320 08:28:50.818788 4903 generic.go:334] "Generic (PLEG): container finished" podID="4742c0a9-7786-4b7e-823e-e70630e72495" containerID="21fda3bbc925646ccb732ad6dbf75a4a20898df73a41874081d7b1d7b70e73aa" exitCode=0 Mar 20 08:28:50 crc kubenswrapper[4903]: I0320 08:28:50.821621 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-zbxgq" event={"ID":"4742c0a9-7786-4b7e-823e-e70630e72495","Type":"ContainerDied","Data":"21fda3bbc925646ccb732ad6dbf75a4a20898df73a41874081d7b1d7b70e73aa"} Mar 20 08:28:50 crc kubenswrapper[4903]: I0320 08:28:50.821663 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-zbxgq" event={"ID":"4742c0a9-7786-4b7e-823e-e70630e72495","Type":"ContainerDied","Data":"8bf7329c316f1e0a3a18b0ff0b487e59c79f3209a3557ab8ba91b201541751e3"} Mar 20 08:28:50 crc kubenswrapper[4903]: I0320 08:28:50.822802 4903 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/4742c0a9-7786-4b7e-823e-e70630e72495-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Mar 20 08:28:50 crc kubenswrapper[4903]: I0320 08:28:50.822836 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-42szb\" (UniqueName: \"kubernetes.io/projected/48da3e1a-ed3d-4048-8f10-39f1cc56d9af-kube-api-access-42szb\") on node \"crc\" DevicePath \"\"" Mar 20 08:28:50 crc kubenswrapper[4903]: I0320 08:28:50.822849 4903 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/48da3e1a-ed3d-4048-8f10-39f1cc56d9af-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 08:28:50 crc kubenswrapper[4903]: I0320 08:28:50.822862 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zsp9w\" (UniqueName: \"kubernetes.io/projected/37ce866b-65c1-454a-b346-43c2ebe9a2e0-kube-api-access-zsp9w\") on node \"crc\" DevicePath \"\"" Mar 20 08:28:50 crc kubenswrapper[4903]: I0320 08:28:50.822874 4903 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4742c0a9-7786-4b7e-823e-e70630e72495-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 20 08:28:50 crc kubenswrapper[4903]: I0320 08:28:50.822886 4903 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/48da3e1a-ed3d-4048-8f10-39f1cc56d9af-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 08:28:50 crc kubenswrapper[4903]: I0320 08:28:50.822899 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w2z4h\" (UniqueName: \"kubernetes.io/projected/4742c0a9-7786-4b7e-823e-e70630e72495-kube-api-access-w2z4h\") on node \"crc\" DevicePath \"\"" Mar 20 08:28:50 crc kubenswrapper[4903]: I0320 08:28:50.822931 4903 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/37ce866b-65c1-454a-b346-43c2ebe9a2e0-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 08:28:50 crc kubenswrapper[4903]: I0320 08:28:50.826086 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-cxn5d"] Mar 20 08:28:50 crc kubenswrapper[4903]: I0320 08:28:50.833330 4903 generic.go:334] "Generic (PLEG): container finished" 
podID="37ce866b-65c1-454a-b346-43c2ebe9a2e0" containerID="34589ee220193f734c20277cc438dbc9d5ccc148cb8745411dfade83ac1d0112" exitCode=0 Mar 20 08:28:50 crc kubenswrapper[4903]: I0320 08:28:50.833383 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pbz9z" event={"ID":"37ce866b-65c1-454a-b346-43c2ebe9a2e0","Type":"ContainerDied","Data":"34589ee220193f734c20277cc438dbc9d5ccc148cb8745411dfade83ac1d0112"} Mar 20 08:28:50 crc kubenswrapper[4903]: I0320 08:28:50.833418 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pbz9z" event={"ID":"37ce866b-65c1-454a-b346-43c2ebe9a2e0","Type":"ContainerDied","Data":"d3a9f3b1a68766ea5ac0beaa0774e29abd5ac8b5420dc1e3856a7ec473ec54bc"} Mar 20 08:28:50 crc kubenswrapper[4903]: I0320 08:28:50.833509 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-pbz9z" Mar 20 08:28:50 crc kubenswrapper[4903]: I0320 08:28:50.835233 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-cxn5d"] Mar 20 08:28:50 crc kubenswrapper[4903]: I0320 08:28:50.842976 4903 scope.go:117] "RemoveContainer" containerID="968bbbe7f26e784d6a4ccfae8e2203cfc7d232fbbc4823100ae881dafe8a1a8c" Mar 20 08:28:50 crc kubenswrapper[4903]: I0320 08:28:50.845380 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-nwkzx"] Mar 20 08:28:50 crc kubenswrapper[4903]: I0320 08:28:50.868401 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-nwkzx"] Mar 20 08:28:50 crc kubenswrapper[4903]: I0320 08:28:50.874700 4903 scope.go:117] "RemoveContainer" containerID="71f92575d13f24b62794159747a281add9f29e61ffc595387b66c08df22bcbbc" Mar 20 08:28:50 crc kubenswrapper[4903]: E0320 08:28:50.875409 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"71f92575d13f24b62794159747a281add9f29e61ffc595387b66c08df22bcbbc\": container with ID starting with 71f92575d13f24b62794159747a281add9f29e61ffc595387b66c08df22bcbbc not found: ID does not exist" containerID="71f92575d13f24b62794159747a281add9f29e61ffc595387b66c08df22bcbbc" Mar 20 08:28:50 crc kubenswrapper[4903]: I0320 08:28:50.875484 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71f92575d13f24b62794159747a281add9f29e61ffc595387b66c08df22bcbbc"} err="failed to get container status \"71f92575d13f24b62794159747a281add9f29e61ffc595387b66c08df22bcbbc\": rpc error: code = NotFound desc = could not find container \"71f92575d13f24b62794159747a281add9f29e61ffc595387b66c08df22bcbbc\": container with ID starting with 71f92575d13f24b62794159747a281add9f29e61ffc595387b66c08df22bcbbc not found: ID does not exist" Mar 20 08:28:50 crc kubenswrapper[4903]: I0320 08:28:50.875510 4903 scope.go:117] "RemoveContainer" containerID="3794c660e6384e78a401f187b22e15251d1f31b133b3e9616e1649b816f21609" Mar 20 08:28:50 crc kubenswrapper[4903]: E0320 08:28:50.875954 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3794c660e6384e78a401f187b22e15251d1f31b133b3e9616e1649b816f21609\": container with ID starting with 3794c660e6384e78a401f187b22e15251d1f31b133b3e9616e1649b816f21609 not found: ID does not exist" containerID="3794c660e6384e78a401f187b22e15251d1f31b133b3e9616e1649b816f21609" Mar 20 
08:28:50 crc kubenswrapper[4903]: I0320 08:28:50.875984 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3794c660e6384e78a401f187b22e15251d1f31b133b3e9616e1649b816f21609"} err="failed to get container status \"3794c660e6384e78a401f187b22e15251d1f31b133b3e9616e1649b816f21609\": rpc error: code = NotFound desc = could not find container \"3794c660e6384e78a401f187b22e15251d1f31b133b3e9616e1649b816f21609\": container with ID starting with 3794c660e6384e78a401f187b22e15251d1f31b133b3e9616e1649b816f21609 not found: ID does not exist" Mar 20 08:28:50 crc kubenswrapper[4903]: I0320 08:28:50.876006 4903 scope.go:117] "RemoveContainer" containerID="968bbbe7f26e784d6a4ccfae8e2203cfc7d232fbbc4823100ae881dafe8a1a8c" Mar 20 08:28:50 crc kubenswrapper[4903]: E0320 08:28:50.876318 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"968bbbe7f26e784d6a4ccfae8e2203cfc7d232fbbc4823100ae881dafe8a1a8c\": container with ID starting with 968bbbe7f26e784d6a4ccfae8e2203cfc7d232fbbc4823100ae881dafe8a1a8c not found: ID does not exist" containerID="968bbbe7f26e784d6a4ccfae8e2203cfc7d232fbbc4823100ae881dafe8a1a8c" Mar 20 08:28:50 crc kubenswrapper[4903]: I0320 08:28:50.876340 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"968bbbe7f26e784d6a4ccfae8e2203cfc7d232fbbc4823100ae881dafe8a1a8c"} err="failed to get container status \"968bbbe7f26e784d6a4ccfae8e2203cfc7d232fbbc4823100ae881dafe8a1a8c\": rpc error: code = NotFound desc = could not find container \"968bbbe7f26e784d6a4ccfae8e2203cfc7d232fbbc4823100ae881dafe8a1a8c\": container with ID starting with 968bbbe7f26e784d6a4ccfae8e2203cfc7d232fbbc4823100ae881dafe8a1a8c not found: ID does not exist" Mar 20 08:28:50 crc kubenswrapper[4903]: I0320 08:28:50.876354 4903 scope.go:117] "RemoveContainer" containerID="3168467e8e7ef14dbe452c36fbcfc5a9dcbc06bc8b694faa685567d008ff52ee" Mar 20 08:28:50 crc kubenswrapper[4903]: I0320 08:28:50.881078 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-zbxgq"] Mar 20 08:28:50 crc kubenswrapper[4903]: I0320 08:28:50.885140 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-zbxgq"] Mar 20 08:28:50 crc kubenswrapper[4903]: I0320 08:28:50.888359 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-fzcbm"] Mar 20 08:28:50 crc kubenswrapper[4903]: I0320 08:28:50.891116 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-fzcbm"] Mar 20 08:28:50 crc kubenswrapper[4903]: I0320 08:28:50.895313 4903 scope.go:117] "RemoveContainer" containerID="36da9891300bc47d6eed9291600104590aaac4b3249ab7068747f71ef6abb338" Mar 20 08:28:50 crc kubenswrapper[4903]: I0320 08:28:50.909356 4903 scope.go:117] "RemoveContainer" containerID="b68515b470ead18654c7af5896534fded184b7b1ce059931728b8ae5e2b91baf" Mar 20 08:28:50 crc kubenswrapper[4903]: I0320 08:28:50.920855 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/37ce866b-65c1-454a-b346-43c2ebe9a2e0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "37ce866b-65c1-454a-b346-43c2ebe9a2e0" (UID: "37ce866b-65c1-454a-b346-43c2ebe9a2e0"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 08:28:50 crc kubenswrapper[4903]: I0320 08:28:50.921676 4903 scope.go:117] "RemoveContainer" containerID="3168467e8e7ef14dbe452c36fbcfc5a9dcbc06bc8b694faa685567d008ff52ee" Mar 20 08:28:50 crc kubenswrapper[4903]: E0320 08:28:50.922012 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3168467e8e7ef14dbe452c36fbcfc5a9dcbc06bc8b694faa685567d008ff52ee\": container with ID starting with 3168467e8e7ef14dbe452c36fbcfc5a9dcbc06bc8b694faa685567d008ff52ee not found: ID does not exist" containerID="3168467e8e7ef14dbe452c36fbcfc5a9dcbc06bc8b694faa685567d008ff52ee" Mar 20 08:28:50 crc kubenswrapper[4903]: I0320 08:28:50.922058 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3168467e8e7ef14dbe452c36fbcfc5a9dcbc06bc8b694faa685567d008ff52ee"} err="failed to get container status \"3168467e8e7ef14dbe452c36fbcfc5a9dcbc06bc8b694faa685567d008ff52ee\": rpc error: code = NotFound desc = could not find container \"3168467e8e7ef14dbe452c36fbcfc5a9dcbc06bc8b694faa685567d008ff52ee\": container with ID starting with 3168467e8e7ef14dbe452c36fbcfc5a9dcbc06bc8b694faa685567d008ff52ee not found: ID does not exist" Mar 20 08:28:50 crc kubenswrapper[4903]: I0320 08:28:50.922082 4903 scope.go:117] "RemoveContainer" containerID="36da9891300bc47d6eed9291600104590aaac4b3249ab7068747f71ef6abb338" Mar 20 08:28:50 crc kubenswrapper[4903]: E0320 08:28:50.922478 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"36da9891300bc47d6eed9291600104590aaac4b3249ab7068747f71ef6abb338\": container with ID starting with 36da9891300bc47d6eed9291600104590aaac4b3249ab7068747f71ef6abb338 not found: ID does not exist" containerID="36da9891300bc47d6eed9291600104590aaac4b3249ab7068747f71ef6abb338" Mar 20 08:28:50 crc kubenswrapper[4903]: I0320 08:28:50.922499 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"36da9891300bc47d6eed9291600104590aaac4b3249ab7068747f71ef6abb338"} err="failed to get container status \"36da9891300bc47d6eed9291600104590aaac4b3249ab7068747f71ef6abb338\": rpc error: code = NotFound desc = could not find container \"36da9891300bc47d6eed9291600104590aaac4b3249ab7068747f71ef6abb338\": container with ID starting with 36da9891300bc47d6eed9291600104590aaac4b3249ab7068747f71ef6abb338 not found: ID does not exist" Mar 20 08:28:50 crc kubenswrapper[4903]: I0320 08:28:50.922607 4903 scope.go:117] "RemoveContainer" containerID="b68515b470ead18654c7af5896534fded184b7b1ce059931728b8ae5e2b91baf" Mar 20 08:28:50 crc kubenswrapper[4903]: E0320 08:28:50.922815 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b68515b470ead18654c7af5896534fded184b7b1ce059931728b8ae5e2b91baf\": container with ID starting with b68515b470ead18654c7af5896534fded184b7b1ce059931728b8ae5e2b91baf not found: ID does not exist" containerID="b68515b470ead18654c7af5896534fded184b7b1ce059931728b8ae5e2b91baf" Mar 20 08:28:50 crc kubenswrapper[4903]: I0320 08:28:50.922834 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b68515b470ead18654c7af5896534fded184b7b1ce059931728b8ae5e2b91baf"} err="failed to get container status \"b68515b470ead18654c7af5896534fded184b7b1ce059931728b8ae5e2b91baf\": rpc error: code = NotFound desc = could not 
find container \"b68515b470ead18654c7af5896534fded184b7b1ce059931728b8ae5e2b91baf\": container with ID starting with b68515b470ead18654c7af5896534fded184b7b1ce059931728b8ae5e2b91baf not found: ID does not exist" Mar 20 08:28:50 crc kubenswrapper[4903]: I0320 08:28:50.922847 4903 scope.go:117] "RemoveContainer" containerID="8887e05f19d404ccfc3eec77cd78b6a7cd83d89a3a211cad26ace360c2a7235a" Mar 20 08:28:50 crc kubenswrapper[4903]: I0320 08:28:50.924686 4903 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/37ce866b-65c1-454a-b346-43c2ebe9a2e0-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 08:28:50 crc kubenswrapper[4903]: I0320 08:28:50.936324 4903 scope.go:117] "RemoveContainer" containerID="7ad31f988327c74066024ff689167ac94dad0f44511982b2fd5bbd3bb8bb0601" Mar 20 08:28:50 crc kubenswrapper[4903]: I0320 08:28:50.953221 4903 scope.go:117] "RemoveContainer" containerID="73d2ec2ef02361b99bcc7f04f273b6fdf2c414e13f6ac43936322c1a1dfd17d3" Mar 20 08:28:50 crc kubenswrapper[4903]: I0320 08:28:50.981091 4903 scope.go:117] "RemoveContainer" containerID="8887e05f19d404ccfc3eec77cd78b6a7cd83d89a3a211cad26ace360c2a7235a" Mar 20 08:28:50 crc kubenswrapper[4903]: E0320 08:28:50.981862 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8887e05f19d404ccfc3eec77cd78b6a7cd83d89a3a211cad26ace360c2a7235a\": container with ID starting with 8887e05f19d404ccfc3eec77cd78b6a7cd83d89a3a211cad26ace360c2a7235a not found: ID does not exist" containerID="8887e05f19d404ccfc3eec77cd78b6a7cd83d89a3a211cad26ace360c2a7235a" Mar 20 08:28:50 crc kubenswrapper[4903]: I0320 08:28:50.981925 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8887e05f19d404ccfc3eec77cd78b6a7cd83d89a3a211cad26ace360c2a7235a"} err="failed to get container status \"8887e05f19d404ccfc3eec77cd78b6a7cd83d89a3a211cad26ace360c2a7235a\": rpc error: code = NotFound desc = could not find container \"8887e05f19d404ccfc3eec77cd78b6a7cd83d89a3a211cad26ace360c2a7235a\": container with ID starting with 8887e05f19d404ccfc3eec77cd78b6a7cd83d89a3a211cad26ace360c2a7235a not found: ID does not exist" Mar 20 08:28:50 crc kubenswrapper[4903]: I0320 08:28:50.981955 4903 scope.go:117] "RemoveContainer" containerID="7ad31f988327c74066024ff689167ac94dad0f44511982b2fd5bbd3bb8bb0601" Mar 20 08:28:50 crc kubenswrapper[4903]: E0320 08:28:50.982412 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7ad31f988327c74066024ff689167ac94dad0f44511982b2fd5bbd3bb8bb0601\": container with ID starting with 7ad31f988327c74066024ff689167ac94dad0f44511982b2fd5bbd3bb8bb0601 not found: ID does not exist" containerID="7ad31f988327c74066024ff689167ac94dad0f44511982b2fd5bbd3bb8bb0601" Mar 20 08:28:50 crc kubenswrapper[4903]: I0320 08:28:50.982447 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7ad31f988327c74066024ff689167ac94dad0f44511982b2fd5bbd3bb8bb0601"} err="failed to get container status \"7ad31f988327c74066024ff689167ac94dad0f44511982b2fd5bbd3bb8bb0601\": rpc error: code = NotFound desc = could not find container \"7ad31f988327c74066024ff689167ac94dad0f44511982b2fd5bbd3bb8bb0601\": container with ID starting with 7ad31f988327c74066024ff689167ac94dad0f44511982b2fd5bbd3bb8bb0601 not found: ID does not exist" Mar 20 08:28:50 crc kubenswrapper[4903]: I0320 
08:28:50.982469 4903 scope.go:117] "RemoveContainer" containerID="73d2ec2ef02361b99bcc7f04f273b6fdf2c414e13f6ac43936322c1a1dfd17d3" Mar 20 08:28:50 crc kubenswrapper[4903]: E0320 08:28:50.982777 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"73d2ec2ef02361b99bcc7f04f273b6fdf2c414e13f6ac43936322c1a1dfd17d3\": container with ID starting with 73d2ec2ef02361b99bcc7f04f273b6fdf2c414e13f6ac43936322c1a1dfd17d3 not found: ID does not exist" containerID="73d2ec2ef02361b99bcc7f04f273b6fdf2c414e13f6ac43936322c1a1dfd17d3" Mar 20 08:28:50 crc kubenswrapper[4903]: I0320 08:28:50.982805 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"73d2ec2ef02361b99bcc7f04f273b6fdf2c414e13f6ac43936322c1a1dfd17d3"} err="failed to get container status \"73d2ec2ef02361b99bcc7f04f273b6fdf2c414e13f6ac43936322c1a1dfd17d3\": rpc error: code = NotFound desc = could not find container \"73d2ec2ef02361b99bcc7f04f273b6fdf2c414e13f6ac43936322c1a1dfd17d3\": container with ID starting with 73d2ec2ef02361b99bcc7f04f273b6fdf2c414e13f6ac43936322c1a1dfd17d3 not found: ID does not exist" Mar 20 08:28:50 crc kubenswrapper[4903]: I0320 08:28:50.982836 4903 scope.go:117] "RemoveContainer" containerID="21fda3bbc925646ccb732ad6dbf75a4a20898df73a41874081d7b1d7b70e73aa" Mar 20 08:28:50 crc kubenswrapper[4903]: I0320 08:28:50.983909 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-4mktd"] Mar 20 08:28:50 crc kubenswrapper[4903]: I0320 08:28:50.997724 4903 scope.go:117] "RemoveContainer" containerID="3d7ff983c32a5f57ea7b4740d32404bc2b4e3d6298ebfc659831ceec3f9edc73" Mar 20 08:28:51 crc kubenswrapper[4903]: I0320 08:28:51.011333 4903 scope.go:117] "RemoveContainer" containerID="21fda3bbc925646ccb732ad6dbf75a4a20898df73a41874081d7b1d7b70e73aa" Mar 20 08:28:51 crc kubenswrapper[4903]: E0320 08:28:51.011805 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"21fda3bbc925646ccb732ad6dbf75a4a20898df73a41874081d7b1d7b70e73aa\": container with ID starting with 21fda3bbc925646ccb732ad6dbf75a4a20898df73a41874081d7b1d7b70e73aa not found: ID does not exist" containerID="21fda3bbc925646ccb732ad6dbf75a4a20898df73a41874081d7b1d7b70e73aa" Mar 20 08:28:51 crc kubenswrapper[4903]: I0320 08:28:51.011857 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"21fda3bbc925646ccb732ad6dbf75a4a20898df73a41874081d7b1d7b70e73aa"} err="failed to get container status \"21fda3bbc925646ccb732ad6dbf75a4a20898df73a41874081d7b1d7b70e73aa\": rpc error: code = NotFound desc = could not find container \"21fda3bbc925646ccb732ad6dbf75a4a20898df73a41874081d7b1d7b70e73aa\": container with ID starting with 21fda3bbc925646ccb732ad6dbf75a4a20898df73a41874081d7b1d7b70e73aa not found: ID does not exist" Mar 20 08:28:51 crc kubenswrapper[4903]: I0320 08:28:51.011941 4903 scope.go:117] "RemoveContainer" containerID="3d7ff983c32a5f57ea7b4740d32404bc2b4e3d6298ebfc659831ceec3f9edc73" Mar 20 08:28:51 crc kubenswrapper[4903]: E0320 08:28:51.012623 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3d7ff983c32a5f57ea7b4740d32404bc2b4e3d6298ebfc659831ceec3f9edc73\": container with ID starting with 3d7ff983c32a5f57ea7b4740d32404bc2b4e3d6298ebfc659831ceec3f9edc73 not found: ID does not exist" 
containerID="3d7ff983c32a5f57ea7b4740d32404bc2b4e3d6298ebfc659831ceec3f9edc73" Mar 20 08:28:51 crc kubenswrapper[4903]: I0320 08:28:51.012664 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d7ff983c32a5f57ea7b4740d32404bc2b4e3d6298ebfc659831ceec3f9edc73"} err="failed to get container status \"3d7ff983c32a5f57ea7b4740d32404bc2b4e3d6298ebfc659831ceec3f9edc73\": rpc error: code = NotFound desc = could not find container \"3d7ff983c32a5f57ea7b4740d32404bc2b4e3d6298ebfc659831ceec3f9edc73\": container with ID starting with 3d7ff983c32a5f57ea7b4740d32404bc2b4e3d6298ebfc659831ceec3f9edc73 not found: ID does not exist" Mar 20 08:28:51 crc kubenswrapper[4903]: I0320 08:28:51.012691 4903 scope.go:117] "RemoveContainer" containerID="34589ee220193f734c20277cc438dbc9d5ccc148cb8745411dfade83ac1d0112" Mar 20 08:28:51 crc kubenswrapper[4903]: I0320 08:28:51.031838 4903 scope.go:117] "RemoveContainer" containerID="17551a49474d4f5b4cfa94901b8da62deb9ba3fccfb7284e5faa890432d703bc" Mar 20 08:28:51 crc kubenswrapper[4903]: I0320 08:28:51.054105 4903 scope.go:117] "RemoveContainer" containerID="30a33cb156edaebd2249b1cb595412229313183ffd7a62bf4b252936ceef4c0b" Mar 20 08:28:51 crc kubenswrapper[4903]: I0320 08:28:51.069260 4903 scope.go:117] "RemoveContainer" containerID="34589ee220193f734c20277cc438dbc9d5ccc148cb8745411dfade83ac1d0112" Mar 20 08:28:51 crc kubenswrapper[4903]: E0320 08:28:51.069978 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"34589ee220193f734c20277cc438dbc9d5ccc148cb8745411dfade83ac1d0112\": container with ID starting with 34589ee220193f734c20277cc438dbc9d5ccc148cb8745411dfade83ac1d0112 not found: ID does not exist" containerID="34589ee220193f734c20277cc438dbc9d5ccc148cb8745411dfade83ac1d0112" Mar 20 08:28:51 crc kubenswrapper[4903]: I0320 08:28:51.070067 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"34589ee220193f734c20277cc438dbc9d5ccc148cb8745411dfade83ac1d0112"} err="failed to get container status \"34589ee220193f734c20277cc438dbc9d5ccc148cb8745411dfade83ac1d0112\": rpc error: code = NotFound desc = could not find container \"34589ee220193f734c20277cc438dbc9d5ccc148cb8745411dfade83ac1d0112\": container with ID starting with 34589ee220193f734c20277cc438dbc9d5ccc148cb8745411dfade83ac1d0112 not found: ID does not exist" Mar 20 08:28:51 crc kubenswrapper[4903]: I0320 08:28:51.070122 4903 scope.go:117] "RemoveContainer" containerID="17551a49474d4f5b4cfa94901b8da62deb9ba3fccfb7284e5faa890432d703bc" Mar 20 08:28:51 crc kubenswrapper[4903]: E0320 08:28:51.070592 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"17551a49474d4f5b4cfa94901b8da62deb9ba3fccfb7284e5faa890432d703bc\": container with ID starting with 17551a49474d4f5b4cfa94901b8da62deb9ba3fccfb7284e5faa890432d703bc not found: ID does not exist" containerID="17551a49474d4f5b4cfa94901b8da62deb9ba3fccfb7284e5faa890432d703bc" Mar 20 08:28:51 crc kubenswrapper[4903]: I0320 08:28:51.070620 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"17551a49474d4f5b4cfa94901b8da62deb9ba3fccfb7284e5faa890432d703bc"} err="failed to get container status \"17551a49474d4f5b4cfa94901b8da62deb9ba3fccfb7284e5faa890432d703bc\": rpc error: code = NotFound desc = could not find container 
\"17551a49474d4f5b4cfa94901b8da62deb9ba3fccfb7284e5faa890432d703bc\": container with ID starting with 17551a49474d4f5b4cfa94901b8da62deb9ba3fccfb7284e5faa890432d703bc not found: ID does not exist" Mar 20 08:28:51 crc kubenswrapper[4903]: I0320 08:28:51.070648 4903 scope.go:117] "RemoveContainer" containerID="30a33cb156edaebd2249b1cb595412229313183ffd7a62bf4b252936ceef4c0b" Mar 20 08:28:51 crc kubenswrapper[4903]: E0320 08:28:51.071199 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"30a33cb156edaebd2249b1cb595412229313183ffd7a62bf4b252936ceef4c0b\": container with ID starting with 30a33cb156edaebd2249b1cb595412229313183ffd7a62bf4b252936ceef4c0b not found: ID does not exist" containerID="30a33cb156edaebd2249b1cb595412229313183ffd7a62bf4b252936ceef4c0b" Mar 20 08:28:51 crc kubenswrapper[4903]: I0320 08:28:51.071244 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"30a33cb156edaebd2249b1cb595412229313183ffd7a62bf4b252936ceef4c0b"} err="failed to get container status \"30a33cb156edaebd2249b1cb595412229313183ffd7a62bf4b252936ceef4c0b\": rpc error: code = NotFound desc = could not find container \"30a33cb156edaebd2249b1cb595412229313183ffd7a62bf4b252936ceef4c0b\": container with ID starting with 30a33cb156edaebd2249b1cb595412229313183ffd7a62bf4b252936ceef4c0b not found: ID does not exist" Mar 20 08:28:51 crc kubenswrapper[4903]: I0320 08:28:51.164083 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-pbz9z"] Mar 20 08:28:51 crc kubenswrapper[4903]: I0320 08:28:51.168278 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-pbz9z"] Mar 20 08:28:51 crc kubenswrapper[4903]: I0320 08:28:51.498583 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="37ce866b-65c1-454a-b346-43c2ebe9a2e0" path="/var/lib/kubelet/pods/37ce866b-65c1-454a-b346-43c2ebe9a2e0/volumes" Mar 20 08:28:51 crc kubenswrapper[4903]: I0320 08:28:51.499421 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4742c0a9-7786-4b7e-823e-e70630e72495" path="/var/lib/kubelet/pods/4742c0a9-7786-4b7e-823e-e70630e72495/volumes" Mar 20 08:28:51 crc kubenswrapper[4903]: I0320 08:28:51.499865 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="48da3e1a-ed3d-4048-8f10-39f1cc56d9af" path="/var/lib/kubelet/pods/48da3e1a-ed3d-4048-8f10-39f1cc56d9af/volumes" Mar 20 08:28:51 crc kubenswrapper[4903]: I0320 08:28:51.500444 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="738071fe-1a7b-403b-ab94-8e88d5d79ab4" path="/var/lib/kubelet/pods/738071fe-1a7b-403b-ab94-8e88d5d79ab4/volumes" Mar 20 08:28:51 crc kubenswrapper[4903]: I0320 08:28:51.500991 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e74b75cf-cad8-4b67-91b7-3926096e09f8" path="/var/lib/kubelet/pods/e74b75cf-cad8-4b67-91b7-3926096e09f8/volumes" Mar 20 08:28:51 crc kubenswrapper[4903]: I0320 08:28:51.848264 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-4mktd" event={"ID":"60e4db67-4f24-43be-a77c-bbf913fa9f4a","Type":"ContainerStarted","Data":"687af747c4cf61dba65ff8a3155d04912b37ee2103d8b764fb03c373534900d8"} Mar 20 08:28:51 crc kubenswrapper[4903]: I0320 08:28:51.848334 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-4mktd" 
event={"ID":"60e4db67-4f24-43be-a77c-bbf913fa9f4a","Type":"ContainerStarted","Data":"c70f66d5970761af3c6c40d6ab9ebaa5b6ef0097881357e816c7fc176c561037"} Mar 20 08:28:51 crc kubenswrapper[4903]: I0320 08:28:51.851986 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-4mktd" Mar 20 08:28:51 crc kubenswrapper[4903]: I0320 08:28:51.855001 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-4mktd" Mar 20 08:28:51 crc kubenswrapper[4903]: I0320 08:28:51.878929 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-4mktd" podStartSLOduration=1.878902225 podStartE2EDuration="1.878902225s" podCreationTimestamp="2026-03-20 08:28:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:28:51.874310918 +0000 UTC m=+357.091211263" watchObservedRunningTime="2026-03-20 08:28:51.878902225 +0000 UTC m=+357.095802540" Mar 20 08:28:52 crc kubenswrapper[4903]: I0320 08:28:52.264856 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-m8hxr"] Mar 20 08:28:52 crc kubenswrapper[4903]: E0320 08:28:52.265756 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e74b75cf-cad8-4b67-91b7-3926096e09f8" containerName="registry-server" Mar 20 08:28:52 crc kubenswrapper[4903]: I0320 08:28:52.265779 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="e74b75cf-cad8-4b67-91b7-3926096e09f8" containerName="registry-server" Mar 20 08:28:52 crc kubenswrapper[4903]: E0320 08:28:52.265793 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48da3e1a-ed3d-4048-8f10-39f1cc56d9af" containerName="registry-server" Mar 20 08:28:52 crc kubenswrapper[4903]: I0320 08:28:52.265807 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="48da3e1a-ed3d-4048-8f10-39f1cc56d9af" containerName="registry-server" Mar 20 08:28:52 crc kubenswrapper[4903]: E0320 08:28:52.265831 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e74b75cf-cad8-4b67-91b7-3926096e09f8" containerName="extract-utilities" Mar 20 08:28:52 crc kubenswrapper[4903]: I0320 08:28:52.265848 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="e74b75cf-cad8-4b67-91b7-3926096e09f8" containerName="extract-utilities" Mar 20 08:28:52 crc kubenswrapper[4903]: E0320 08:28:52.265879 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48da3e1a-ed3d-4048-8f10-39f1cc56d9af" containerName="extract-utilities" Mar 20 08:28:52 crc kubenswrapper[4903]: I0320 08:28:52.265896 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="48da3e1a-ed3d-4048-8f10-39f1cc56d9af" containerName="extract-utilities" Mar 20 08:28:52 crc kubenswrapper[4903]: E0320 08:28:52.265911 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4742c0a9-7786-4b7e-823e-e70630e72495" containerName="marketplace-operator" Mar 20 08:28:52 crc kubenswrapper[4903]: I0320 08:28:52.265957 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="4742c0a9-7786-4b7e-823e-e70630e72495" containerName="marketplace-operator" Mar 20 08:28:52 crc kubenswrapper[4903]: E0320 08:28:52.265980 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="738071fe-1a7b-403b-ab94-8e88d5d79ab4" containerName="extract-content" Mar 20 08:28:52 crc 
kubenswrapper[4903]: I0320 08:28:52.265995 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="738071fe-1a7b-403b-ab94-8e88d5d79ab4" containerName="extract-content" Mar 20 08:28:52 crc kubenswrapper[4903]: E0320 08:28:52.266012 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e74b75cf-cad8-4b67-91b7-3926096e09f8" containerName="extract-content" Mar 20 08:28:52 crc kubenswrapper[4903]: I0320 08:28:52.266024 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="e74b75cf-cad8-4b67-91b7-3926096e09f8" containerName="extract-content" Mar 20 08:28:52 crc kubenswrapper[4903]: E0320 08:28:52.266068 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="738071fe-1a7b-403b-ab94-8e88d5d79ab4" containerName="extract-utilities" Mar 20 08:28:52 crc kubenswrapper[4903]: I0320 08:28:52.266080 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="738071fe-1a7b-403b-ab94-8e88d5d79ab4" containerName="extract-utilities" Mar 20 08:28:52 crc kubenswrapper[4903]: E0320 08:28:52.266095 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37ce866b-65c1-454a-b346-43c2ebe9a2e0" containerName="extract-content" Mar 20 08:28:52 crc kubenswrapper[4903]: I0320 08:28:52.266107 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="37ce866b-65c1-454a-b346-43c2ebe9a2e0" containerName="extract-content" Mar 20 08:28:52 crc kubenswrapper[4903]: E0320 08:28:52.266124 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4742c0a9-7786-4b7e-823e-e70630e72495" containerName="marketplace-operator" Mar 20 08:28:52 crc kubenswrapper[4903]: I0320 08:28:52.266137 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="4742c0a9-7786-4b7e-823e-e70630e72495" containerName="marketplace-operator" Mar 20 08:28:52 crc kubenswrapper[4903]: E0320 08:28:52.266152 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48da3e1a-ed3d-4048-8f10-39f1cc56d9af" containerName="extract-content" Mar 20 08:28:52 crc kubenswrapper[4903]: I0320 08:28:52.266163 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="48da3e1a-ed3d-4048-8f10-39f1cc56d9af" containerName="extract-content" Mar 20 08:28:52 crc kubenswrapper[4903]: E0320 08:28:52.266179 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="738071fe-1a7b-403b-ab94-8e88d5d79ab4" containerName="registry-server" Mar 20 08:28:52 crc kubenswrapper[4903]: I0320 08:28:52.266192 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="738071fe-1a7b-403b-ab94-8e88d5d79ab4" containerName="registry-server" Mar 20 08:28:52 crc kubenswrapper[4903]: E0320 08:28:52.266211 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37ce866b-65c1-454a-b346-43c2ebe9a2e0" containerName="registry-server" Mar 20 08:28:52 crc kubenswrapper[4903]: I0320 08:28:52.266223 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="37ce866b-65c1-454a-b346-43c2ebe9a2e0" containerName="registry-server" Mar 20 08:28:52 crc kubenswrapper[4903]: E0320 08:28:52.266239 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37ce866b-65c1-454a-b346-43c2ebe9a2e0" containerName="extract-utilities" Mar 20 08:28:52 crc kubenswrapper[4903]: I0320 08:28:52.266251 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="37ce866b-65c1-454a-b346-43c2ebe9a2e0" containerName="extract-utilities" Mar 20 08:28:52 crc kubenswrapper[4903]: I0320 08:28:52.266421 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="48da3e1a-ed3d-4048-8f10-39f1cc56d9af" 
containerName="registry-server" Mar 20 08:28:52 crc kubenswrapper[4903]: I0320 08:28:52.266439 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="738071fe-1a7b-403b-ab94-8e88d5d79ab4" containerName="registry-server" Mar 20 08:28:52 crc kubenswrapper[4903]: I0320 08:28:52.266452 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="4742c0a9-7786-4b7e-823e-e70630e72495" containerName="marketplace-operator" Mar 20 08:28:52 crc kubenswrapper[4903]: I0320 08:28:52.266468 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="e74b75cf-cad8-4b67-91b7-3926096e09f8" containerName="registry-server" Mar 20 08:28:52 crc kubenswrapper[4903]: I0320 08:28:52.266484 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="37ce866b-65c1-454a-b346-43c2ebe9a2e0" containerName="registry-server" Mar 20 08:28:52 crc kubenswrapper[4903]: I0320 08:28:52.266837 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="4742c0a9-7786-4b7e-823e-e70630e72495" containerName="marketplace-operator" Mar 20 08:28:52 crc kubenswrapper[4903]: I0320 08:28:52.267858 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-m8hxr" Mar 20 08:28:52 crc kubenswrapper[4903]: I0320 08:28:52.274914 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 20 08:28:52 crc kubenswrapper[4903]: I0320 08:28:52.282147 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-m8hxr"] Mar 20 08:28:52 crc kubenswrapper[4903]: I0320 08:28:52.342704 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9d874d53-f61f-48ae-96d8-dfab83476392-utilities\") pod \"redhat-marketplace-m8hxr\" (UID: \"9d874d53-f61f-48ae-96d8-dfab83476392\") " pod="openshift-marketplace/redhat-marketplace-m8hxr" Mar 20 08:28:52 crc kubenswrapper[4903]: I0320 08:28:52.342759 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4v7s5\" (UniqueName: \"kubernetes.io/projected/9d874d53-f61f-48ae-96d8-dfab83476392-kube-api-access-4v7s5\") pod \"redhat-marketplace-m8hxr\" (UID: \"9d874d53-f61f-48ae-96d8-dfab83476392\") " pod="openshift-marketplace/redhat-marketplace-m8hxr" Mar 20 08:28:52 crc kubenswrapper[4903]: I0320 08:28:52.342790 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9d874d53-f61f-48ae-96d8-dfab83476392-catalog-content\") pod \"redhat-marketplace-m8hxr\" (UID: \"9d874d53-f61f-48ae-96d8-dfab83476392\") " pod="openshift-marketplace/redhat-marketplace-m8hxr" Mar 20 08:28:52 crc kubenswrapper[4903]: I0320 08:28:52.443838 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9d874d53-f61f-48ae-96d8-dfab83476392-utilities\") pod \"redhat-marketplace-m8hxr\" (UID: \"9d874d53-f61f-48ae-96d8-dfab83476392\") " pod="openshift-marketplace/redhat-marketplace-m8hxr" Mar 20 08:28:52 crc kubenswrapper[4903]: I0320 08:28:52.444156 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4v7s5\" (UniqueName: \"kubernetes.io/projected/9d874d53-f61f-48ae-96d8-dfab83476392-kube-api-access-4v7s5\") pod \"redhat-marketplace-m8hxr\" (UID: 
\"9d874d53-f61f-48ae-96d8-dfab83476392\") " pod="openshift-marketplace/redhat-marketplace-m8hxr" Mar 20 08:28:52 crc kubenswrapper[4903]: I0320 08:28:52.444306 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9d874d53-f61f-48ae-96d8-dfab83476392-catalog-content\") pod \"redhat-marketplace-m8hxr\" (UID: \"9d874d53-f61f-48ae-96d8-dfab83476392\") " pod="openshift-marketplace/redhat-marketplace-m8hxr" Mar 20 08:28:52 crc kubenswrapper[4903]: I0320 08:28:52.444844 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9d874d53-f61f-48ae-96d8-dfab83476392-utilities\") pod \"redhat-marketplace-m8hxr\" (UID: \"9d874d53-f61f-48ae-96d8-dfab83476392\") " pod="openshift-marketplace/redhat-marketplace-m8hxr" Mar 20 08:28:52 crc kubenswrapper[4903]: I0320 08:28:52.445189 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9d874d53-f61f-48ae-96d8-dfab83476392-catalog-content\") pod \"redhat-marketplace-m8hxr\" (UID: \"9d874d53-f61f-48ae-96d8-dfab83476392\") " pod="openshift-marketplace/redhat-marketplace-m8hxr" Mar 20 08:28:52 crc kubenswrapper[4903]: I0320 08:28:52.471674 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-lhxhb"] Mar 20 08:28:52 crc kubenswrapper[4903]: I0320 08:28:52.473491 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lhxhb" Mar 20 08:28:52 crc kubenswrapper[4903]: I0320 08:28:52.479521 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 20 08:28:52 crc kubenswrapper[4903]: I0320 08:28:52.488960 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-lhxhb"] Mar 20 08:28:52 crc kubenswrapper[4903]: I0320 08:28:52.505335 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4v7s5\" (UniqueName: \"kubernetes.io/projected/9d874d53-f61f-48ae-96d8-dfab83476392-kube-api-access-4v7s5\") pod \"redhat-marketplace-m8hxr\" (UID: \"9d874d53-f61f-48ae-96d8-dfab83476392\") " pod="openshift-marketplace/redhat-marketplace-m8hxr" Mar 20 08:28:52 crc kubenswrapper[4903]: I0320 08:28:52.545309 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0135d9df-1d61-42f8-9efe-0eb2c81e5a23-utilities\") pod \"redhat-operators-lhxhb\" (UID: \"0135d9df-1d61-42f8-9efe-0eb2c81e5a23\") " pod="openshift-marketplace/redhat-operators-lhxhb" Mar 20 08:28:52 crc kubenswrapper[4903]: I0320 08:28:52.545538 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fjltk\" (UniqueName: \"kubernetes.io/projected/0135d9df-1d61-42f8-9efe-0eb2c81e5a23-kube-api-access-fjltk\") pod \"redhat-operators-lhxhb\" (UID: \"0135d9df-1d61-42f8-9efe-0eb2c81e5a23\") " pod="openshift-marketplace/redhat-operators-lhxhb" Mar 20 08:28:52 crc kubenswrapper[4903]: I0320 08:28:52.545605 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0135d9df-1d61-42f8-9efe-0eb2c81e5a23-catalog-content\") pod \"redhat-operators-lhxhb\" (UID: \"0135d9df-1d61-42f8-9efe-0eb2c81e5a23\") " 
pod="openshift-marketplace/redhat-operators-lhxhb" Mar 20 08:28:52 crc kubenswrapper[4903]: I0320 08:28:52.596740 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-m8hxr" Mar 20 08:28:52 crc kubenswrapper[4903]: I0320 08:28:52.648740 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0135d9df-1d61-42f8-9efe-0eb2c81e5a23-utilities\") pod \"redhat-operators-lhxhb\" (UID: \"0135d9df-1d61-42f8-9efe-0eb2c81e5a23\") " pod="openshift-marketplace/redhat-operators-lhxhb" Mar 20 08:28:52 crc kubenswrapper[4903]: I0320 08:28:52.648816 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fjltk\" (UniqueName: \"kubernetes.io/projected/0135d9df-1d61-42f8-9efe-0eb2c81e5a23-kube-api-access-fjltk\") pod \"redhat-operators-lhxhb\" (UID: \"0135d9df-1d61-42f8-9efe-0eb2c81e5a23\") " pod="openshift-marketplace/redhat-operators-lhxhb" Mar 20 08:28:52 crc kubenswrapper[4903]: I0320 08:28:52.648852 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0135d9df-1d61-42f8-9efe-0eb2c81e5a23-catalog-content\") pod \"redhat-operators-lhxhb\" (UID: \"0135d9df-1d61-42f8-9efe-0eb2c81e5a23\") " pod="openshift-marketplace/redhat-operators-lhxhb" Mar 20 08:28:52 crc kubenswrapper[4903]: I0320 08:28:52.649360 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0135d9df-1d61-42f8-9efe-0eb2c81e5a23-catalog-content\") pod \"redhat-operators-lhxhb\" (UID: \"0135d9df-1d61-42f8-9efe-0eb2c81e5a23\") " pod="openshift-marketplace/redhat-operators-lhxhb" Mar 20 08:28:52 crc kubenswrapper[4903]: I0320 08:28:52.649756 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0135d9df-1d61-42f8-9efe-0eb2c81e5a23-utilities\") pod \"redhat-operators-lhxhb\" (UID: \"0135d9df-1d61-42f8-9efe-0eb2c81e5a23\") " pod="openshift-marketplace/redhat-operators-lhxhb" Mar 20 08:28:52 crc kubenswrapper[4903]: I0320 08:28:52.677935 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fjltk\" (UniqueName: \"kubernetes.io/projected/0135d9df-1d61-42f8-9efe-0eb2c81e5a23-kube-api-access-fjltk\") pod \"redhat-operators-lhxhb\" (UID: \"0135d9df-1d61-42f8-9efe-0eb2c81e5a23\") " pod="openshift-marketplace/redhat-operators-lhxhb" Mar 20 08:28:52 crc kubenswrapper[4903]: I0320 08:28:52.816994 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-vcs44"] Mar 20 08:28:52 crc kubenswrapper[4903]: I0320 08:28:52.818275 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-vcs44" Mar 20 08:28:52 crc kubenswrapper[4903]: I0320 08:28:52.820861 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-lhxhb" Mar 20 08:28:52 crc kubenswrapper[4903]: I0320 08:28:52.842298 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-vcs44"] Mar 20 08:28:52 crc kubenswrapper[4903]: I0320 08:28:52.954001 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1c1f6335-9ca2-48a3-b995-2ef19d271086-bound-sa-token\") pod \"image-registry-66df7c8f76-vcs44\" (UID: \"1c1f6335-9ca2-48a3-b995-2ef19d271086\") " pod="openshift-image-registry/image-registry-66df7c8f76-vcs44" Mar 20 08:28:52 crc kubenswrapper[4903]: I0320 08:28:52.954060 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1c1f6335-9ca2-48a3-b995-2ef19d271086-registry-tls\") pod \"image-registry-66df7c8f76-vcs44\" (UID: \"1c1f6335-9ca2-48a3-b995-2ef19d271086\") " pod="openshift-image-registry/image-registry-66df7c8f76-vcs44" Mar 20 08:28:52 crc kubenswrapper[4903]: I0320 08:28:52.954129 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/1c1f6335-9ca2-48a3-b995-2ef19d271086-installation-pull-secrets\") pod \"image-registry-66df7c8f76-vcs44\" (UID: \"1c1f6335-9ca2-48a3-b995-2ef19d271086\") " pod="openshift-image-registry/image-registry-66df7c8f76-vcs44" Mar 20 08:28:52 crc kubenswrapper[4903]: I0320 08:28:52.954171 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/1c1f6335-9ca2-48a3-b995-2ef19d271086-ca-trust-extracted\") pod \"image-registry-66df7c8f76-vcs44\" (UID: \"1c1f6335-9ca2-48a3-b995-2ef19d271086\") " pod="openshift-image-registry/image-registry-66df7c8f76-vcs44" Mar 20 08:28:52 crc kubenswrapper[4903]: I0320 08:28:52.954189 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/1c1f6335-9ca2-48a3-b995-2ef19d271086-registry-certificates\") pod \"image-registry-66df7c8f76-vcs44\" (UID: \"1c1f6335-9ca2-48a3-b995-2ef19d271086\") " pod="openshift-image-registry/image-registry-66df7c8f76-vcs44" Mar 20 08:28:52 crc kubenswrapper[4903]: I0320 08:28:52.954215 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1c1f6335-9ca2-48a3-b995-2ef19d271086-trusted-ca\") pod \"image-registry-66df7c8f76-vcs44\" (UID: \"1c1f6335-9ca2-48a3-b995-2ef19d271086\") " pod="openshift-image-registry/image-registry-66df7c8f76-vcs44" Mar 20 08:28:52 crc kubenswrapper[4903]: I0320 08:28:52.954241 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vzsk8\" (UniqueName: \"kubernetes.io/projected/1c1f6335-9ca2-48a3-b995-2ef19d271086-kube-api-access-vzsk8\") pod \"image-registry-66df7c8f76-vcs44\" (UID: \"1c1f6335-9ca2-48a3-b995-2ef19d271086\") " pod="openshift-image-registry/image-registry-66df7c8f76-vcs44" Mar 20 08:28:52 crc kubenswrapper[4903]: I0320 08:28:52.954268 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-vcs44\" (UID: \"1c1f6335-9ca2-48a3-b995-2ef19d271086\") " pod="openshift-image-registry/image-registry-66df7c8f76-vcs44" Mar 20 08:28:52 crc kubenswrapper[4903]: I0320 08:28:52.979499 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-vcs44\" (UID: \"1c1f6335-9ca2-48a3-b995-2ef19d271086\") " pod="openshift-image-registry/image-registry-66df7c8f76-vcs44" Mar 20 08:28:53 crc kubenswrapper[4903]: I0320 08:28:53.055909 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1c1f6335-9ca2-48a3-b995-2ef19d271086-bound-sa-token\") pod \"image-registry-66df7c8f76-vcs44\" (UID: \"1c1f6335-9ca2-48a3-b995-2ef19d271086\") " pod="openshift-image-registry/image-registry-66df7c8f76-vcs44" Mar 20 08:28:53 crc kubenswrapper[4903]: I0320 08:28:53.055973 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1c1f6335-9ca2-48a3-b995-2ef19d271086-registry-tls\") pod \"image-registry-66df7c8f76-vcs44\" (UID: \"1c1f6335-9ca2-48a3-b995-2ef19d271086\") " pod="openshift-image-registry/image-registry-66df7c8f76-vcs44" Mar 20 08:28:53 crc kubenswrapper[4903]: I0320 08:28:53.055996 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/1c1f6335-9ca2-48a3-b995-2ef19d271086-installation-pull-secrets\") pod \"image-registry-66df7c8f76-vcs44\" (UID: \"1c1f6335-9ca2-48a3-b995-2ef19d271086\") " pod="openshift-image-registry/image-registry-66df7c8f76-vcs44" Mar 20 08:28:53 crc kubenswrapper[4903]: I0320 08:28:53.056084 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/1c1f6335-9ca2-48a3-b995-2ef19d271086-ca-trust-extracted\") pod \"image-registry-66df7c8f76-vcs44\" (UID: \"1c1f6335-9ca2-48a3-b995-2ef19d271086\") " pod="openshift-image-registry/image-registry-66df7c8f76-vcs44" Mar 20 08:28:53 crc kubenswrapper[4903]: I0320 08:28:53.056107 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/1c1f6335-9ca2-48a3-b995-2ef19d271086-registry-certificates\") pod \"image-registry-66df7c8f76-vcs44\" (UID: \"1c1f6335-9ca2-48a3-b995-2ef19d271086\") " pod="openshift-image-registry/image-registry-66df7c8f76-vcs44" Mar 20 08:28:53 crc kubenswrapper[4903]: I0320 08:28:53.056158 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1c1f6335-9ca2-48a3-b995-2ef19d271086-trusted-ca\") pod \"image-registry-66df7c8f76-vcs44\" (UID: \"1c1f6335-9ca2-48a3-b995-2ef19d271086\") " pod="openshift-image-registry/image-registry-66df7c8f76-vcs44" Mar 20 08:28:53 crc kubenswrapper[4903]: I0320 08:28:53.056184 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vzsk8\" (UniqueName: \"kubernetes.io/projected/1c1f6335-9ca2-48a3-b995-2ef19d271086-kube-api-access-vzsk8\") pod \"image-registry-66df7c8f76-vcs44\" (UID: \"1c1f6335-9ca2-48a3-b995-2ef19d271086\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-vcs44" Mar 20 08:28:53 crc kubenswrapper[4903]: I0320 08:28:53.056791 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/1c1f6335-9ca2-48a3-b995-2ef19d271086-ca-trust-extracted\") pod \"image-registry-66df7c8f76-vcs44\" (UID: \"1c1f6335-9ca2-48a3-b995-2ef19d271086\") " pod="openshift-image-registry/image-registry-66df7c8f76-vcs44" Mar 20 08:28:53 crc kubenswrapper[4903]: I0320 08:28:53.058140 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/1c1f6335-9ca2-48a3-b995-2ef19d271086-registry-certificates\") pod \"image-registry-66df7c8f76-vcs44\" (UID: \"1c1f6335-9ca2-48a3-b995-2ef19d271086\") " pod="openshift-image-registry/image-registry-66df7c8f76-vcs44" Mar 20 08:28:53 crc kubenswrapper[4903]: I0320 08:28:53.058358 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1c1f6335-9ca2-48a3-b995-2ef19d271086-trusted-ca\") pod \"image-registry-66df7c8f76-vcs44\" (UID: \"1c1f6335-9ca2-48a3-b995-2ef19d271086\") " pod="openshift-image-registry/image-registry-66df7c8f76-vcs44" Mar 20 08:28:53 crc kubenswrapper[4903]: I0320 08:28:53.063547 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/1c1f6335-9ca2-48a3-b995-2ef19d271086-installation-pull-secrets\") pod \"image-registry-66df7c8f76-vcs44\" (UID: \"1c1f6335-9ca2-48a3-b995-2ef19d271086\") " pod="openshift-image-registry/image-registry-66df7c8f76-vcs44" Mar 20 08:28:53 crc kubenswrapper[4903]: I0320 08:28:53.064108 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1c1f6335-9ca2-48a3-b995-2ef19d271086-registry-tls\") pod \"image-registry-66df7c8f76-vcs44\" (UID: \"1c1f6335-9ca2-48a3-b995-2ef19d271086\") " pod="openshift-image-registry/image-registry-66df7c8f76-vcs44" Mar 20 08:28:53 crc kubenswrapper[4903]: I0320 08:28:53.072623 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1c1f6335-9ca2-48a3-b995-2ef19d271086-bound-sa-token\") pod \"image-registry-66df7c8f76-vcs44\" (UID: \"1c1f6335-9ca2-48a3-b995-2ef19d271086\") " pod="openshift-image-registry/image-registry-66df7c8f76-vcs44" Mar 20 08:28:53 crc kubenswrapper[4903]: I0320 08:28:53.082856 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vzsk8\" (UniqueName: \"kubernetes.io/projected/1c1f6335-9ca2-48a3-b995-2ef19d271086-kube-api-access-vzsk8\") pod \"image-registry-66df7c8f76-vcs44\" (UID: \"1c1f6335-9ca2-48a3-b995-2ef19d271086\") " pod="openshift-image-registry/image-registry-66df7c8f76-vcs44" Mar 20 08:28:53 crc kubenswrapper[4903]: I0320 08:28:53.137276 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-vcs44" Mar 20 08:28:53 crc kubenswrapper[4903]: I0320 08:28:53.193935 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-m8hxr"] Mar 20 08:28:53 crc kubenswrapper[4903]: W0320 08:28:53.201616 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d874d53_f61f_48ae_96d8_dfab83476392.slice/crio-08966b939b0477935cc2837148d1e2a04d236356bdd031fb961674213f40ace5 WatchSource:0}: Error finding container 08966b939b0477935cc2837148d1e2a04d236356bdd031fb961674213f40ace5: Status 404 returned error can't find the container with id 08966b939b0477935cc2837148d1e2a04d236356bdd031fb961674213f40ace5 Mar 20 08:28:53 crc kubenswrapper[4903]: I0320 08:28:53.268783 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-lhxhb"] Mar 20 08:28:53 crc kubenswrapper[4903]: I0320 08:28:53.373144 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-vcs44"] Mar 20 08:28:53 crc kubenswrapper[4903]: W0320 08:28:53.383735 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1c1f6335_9ca2_48a3_b995_2ef19d271086.slice/crio-16d8ca2ba30a98bb939fe6635c97064870ca690765869804abaae065cf73b4e4 WatchSource:0}: Error finding container 16d8ca2ba30a98bb939fe6635c97064870ca690765869804abaae065cf73b4e4: Status 404 returned error can't find the container with id 16d8ca2ba30a98bb939fe6635c97064870ca690765869804abaae065cf73b4e4 Mar 20 08:28:53 crc kubenswrapper[4903]: I0320 08:28:53.872939 4903 generic.go:334] "Generic (PLEG): container finished" podID="9d874d53-f61f-48ae-96d8-dfab83476392" containerID="e05847ab7a6c8aa9a34239200e90573d1a0caffd1340fe972707f17229eec6d9" exitCode=0 Mar 20 08:28:53 crc kubenswrapper[4903]: I0320 08:28:53.873056 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m8hxr" event={"ID":"9d874d53-f61f-48ae-96d8-dfab83476392","Type":"ContainerDied","Data":"e05847ab7a6c8aa9a34239200e90573d1a0caffd1340fe972707f17229eec6d9"} Mar 20 08:28:53 crc kubenswrapper[4903]: I0320 08:28:53.873642 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m8hxr" event={"ID":"9d874d53-f61f-48ae-96d8-dfab83476392","Type":"ContainerStarted","Data":"08966b939b0477935cc2837148d1e2a04d236356bdd031fb961674213f40ace5"} Mar 20 08:28:53 crc kubenswrapper[4903]: I0320 08:28:53.875086 4903 generic.go:334] "Generic (PLEG): container finished" podID="0135d9df-1d61-42f8-9efe-0eb2c81e5a23" containerID="8e086710778f9c3ea49bb1f76e4b8f15105559934ad21140ea2560bf7e686f1c" exitCode=0 Mar 20 08:28:53 crc kubenswrapper[4903]: I0320 08:28:53.875163 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lhxhb" event={"ID":"0135d9df-1d61-42f8-9efe-0eb2c81e5a23","Type":"ContainerDied","Data":"8e086710778f9c3ea49bb1f76e4b8f15105559934ad21140ea2560bf7e686f1c"} Mar 20 08:28:53 crc kubenswrapper[4903]: I0320 08:28:53.875226 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lhxhb" event={"ID":"0135d9df-1d61-42f8-9efe-0eb2c81e5a23","Type":"ContainerStarted","Data":"feae4e51cfef66d5e520550f69a93b1dea5a1b334aa50416f42c338eede980aa"} Mar 20 08:28:53 crc kubenswrapper[4903]: I0320 08:28:53.877159 4903 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-vcs44" event={"ID":"1c1f6335-9ca2-48a3-b995-2ef19d271086","Type":"ContainerStarted","Data":"67fd51b7c5ce51d98b6451dd79492635ade1b03f8b18356a7cd0051f44afaaaa"} Mar 20 08:28:53 crc kubenswrapper[4903]: I0320 08:28:53.877223 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-vcs44" event={"ID":"1c1f6335-9ca2-48a3-b995-2ef19d271086","Type":"ContainerStarted","Data":"16d8ca2ba30a98bb939fe6635c97064870ca690765869804abaae065cf73b4e4"} Mar 20 08:28:53 crc kubenswrapper[4903]: I0320 08:28:53.916551 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-vcs44" podStartSLOduration=1.916536453 podStartE2EDuration="1.916536453s" podCreationTimestamp="2026-03-20 08:28:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:28:53.909312248 +0000 UTC m=+359.126212563" watchObservedRunningTime="2026-03-20 08:28:53.916536453 +0000 UTC m=+359.133436768" Mar 20 08:28:54 crc kubenswrapper[4903]: I0320 08:28:54.667360 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-bwc4v"] Mar 20 08:28:54 crc kubenswrapper[4903]: I0320 08:28:54.669212 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bwc4v" Mar 20 08:28:54 crc kubenswrapper[4903]: I0320 08:28:54.674498 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 20 08:28:54 crc kubenswrapper[4903]: I0320 08:28:54.681840 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bwc4v"] Mar 20 08:28:54 crc kubenswrapper[4903]: I0320 08:28:54.785087 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/275937e3-335c-40d7-83b3-1e8ddf7d5c2d-utilities\") pod \"certified-operators-bwc4v\" (UID: \"275937e3-335c-40d7-83b3-1e8ddf7d5c2d\") " pod="openshift-marketplace/certified-operators-bwc4v" Mar 20 08:28:54 crc kubenswrapper[4903]: I0320 08:28:54.785128 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m2tnd\" (UniqueName: \"kubernetes.io/projected/275937e3-335c-40d7-83b3-1e8ddf7d5c2d-kube-api-access-m2tnd\") pod \"certified-operators-bwc4v\" (UID: \"275937e3-335c-40d7-83b3-1e8ddf7d5c2d\") " pod="openshift-marketplace/certified-operators-bwc4v" Mar 20 08:28:54 crc kubenswrapper[4903]: I0320 08:28:54.785231 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/275937e3-335c-40d7-83b3-1e8ddf7d5c2d-catalog-content\") pod \"certified-operators-bwc4v\" (UID: \"275937e3-335c-40d7-83b3-1e8ddf7d5c2d\") " pod="openshift-marketplace/certified-operators-bwc4v" Mar 20 08:28:54 crc kubenswrapper[4903]: I0320 08:28:54.871988 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-k8gfb"] Mar 20 08:28:54 crc kubenswrapper[4903]: I0320 08:28:54.873211 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-k8gfb" Mar 20 08:28:54 crc kubenswrapper[4903]: I0320 08:28:54.880412 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 20 08:28:54 crc kubenswrapper[4903]: I0320 08:28:54.884101 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-k8gfb"] Mar 20 08:28:54 crc kubenswrapper[4903]: I0320 08:28:54.884952 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m8hxr" event={"ID":"9d874d53-f61f-48ae-96d8-dfab83476392","Type":"ContainerStarted","Data":"c26cee465b92fb4aa07cec6ee91ce0850885f8a746ca1f036a90dac7d612c75d"} Mar 20 08:28:54 crc kubenswrapper[4903]: I0320 08:28:54.886326 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/275937e3-335c-40d7-83b3-1e8ddf7d5c2d-catalog-content\") pod \"certified-operators-bwc4v\" (UID: \"275937e3-335c-40d7-83b3-1e8ddf7d5c2d\") " pod="openshift-marketplace/certified-operators-bwc4v" Mar 20 08:28:54 crc kubenswrapper[4903]: I0320 08:28:54.886399 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/275937e3-335c-40d7-83b3-1e8ddf7d5c2d-utilities\") pod \"certified-operators-bwc4v\" (UID: \"275937e3-335c-40d7-83b3-1e8ddf7d5c2d\") " pod="openshift-marketplace/certified-operators-bwc4v" Mar 20 08:28:54 crc kubenswrapper[4903]: I0320 08:28:54.886423 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m2tnd\" (UniqueName: \"kubernetes.io/projected/275937e3-335c-40d7-83b3-1e8ddf7d5c2d-kube-api-access-m2tnd\") pod \"certified-operators-bwc4v\" (UID: \"275937e3-335c-40d7-83b3-1e8ddf7d5c2d\") " pod="openshift-marketplace/certified-operators-bwc4v" Mar 20 08:28:54 crc kubenswrapper[4903]: I0320 08:28:54.886827 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/275937e3-335c-40d7-83b3-1e8ddf7d5c2d-catalog-content\") pod \"certified-operators-bwc4v\" (UID: \"275937e3-335c-40d7-83b3-1e8ddf7d5c2d\") " pod="openshift-marketplace/certified-operators-bwc4v" Mar 20 08:28:54 crc kubenswrapper[4903]: I0320 08:28:54.887203 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/275937e3-335c-40d7-83b3-1e8ddf7d5c2d-utilities\") pod \"certified-operators-bwc4v\" (UID: \"275937e3-335c-40d7-83b3-1e8ddf7d5c2d\") " pod="openshift-marketplace/certified-operators-bwc4v" Mar 20 08:28:54 crc kubenswrapper[4903]: I0320 08:28:54.889402 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lhxhb" event={"ID":"0135d9df-1d61-42f8-9efe-0eb2c81e5a23","Type":"ContainerStarted","Data":"cd0c9a4a242844e0363c5d4a5e358b0982e9eac7554d4d10531bd8efdcd700e9"} Mar 20 08:28:54 crc kubenswrapper[4903]: I0320 08:28:54.889444 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-vcs44" Mar 20 08:28:54 crc kubenswrapper[4903]: I0320 08:28:54.921517 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m2tnd\" (UniqueName: \"kubernetes.io/projected/275937e3-335c-40d7-83b3-1e8ddf7d5c2d-kube-api-access-m2tnd\") pod \"certified-operators-bwc4v\" (UID: 
\"275937e3-335c-40d7-83b3-1e8ddf7d5c2d\") " pod="openshift-marketplace/certified-operators-bwc4v" Mar 20 08:28:54 crc kubenswrapper[4903]: I0320 08:28:54.987748 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8a16a940-f2b4-470a-a563-4110a9756e4d-catalog-content\") pod \"community-operators-k8gfb\" (UID: \"8a16a940-f2b4-470a-a563-4110a9756e4d\") " pod="openshift-marketplace/community-operators-k8gfb" Mar 20 08:28:54 crc kubenswrapper[4903]: I0320 08:28:54.987885 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7rjv2\" (UniqueName: \"kubernetes.io/projected/8a16a940-f2b4-470a-a563-4110a9756e4d-kube-api-access-7rjv2\") pod \"community-operators-k8gfb\" (UID: \"8a16a940-f2b4-470a-a563-4110a9756e4d\") " pod="openshift-marketplace/community-operators-k8gfb" Mar 20 08:28:54 crc kubenswrapper[4903]: I0320 08:28:54.987912 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8a16a940-f2b4-470a-a563-4110a9756e4d-utilities\") pod \"community-operators-k8gfb\" (UID: \"8a16a940-f2b4-470a-a563-4110a9756e4d\") " pod="openshift-marketplace/community-operators-k8gfb" Mar 20 08:28:55 crc kubenswrapper[4903]: I0320 08:28:55.029162 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bwc4v" Mar 20 08:28:55 crc kubenswrapper[4903]: I0320 08:28:55.089475 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8a16a940-f2b4-470a-a563-4110a9756e4d-catalog-content\") pod \"community-operators-k8gfb\" (UID: \"8a16a940-f2b4-470a-a563-4110a9756e4d\") " pod="openshift-marketplace/community-operators-k8gfb" Mar 20 08:28:55 crc kubenswrapper[4903]: I0320 08:28:55.089545 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7rjv2\" (UniqueName: \"kubernetes.io/projected/8a16a940-f2b4-470a-a563-4110a9756e4d-kube-api-access-7rjv2\") pod \"community-operators-k8gfb\" (UID: \"8a16a940-f2b4-470a-a563-4110a9756e4d\") " pod="openshift-marketplace/community-operators-k8gfb" Mar 20 08:28:55 crc kubenswrapper[4903]: I0320 08:28:55.089583 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8a16a940-f2b4-470a-a563-4110a9756e4d-utilities\") pod \"community-operators-k8gfb\" (UID: \"8a16a940-f2b4-470a-a563-4110a9756e4d\") " pod="openshift-marketplace/community-operators-k8gfb" Mar 20 08:28:55 crc kubenswrapper[4903]: I0320 08:28:55.090087 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8a16a940-f2b4-470a-a563-4110a9756e4d-catalog-content\") pod \"community-operators-k8gfb\" (UID: \"8a16a940-f2b4-470a-a563-4110a9756e4d\") " pod="openshift-marketplace/community-operators-k8gfb" Mar 20 08:28:55 crc kubenswrapper[4903]: I0320 08:28:55.090114 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8a16a940-f2b4-470a-a563-4110a9756e4d-utilities\") pod \"community-operators-k8gfb\" (UID: \"8a16a940-f2b4-470a-a563-4110a9756e4d\") " pod="openshift-marketplace/community-operators-k8gfb" Mar 20 08:28:55 crc kubenswrapper[4903]: I0320 
08:28:55.109341 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7rjv2\" (UniqueName: \"kubernetes.io/projected/8a16a940-f2b4-470a-a563-4110a9756e4d-kube-api-access-7rjv2\") pod \"community-operators-k8gfb\" (UID: \"8a16a940-f2b4-470a-a563-4110a9756e4d\") " pod="openshift-marketplace/community-operators-k8gfb" Mar 20 08:28:55 crc kubenswrapper[4903]: I0320 08:28:55.190898 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-k8gfb" Mar 20 08:28:55 crc kubenswrapper[4903]: I0320 08:28:55.243892 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bwc4v"] Mar 20 08:28:55 crc kubenswrapper[4903]: W0320 08:28:55.248836 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod275937e3_335c_40d7_83b3_1e8ddf7d5c2d.slice/crio-038d403aa53b91a58df6aba98c24b1b7b989319e82ff2f5d42f2ee142fcf7e7a WatchSource:0}: Error finding container 038d403aa53b91a58df6aba98c24b1b7b989319e82ff2f5d42f2ee142fcf7e7a: Status 404 returned error can't find the container with id 038d403aa53b91a58df6aba98c24b1b7b989319e82ff2f5d42f2ee142fcf7e7a Mar 20 08:28:55 crc kubenswrapper[4903]: I0320 08:28:55.640971 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-k8gfb"] Mar 20 08:28:55 crc kubenswrapper[4903]: I0320 08:28:55.938692 4903 generic.go:334] "Generic (PLEG): container finished" podID="9d874d53-f61f-48ae-96d8-dfab83476392" containerID="c26cee465b92fb4aa07cec6ee91ce0850885f8a746ca1f036a90dac7d612c75d" exitCode=0 Mar 20 08:28:55 crc kubenswrapper[4903]: I0320 08:28:55.938794 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m8hxr" event={"ID":"9d874d53-f61f-48ae-96d8-dfab83476392","Type":"ContainerDied","Data":"c26cee465b92fb4aa07cec6ee91ce0850885f8a746ca1f036a90dac7d612c75d"} Mar 20 08:28:55 crc kubenswrapper[4903]: I0320 08:28:55.941716 4903 generic.go:334] "Generic (PLEG): container finished" podID="0135d9df-1d61-42f8-9efe-0eb2c81e5a23" containerID="cd0c9a4a242844e0363c5d4a5e358b0982e9eac7554d4d10531bd8efdcd700e9" exitCode=0 Mar 20 08:28:55 crc kubenswrapper[4903]: I0320 08:28:55.942000 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lhxhb" event={"ID":"0135d9df-1d61-42f8-9efe-0eb2c81e5a23","Type":"ContainerDied","Data":"cd0c9a4a242844e0363c5d4a5e358b0982e9eac7554d4d10531bd8efdcd700e9"} Mar 20 08:28:55 crc kubenswrapper[4903]: I0320 08:28:55.952321 4903 generic.go:334] "Generic (PLEG): container finished" podID="275937e3-335c-40d7-83b3-1e8ddf7d5c2d" containerID="202c5aedacf290e3b31d1c6d36830226ebcd72ef6a68116cf78d853d79717995" exitCode=0 Mar 20 08:28:55 crc kubenswrapper[4903]: I0320 08:28:55.952438 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bwc4v" event={"ID":"275937e3-335c-40d7-83b3-1e8ddf7d5c2d","Type":"ContainerDied","Data":"202c5aedacf290e3b31d1c6d36830226ebcd72ef6a68116cf78d853d79717995"} Mar 20 08:28:55 crc kubenswrapper[4903]: I0320 08:28:55.952473 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bwc4v" event={"ID":"275937e3-335c-40d7-83b3-1e8ddf7d5c2d","Type":"ContainerStarted","Data":"038d403aa53b91a58df6aba98c24b1b7b989319e82ff2f5d42f2ee142fcf7e7a"} Mar 20 08:28:55 crc kubenswrapper[4903]: I0320 08:28:55.954842 4903 
generic.go:334] "Generic (PLEG): container finished" podID="8a16a940-f2b4-470a-a563-4110a9756e4d" containerID="abbb4b0355fb3a5968a5661a3d82940c2c7b0cecbce5228a496a57c99bd122d1" exitCode=0 Mar 20 08:28:55 crc kubenswrapper[4903]: I0320 08:28:55.955462 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k8gfb" event={"ID":"8a16a940-f2b4-470a-a563-4110a9756e4d","Type":"ContainerDied","Data":"abbb4b0355fb3a5968a5661a3d82940c2c7b0cecbce5228a496a57c99bd122d1"} Mar 20 08:28:55 crc kubenswrapper[4903]: I0320 08:28:55.955611 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k8gfb" event={"ID":"8a16a940-f2b4-470a-a563-4110a9756e4d","Type":"ContainerStarted","Data":"0829e850c8174c55c14ac010c86c7e7a26d6f46da96daf7025a5e0987e82e47e"} Mar 20 08:28:57 crc kubenswrapper[4903]: I0320 08:28:57.977470 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m8hxr" event={"ID":"9d874d53-f61f-48ae-96d8-dfab83476392","Type":"ContainerStarted","Data":"a49ea3b090c05421c7d0438687171b55d7a07463b93f9a575305693b8b86d737"} Mar 20 08:28:57 crc kubenswrapper[4903]: I0320 08:28:57.983008 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lhxhb" event={"ID":"0135d9df-1d61-42f8-9efe-0eb2c81e5a23","Type":"ContainerStarted","Data":"e78515a99c17d4adab0248d038e8fef0f8bbae1e70fbefebe92333ccc2fbc4a5"} Mar 20 08:28:57 crc kubenswrapper[4903]: I0320 08:28:57.997993 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-m8hxr" podStartSLOduration=2.481999597 podStartE2EDuration="5.997971827s" podCreationTimestamp="2026-03-20 08:28:52 +0000 UTC" firstStartedPulling="2026-03-20 08:28:53.878482241 +0000 UTC m=+359.095382546" lastFinishedPulling="2026-03-20 08:28:57.394454431 +0000 UTC m=+362.611354776" observedRunningTime="2026-03-20 08:28:57.994374764 +0000 UTC m=+363.211275089" watchObservedRunningTime="2026-03-20 08:28:57.997971827 +0000 UTC m=+363.214872152" Mar 20 08:28:58 crc kubenswrapper[4903]: I0320 08:28:58.021191 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-lhxhb" podStartSLOduration=2.257848261 podStartE2EDuration="6.021167169s" podCreationTimestamp="2026-03-20 08:28:52 +0000 UTC" firstStartedPulling="2026-03-20 08:28:53.877864925 +0000 UTC m=+359.094765240" lastFinishedPulling="2026-03-20 08:28:57.641183793 +0000 UTC m=+362.858084148" observedRunningTime="2026-03-20 08:28:58.01492645 +0000 UTC m=+363.231826775" watchObservedRunningTime="2026-03-20 08:28:58.021167169 +0000 UTC m=+363.238067484" Mar 20 08:28:59 crc kubenswrapper[4903]: I0320 08:28:59.008146 4903 generic.go:334] "Generic (PLEG): container finished" podID="275937e3-335c-40d7-83b3-1e8ddf7d5c2d" containerID="22a4d5e9aba181091b4204b22022eabbdfeaa2f58accae926084d61f604241f6" exitCode=0 Mar 20 08:28:59 crc kubenswrapper[4903]: I0320 08:28:59.008279 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bwc4v" event={"ID":"275937e3-335c-40d7-83b3-1e8ddf7d5c2d","Type":"ContainerDied","Data":"22a4d5e9aba181091b4204b22022eabbdfeaa2f58accae926084d61f604241f6"} Mar 20 08:28:59 crc kubenswrapper[4903]: I0320 08:28:59.013942 4903 generic.go:334] "Generic (PLEG): container finished" podID="8a16a940-f2b4-470a-a563-4110a9756e4d" 
containerID="ebe4a63d0ecb63b021cdcab2a2c438fae014d9380b08aa0881190f417cd07e08" exitCode=0 Mar 20 08:28:59 crc kubenswrapper[4903]: I0320 08:28:59.014087 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k8gfb" event={"ID":"8a16a940-f2b4-470a-a563-4110a9756e4d","Type":"ContainerDied","Data":"ebe4a63d0ecb63b021cdcab2a2c438fae014d9380b08aa0881190f417cd07e08"} Mar 20 08:29:00 crc kubenswrapper[4903]: I0320 08:29:00.024552 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bwc4v" event={"ID":"275937e3-335c-40d7-83b3-1e8ddf7d5c2d","Type":"ContainerStarted","Data":"35ea383dbadb158c4f4bce9197bc2a104bc93d5e9434357d5e491b295db24227"} Mar 20 08:29:00 crc kubenswrapper[4903]: I0320 08:29:00.028430 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k8gfb" event={"ID":"8a16a940-f2b4-470a-a563-4110a9756e4d","Type":"ContainerStarted","Data":"5e6ea669a181b5c974ce7abdb1fe23b48de5e5dd49b113eb4051f41b105d1a9f"} Mar 20 08:29:00 crc kubenswrapper[4903]: I0320 08:29:00.060062 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-bwc4v" podStartSLOduration=2.543597967 podStartE2EDuration="6.060048798s" podCreationTimestamp="2026-03-20 08:28:54 +0000 UTC" firstStartedPulling="2026-03-20 08:28:55.954235833 +0000 UTC m=+361.171136158" lastFinishedPulling="2026-03-20 08:28:59.470686664 +0000 UTC m=+364.687586989" observedRunningTime="2026-03-20 08:29:00.053995524 +0000 UTC m=+365.270895839" watchObservedRunningTime="2026-03-20 08:29:00.060048798 +0000 UTC m=+365.276949113" Mar 20 08:29:02 crc kubenswrapper[4903]: I0320 08:29:02.598122 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-m8hxr" Mar 20 08:29:02 crc kubenswrapper[4903]: I0320 08:29:02.598760 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-m8hxr" Mar 20 08:29:02 crc kubenswrapper[4903]: I0320 08:29:02.681679 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-m8hxr" Mar 20 08:29:02 crc kubenswrapper[4903]: I0320 08:29:02.707855 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-k8gfb" podStartSLOduration=5.237919339 podStartE2EDuration="8.707823341s" podCreationTimestamp="2026-03-20 08:28:54 +0000 UTC" firstStartedPulling="2026-03-20 08:28:55.960619976 +0000 UTC m=+361.177520301" lastFinishedPulling="2026-03-20 08:28:59.430523988 +0000 UTC m=+364.647424303" observedRunningTime="2026-03-20 08:29:00.07852008 +0000 UTC m=+365.295420405" watchObservedRunningTime="2026-03-20 08:29:02.707823341 +0000 UTC m=+367.924723666" Mar 20 08:29:02 crc kubenswrapper[4903]: I0320 08:29:02.822721 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-lhxhb" Mar 20 08:29:02 crc kubenswrapper[4903]: I0320 08:29:02.822801 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-lhxhb" Mar 20 08:29:03 crc kubenswrapper[4903]: I0320 08:29:03.105968 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-m8hxr" Mar 20 08:29:03 crc kubenswrapper[4903]: I0320 08:29:03.883812 4903 prober.go:107] "Probe failed" 
probeType="Startup" pod="openshift-marketplace/redhat-operators-lhxhb" podUID="0135d9df-1d61-42f8-9efe-0eb2c81e5a23" containerName="registry-server" probeResult="failure" output=< Mar 20 08:29:03 crc kubenswrapper[4903]: timeout: failed to connect service ":50051" within 1s Mar 20 08:29:03 crc kubenswrapper[4903]: > Mar 20 08:29:05 crc kubenswrapper[4903]: I0320 08:29:05.030088 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-bwc4v" Mar 20 08:29:05 crc kubenswrapper[4903]: I0320 08:29:05.030733 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-bwc4v" Mar 20 08:29:05 crc kubenswrapper[4903]: I0320 08:29:05.075254 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-bwc4v" Mar 20 08:29:05 crc kubenswrapper[4903]: I0320 08:29:05.126353 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-bwc4v" Mar 20 08:29:05 crc kubenswrapper[4903]: I0320 08:29:05.191599 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-k8gfb" Mar 20 08:29:05 crc kubenswrapper[4903]: I0320 08:29:05.191654 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-k8gfb" Mar 20 08:29:05 crc kubenswrapper[4903]: I0320 08:29:05.230666 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-k8gfb" Mar 20 08:29:06 crc kubenswrapper[4903]: I0320 08:29:06.149998 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-k8gfb" Mar 20 08:29:12 crc kubenswrapper[4903]: I0320 08:29:12.891012 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-lhxhb" Mar 20 08:29:12 crc kubenswrapper[4903]: I0320 08:29:12.966926 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-lhxhb" Mar 20 08:29:13 crc kubenswrapper[4903]: I0320 08:29:13.147330 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-vcs44" Mar 20 08:29:13 crc kubenswrapper[4903]: I0320 08:29:13.226003 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-2q9m5"] Mar 20 08:29:38 crc kubenswrapper[4903]: I0320 08:29:38.270827 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-2q9m5" podUID="5778224c-9b34-45c0-9812-122b95cef431" containerName="registry" containerID="cri-o://038238e01b2ec209b308c0ac95bdcbb36ad6f721503a7e8b7c5cb2f86d139703" gracePeriod=30 Mar 20 08:29:38 crc kubenswrapper[4903]: I0320 08:29:38.705279 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-2q9m5" Mar 20 08:29:38 crc kubenswrapper[4903]: I0320 08:29:38.751620 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5778224c-9b34-45c0-9812-122b95cef431-bound-sa-token\") pod \"5778224c-9b34-45c0-9812-122b95cef431\" (UID: \"5778224c-9b34-45c0-9812-122b95cef431\") " Mar 20 08:29:38 crc kubenswrapper[4903]: I0320 08:29:38.753316 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/5778224c-9b34-45c0-9812-122b95cef431-installation-pull-secrets\") pod \"5778224c-9b34-45c0-9812-122b95cef431\" (UID: \"5778224c-9b34-45c0-9812-122b95cef431\") " Mar 20 08:29:38 crc kubenswrapper[4903]: I0320 08:29:38.753405 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/5778224c-9b34-45c0-9812-122b95cef431-registry-certificates\") pod \"5778224c-9b34-45c0-9812-122b95cef431\" (UID: \"5778224c-9b34-45c0-9812-122b95cef431\") " Mar 20 08:29:38 crc kubenswrapper[4903]: I0320 08:29:38.753675 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"5778224c-9b34-45c0-9812-122b95cef431\" (UID: \"5778224c-9b34-45c0-9812-122b95cef431\") " Mar 20 08:29:38 crc kubenswrapper[4903]: I0320 08:29:38.753724 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5778224c-9b34-45c0-9812-122b95cef431-registry-tls\") pod \"5778224c-9b34-45c0-9812-122b95cef431\" (UID: \"5778224c-9b34-45c0-9812-122b95cef431\") " Mar 20 08:29:38 crc kubenswrapper[4903]: I0320 08:29:38.753794 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zvd5x\" (UniqueName: \"kubernetes.io/projected/5778224c-9b34-45c0-9812-122b95cef431-kube-api-access-zvd5x\") pod \"5778224c-9b34-45c0-9812-122b95cef431\" (UID: \"5778224c-9b34-45c0-9812-122b95cef431\") " Mar 20 08:29:38 crc kubenswrapper[4903]: I0320 08:29:38.753889 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/5778224c-9b34-45c0-9812-122b95cef431-ca-trust-extracted\") pod \"5778224c-9b34-45c0-9812-122b95cef431\" (UID: \"5778224c-9b34-45c0-9812-122b95cef431\") " Mar 20 08:29:38 crc kubenswrapper[4903]: I0320 08:29:38.753996 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5778224c-9b34-45c0-9812-122b95cef431-trusted-ca\") pod \"5778224c-9b34-45c0-9812-122b95cef431\" (UID: \"5778224c-9b34-45c0-9812-122b95cef431\") " Mar 20 08:29:38 crc kubenswrapper[4903]: I0320 08:29:38.754953 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5778224c-9b34-45c0-9812-122b95cef431-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "5778224c-9b34-45c0-9812-122b95cef431" (UID: "5778224c-9b34-45c0-9812-122b95cef431"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:29:38 crc kubenswrapper[4903]: I0320 08:29:38.755024 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5778224c-9b34-45c0-9812-122b95cef431-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "5778224c-9b34-45c0-9812-122b95cef431" (UID: "5778224c-9b34-45c0-9812-122b95cef431"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:29:38 crc kubenswrapper[4903]: I0320 08:29:38.761490 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5778224c-9b34-45c0-9812-122b95cef431-kube-api-access-zvd5x" (OuterVolumeSpecName: "kube-api-access-zvd5x") pod "5778224c-9b34-45c0-9812-122b95cef431" (UID: "5778224c-9b34-45c0-9812-122b95cef431"). InnerVolumeSpecName "kube-api-access-zvd5x". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:29:38 crc kubenswrapper[4903]: I0320 08:29:38.761864 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5778224c-9b34-45c0-9812-122b95cef431-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "5778224c-9b34-45c0-9812-122b95cef431" (UID: "5778224c-9b34-45c0-9812-122b95cef431"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:29:38 crc kubenswrapper[4903]: I0320 08:29:38.762173 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5778224c-9b34-45c0-9812-122b95cef431-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "5778224c-9b34-45c0-9812-122b95cef431" (UID: "5778224c-9b34-45c0-9812-122b95cef431"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:29:38 crc kubenswrapper[4903]: I0320 08:29:38.763101 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5778224c-9b34-45c0-9812-122b95cef431-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "5778224c-9b34-45c0-9812-122b95cef431" (UID: "5778224c-9b34-45c0-9812-122b95cef431"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:29:38 crc kubenswrapper[4903]: I0320 08:29:38.775194 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "5778224c-9b34-45c0-9812-122b95cef431" (UID: "5778224c-9b34-45c0-9812-122b95cef431"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 20 08:29:38 crc kubenswrapper[4903]: I0320 08:29:38.784130 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5778224c-9b34-45c0-9812-122b95cef431-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "5778224c-9b34-45c0-9812-122b95cef431" (UID: "5778224c-9b34-45c0-9812-122b95cef431"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 08:29:38 crc kubenswrapper[4903]: I0320 08:29:38.856576 4903 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5778224c-9b34-45c0-9812-122b95cef431-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 20 08:29:38 crc kubenswrapper[4903]: I0320 08:29:38.856749 4903 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/5778224c-9b34-45c0-9812-122b95cef431-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Mar 20 08:29:38 crc kubenswrapper[4903]: I0320 08:29:38.856768 4903 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/5778224c-9b34-45c0-9812-122b95cef431-registry-certificates\") on node \"crc\" DevicePath \"\"" Mar 20 08:29:38 crc kubenswrapper[4903]: I0320 08:29:38.856783 4903 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5778224c-9b34-45c0-9812-122b95cef431-registry-tls\") on node \"crc\" DevicePath \"\"" Mar 20 08:29:38 crc kubenswrapper[4903]: I0320 08:29:38.856798 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zvd5x\" (UniqueName: \"kubernetes.io/projected/5778224c-9b34-45c0-9812-122b95cef431-kube-api-access-zvd5x\") on node \"crc\" DevicePath \"\"" Mar 20 08:29:38 crc kubenswrapper[4903]: I0320 08:29:38.856812 4903 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/5778224c-9b34-45c0-9812-122b95cef431-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 20 08:29:38 crc kubenswrapper[4903]: I0320 08:29:38.856823 4903 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5778224c-9b34-45c0-9812-122b95cef431-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 20 08:29:39 crc kubenswrapper[4903]: I0320 08:29:39.294672 4903 generic.go:334] "Generic (PLEG): container finished" podID="5778224c-9b34-45c0-9812-122b95cef431" containerID="038238e01b2ec209b308c0ac95bdcbb36ad6f721503a7e8b7c5cb2f86d139703" exitCode=0 Mar 20 08:29:39 crc kubenswrapper[4903]: I0320 08:29:39.294773 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-2q9m5" Mar 20 08:29:39 crc kubenswrapper[4903]: I0320 08:29:39.294757 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-2q9m5" event={"ID":"5778224c-9b34-45c0-9812-122b95cef431","Type":"ContainerDied","Data":"038238e01b2ec209b308c0ac95bdcbb36ad6f721503a7e8b7c5cb2f86d139703"} Mar 20 08:29:39 crc kubenswrapper[4903]: I0320 08:29:39.295021 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-2q9m5" event={"ID":"5778224c-9b34-45c0-9812-122b95cef431","Type":"ContainerDied","Data":"cba04f9c202f29f8045f83f8d6083d98104dfccd87e37c8517e2277fbbc59e2b"} Mar 20 08:29:39 crc kubenswrapper[4903]: I0320 08:29:39.295100 4903 scope.go:117] "RemoveContainer" containerID="038238e01b2ec209b308c0ac95bdcbb36ad6f721503a7e8b7c5cb2f86d139703" Mar 20 08:29:39 crc kubenswrapper[4903]: I0320 08:29:39.323023 4903 scope.go:117] "RemoveContainer" containerID="038238e01b2ec209b308c0ac95bdcbb36ad6f721503a7e8b7c5cb2f86d139703" Mar 20 08:29:39 crc kubenswrapper[4903]: E0320 08:29:39.324205 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"038238e01b2ec209b308c0ac95bdcbb36ad6f721503a7e8b7c5cb2f86d139703\": container with ID starting with 038238e01b2ec209b308c0ac95bdcbb36ad6f721503a7e8b7c5cb2f86d139703 not found: ID does not exist" containerID="038238e01b2ec209b308c0ac95bdcbb36ad6f721503a7e8b7c5cb2f86d139703" Mar 20 08:29:39 crc kubenswrapper[4903]: I0320 08:29:39.324289 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"038238e01b2ec209b308c0ac95bdcbb36ad6f721503a7e8b7c5cb2f86d139703"} err="failed to get container status \"038238e01b2ec209b308c0ac95bdcbb36ad6f721503a7e8b7c5cb2f86d139703\": rpc error: code = NotFound desc = could not find container \"038238e01b2ec209b308c0ac95bdcbb36ad6f721503a7e8b7c5cb2f86d139703\": container with ID starting with 038238e01b2ec209b308c0ac95bdcbb36ad6f721503a7e8b7c5cb2f86d139703 not found: ID does not exist" Mar 20 08:29:39 crc kubenswrapper[4903]: I0320 08:29:39.354399 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-2q9m5"] Mar 20 08:29:39 crc kubenswrapper[4903]: I0320 08:29:39.359792 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-2q9m5"] Mar 20 08:29:39 crc kubenswrapper[4903]: I0320 08:29:39.506226 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5778224c-9b34-45c0-9812-122b95cef431" path="/var/lib/kubelet/pods/5778224c-9b34-45c0-9812-122b95cef431/volumes" Mar 20 08:30:00 crc kubenswrapper[4903]: I0320 08:30:00.151160 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566590-kcdfk"] Mar 20 08:30:00 crc kubenswrapper[4903]: E0320 08:30:00.152427 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5778224c-9b34-45c0-9812-122b95cef431" containerName="registry" Mar 20 08:30:00 crc kubenswrapper[4903]: I0320 08:30:00.152450 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="5778224c-9b34-45c0-9812-122b95cef431" containerName="registry" Mar 20 08:30:00 crc kubenswrapper[4903]: I0320 08:30:00.152639 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="5778224c-9b34-45c0-9812-122b95cef431" containerName="registry" Mar 20 
08:30:00 crc kubenswrapper[4903]: I0320 08:30:00.153360 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566590-kcdfk" Mar 20 08:30:00 crc kubenswrapper[4903]: I0320 08:30:00.155953 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566590-ws8nx"] Mar 20 08:30:00 crc kubenswrapper[4903]: I0320 08:30:00.155977 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 20 08:30:00 crc kubenswrapper[4903]: I0320 08:30:00.156936 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566590-ws8nx" Mar 20 08:30:00 crc kubenswrapper[4903]: I0320 08:30:00.160887 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 20 08:30:00 crc kubenswrapper[4903]: I0320 08:30:00.168557 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 08:30:00 crc kubenswrapper[4903]: I0320 08:30:00.168990 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 08:30:00 crc kubenswrapper[4903]: I0320 08:30:00.169166 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-smc2q" Mar 20 08:30:00 crc kubenswrapper[4903]: I0320 08:30:00.170404 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566590-ws8nx"] Mar 20 08:30:00 crc kubenswrapper[4903]: I0320 08:30:00.185379 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566590-kcdfk"] Mar 20 08:30:00 crc kubenswrapper[4903]: I0320 08:30:00.313429 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f5kjn\" (UniqueName: \"kubernetes.io/projected/3e7d925a-a144-49cd-a061-95a3041145b9-kube-api-access-f5kjn\") pod \"collect-profiles-29566590-kcdfk\" (UID: \"3e7d925a-a144-49cd-a061-95a3041145b9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566590-kcdfk" Mar 20 08:30:00 crc kubenswrapper[4903]: I0320 08:30:00.313587 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8qzb9\" (UniqueName: \"kubernetes.io/projected/5fd43bc1-7eee-4b42-b9bf-75765d8f28b5-kube-api-access-8qzb9\") pod \"auto-csr-approver-29566590-ws8nx\" (UID: \"5fd43bc1-7eee-4b42-b9bf-75765d8f28b5\") " pod="openshift-infra/auto-csr-approver-29566590-ws8nx" Mar 20 08:30:00 crc kubenswrapper[4903]: I0320 08:30:00.313735 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3e7d925a-a144-49cd-a061-95a3041145b9-secret-volume\") pod \"collect-profiles-29566590-kcdfk\" (UID: \"3e7d925a-a144-49cd-a061-95a3041145b9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566590-kcdfk" Mar 20 08:30:00 crc kubenswrapper[4903]: I0320 08:30:00.313789 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3e7d925a-a144-49cd-a061-95a3041145b9-config-volume\") pod \"collect-profiles-29566590-kcdfk\" (UID: 
\"3e7d925a-a144-49cd-a061-95a3041145b9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566590-kcdfk" Mar 20 08:30:00 crc kubenswrapper[4903]: I0320 08:30:00.415541 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8qzb9\" (UniqueName: \"kubernetes.io/projected/5fd43bc1-7eee-4b42-b9bf-75765d8f28b5-kube-api-access-8qzb9\") pod \"auto-csr-approver-29566590-ws8nx\" (UID: \"5fd43bc1-7eee-4b42-b9bf-75765d8f28b5\") " pod="openshift-infra/auto-csr-approver-29566590-ws8nx" Mar 20 08:30:00 crc kubenswrapper[4903]: I0320 08:30:00.415666 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3e7d925a-a144-49cd-a061-95a3041145b9-secret-volume\") pod \"collect-profiles-29566590-kcdfk\" (UID: \"3e7d925a-a144-49cd-a061-95a3041145b9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566590-kcdfk" Mar 20 08:30:00 crc kubenswrapper[4903]: I0320 08:30:00.415708 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3e7d925a-a144-49cd-a061-95a3041145b9-config-volume\") pod \"collect-profiles-29566590-kcdfk\" (UID: \"3e7d925a-a144-49cd-a061-95a3041145b9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566590-kcdfk" Mar 20 08:30:00 crc kubenswrapper[4903]: I0320 08:30:00.415809 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f5kjn\" (UniqueName: \"kubernetes.io/projected/3e7d925a-a144-49cd-a061-95a3041145b9-kube-api-access-f5kjn\") pod \"collect-profiles-29566590-kcdfk\" (UID: \"3e7d925a-a144-49cd-a061-95a3041145b9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566590-kcdfk" Mar 20 08:30:00 crc kubenswrapper[4903]: I0320 08:30:00.417739 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3e7d925a-a144-49cd-a061-95a3041145b9-config-volume\") pod \"collect-profiles-29566590-kcdfk\" (UID: \"3e7d925a-a144-49cd-a061-95a3041145b9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566590-kcdfk" Mar 20 08:30:00 crc kubenswrapper[4903]: I0320 08:30:00.424726 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3e7d925a-a144-49cd-a061-95a3041145b9-secret-volume\") pod \"collect-profiles-29566590-kcdfk\" (UID: \"3e7d925a-a144-49cd-a061-95a3041145b9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566590-kcdfk" Mar 20 08:30:00 crc kubenswrapper[4903]: I0320 08:30:00.443637 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8qzb9\" (UniqueName: \"kubernetes.io/projected/5fd43bc1-7eee-4b42-b9bf-75765d8f28b5-kube-api-access-8qzb9\") pod \"auto-csr-approver-29566590-ws8nx\" (UID: \"5fd43bc1-7eee-4b42-b9bf-75765d8f28b5\") " pod="openshift-infra/auto-csr-approver-29566590-ws8nx" Mar 20 08:30:00 crc kubenswrapper[4903]: I0320 08:30:00.446329 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f5kjn\" (UniqueName: \"kubernetes.io/projected/3e7d925a-a144-49cd-a061-95a3041145b9-kube-api-access-f5kjn\") pod \"collect-profiles-29566590-kcdfk\" (UID: \"3e7d925a-a144-49cd-a061-95a3041145b9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566590-kcdfk" Mar 20 08:30:00 crc kubenswrapper[4903]: I0320 08:30:00.474354 4903 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566590-kcdfk" Mar 20 08:30:00 crc kubenswrapper[4903]: I0320 08:30:00.484683 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566590-ws8nx" Mar 20 08:30:00 crc kubenswrapper[4903]: I0320 08:30:00.744492 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566590-kcdfk"] Mar 20 08:30:01 crc kubenswrapper[4903]: I0320 08:30:00.817577 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566590-ws8nx"] Mar 20 08:30:01 crc kubenswrapper[4903]: W0320 08:30:00.828240 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5fd43bc1_7eee_4b42_b9bf_75765d8f28b5.slice/crio-b2bc5019e686ddd9ec25c435a0034ef3623806ec355fb08aab125d84475f8158 WatchSource:0}: Error finding container b2bc5019e686ddd9ec25c435a0034ef3623806ec355fb08aab125d84475f8158: Status 404 returned error can't find the container with id b2bc5019e686ddd9ec25c435a0034ef3623806ec355fb08aab125d84475f8158 Mar 20 08:30:01 crc kubenswrapper[4903]: I0320 08:30:01.478401 4903 generic.go:334] "Generic (PLEG): container finished" podID="3e7d925a-a144-49cd-a061-95a3041145b9" containerID="8025abeb1d5e5fe5577466b86a2e7817694337715c2c3c53390deaa90e298ff1" exitCode=0 Mar 20 08:30:01 crc kubenswrapper[4903]: I0320 08:30:01.478535 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566590-kcdfk" event={"ID":"3e7d925a-a144-49cd-a061-95a3041145b9","Type":"ContainerDied","Data":"8025abeb1d5e5fe5577466b86a2e7817694337715c2c3c53390deaa90e298ff1"} Mar 20 08:30:01 crc kubenswrapper[4903]: I0320 08:30:01.478583 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566590-kcdfk" event={"ID":"3e7d925a-a144-49cd-a061-95a3041145b9","Type":"ContainerStarted","Data":"806ddf4e8127c8efe3b9fa28e0e23db20716995ba242b3aec14240655ec37c26"} Mar 20 08:30:01 crc kubenswrapper[4903]: I0320 08:30:01.481118 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566590-ws8nx" event={"ID":"5fd43bc1-7eee-4b42-b9bf-75765d8f28b5","Type":"ContainerStarted","Data":"b2bc5019e686ddd9ec25c435a0034ef3623806ec355fb08aab125d84475f8158"} Mar 20 08:30:02 crc kubenswrapper[4903]: I0320 08:30:02.493491 4903 generic.go:334] "Generic (PLEG): container finished" podID="5fd43bc1-7eee-4b42-b9bf-75765d8f28b5" containerID="6bcc71e401a297d75b4f6b8b25efdf4d606f0e2200ac1e8e2b2a19909a6c66d9" exitCode=0 Mar 20 08:30:02 crc kubenswrapper[4903]: I0320 08:30:02.493600 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566590-ws8nx" event={"ID":"5fd43bc1-7eee-4b42-b9bf-75765d8f28b5","Type":"ContainerDied","Data":"6bcc71e401a297d75b4f6b8b25efdf4d606f0e2200ac1e8e2b2a19909a6c66d9"} Mar 20 08:30:02 crc kubenswrapper[4903]: I0320 08:30:02.855658 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566590-kcdfk" Mar 20 08:30:02 crc kubenswrapper[4903]: I0320 08:30:02.959996 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f5kjn\" (UniqueName: \"kubernetes.io/projected/3e7d925a-a144-49cd-a061-95a3041145b9-kube-api-access-f5kjn\") pod \"3e7d925a-a144-49cd-a061-95a3041145b9\" (UID: \"3e7d925a-a144-49cd-a061-95a3041145b9\") " Mar 20 08:30:02 crc kubenswrapper[4903]: I0320 08:30:02.960201 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3e7d925a-a144-49cd-a061-95a3041145b9-config-volume\") pod \"3e7d925a-a144-49cd-a061-95a3041145b9\" (UID: \"3e7d925a-a144-49cd-a061-95a3041145b9\") " Mar 20 08:30:02 crc kubenswrapper[4903]: I0320 08:30:02.960288 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3e7d925a-a144-49cd-a061-95a3041145b9-secret-volume\") pod \"3e7d925a-a144-49cd-a061-95a3041145b9\" (UID: \"3e7d925a-a144-49cd-a061-95a3041145b9\") " Mar 20 08:30:02 crc kubenswrapper[4903]: I0320 08:30:02.961404 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3e7d925a-a144-49cd-a061-95a3041145b9-config-volume" (OuterVolumeSpecName: "config-volume") pod "3e7d925a-a144-49cd-a061-95a3041145b9" (UID: "3e7d925a-a144-49cd-a061-95a3041145b9"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:30:02 crc kubenswrapper[4903]: I0320 08:30:02.970164 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e7d925a-a144-49cd-a061-95a3041145b9-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "3e7d925a-a144-49cd-a061-95a3041145b9" (UID: "3e7d925a-a144-49cd-a061-95a3041145b9"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:30:02 crc kubenswrapper[4903]: I0320 08:30:02.970171 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e7d925a-a144-49cd-a061-95a3041145b9-kube-api-access-f5kjn" (OuterVolumeSpecName: "kube-api-access-f5kjn") pod "3e7d925a-a144-49cd-a061-95a3041145b9" (UID: "3e7d925a-a144-49cd-a061-95a3041145b9"). InnerVolumeSpecName "kube-api-access-f5kjn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:30:03 crc kubenswrapper[4903]: I0320 08:30:03.062665 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f5kjn\" (UniqueName: \"kubernetes.io/projected/3e7d925a-a144-49cd-a061-95a3041145b9-kube-api-access-f5kjn\") on node \"crc\" DevicePath \"\"" Mar 20 08:30:03 crc kubenswrapper[4903]: I0320 08:30:03.062787 4903 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3e7d925a-a144-49cd-a061-95a3041145b9-config-volume\") on node \"crc\" DevicePath \"\"" Mar 20 08:30:03 crc kubenswrapper[4903]: I0320 08:30:03.062815 4903 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3e7d925a-a144-49cd-a061-95a3041145b9-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 20 08:30:03 crc kubenswrapper[4903]: I0320 08:30:03.508408 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566590-kcdfk" event={"ID":"3e7d925a-a144-49cd-a061-95a3041145b9","Type":"ContainerDied","Data":"806ddf4e8127c8efe3b9fa28e0e23db20716995ba242b3aec14240655ec37c26"} Mar 20 08:30:03 crc kubenswrapper[4903]: I0320 08:30:03.508490 4903 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="806ddf4e8127c8efe3b9fa28e0e23db20716995ba242b3aec14240655ec37c26" Mar 20 08:30:03 crc kubenswrapper[4903]: I0320 08:30:03.508685 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566590-kcdfk" Mar 20 08:30:03 crc kubenswrapper[4903]: I0320 08:30:03.901149 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566590-ws8nx" Mar 20 08:30:04 crc kubenswrapper[4903]: I0320 08:30:04.015527 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8qzb9\" (UniqueName: \"kubernetes.io/projected/5fd43bc1-7eee-4b42-b9bf-75765d8f28b5-kube-api-access-8qzb9\") pod \"5fd43bc1-7eee-4b42-b9bf-75765d8f28b5\" (UID: \"5fd43bc1-7eee-4b42-b9bf-75765d8f28b5\") " Mar 20 08:30:04 crc kubenswrapper[4903]: I0320 08:30:04.021749 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fd43bc1-7eee-4b42-b9bf-75765d8f28b5-kube-api-access-8qzb9" (OuterVolumeSpecName: "kube-api-access-8qzb9") pod "5fd43bc1-7eee-4b42-b9bf-75765d8f28b5" (UID: "5fd43bc1-7eee-4b42-b9bf-75765d8f28b5"). InnerVolumeSpecName "kube-api-access-8qzb9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:30:04 crc kubenswrapper[4903]: I0320 08:30:04.117243 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8qzb9\" (UniqueName: \"kubernetes.io/projected/5fd43bc1-7eee-4b42-b9bf-75765d8f28b5-kube-api-access-8qzb9\") on node \"crc\" DevicePath \"\"" Mar 20 08:30:04 crc kubenswrapper[4903]: I0320 08:30:04.520109 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566590-ws8nx" event={"ID":"5fd43bc1-7eee-4b42-b9bf-75765d8f28b5","Type":"ContainerDied","Data":"b2bc5019e686ddd9ec25c435a0034ef3623806ec355fb08aab125d84475f8158"} Mar 20 08:30:04 crc kubenswrapper[4903]: I0320 08:30:04.520203 4903 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b2bc5019e686ddd9ec25c435a0034ef3623806ec355fb08aab125d84475f8158" Mar 20 08:30:04 crc kubenswrapper[4903]: I0320 08:30:04.520208 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566590-ws8nx" Mar 20 08:30:50 crc kubenswrapper[4903]: I0320 08:30:50.833578 4903 patch_prober.go:28] interesting pod/machine-config-daemon-2ndsj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 08:30:50 crc kubenswrapper[4903]: I0320 08:30:50.834448 4903 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2ndsj" podUID="0e67af70-4211-4077-8f2b-0a00b8069e5a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 08:31:20 crc kubenswrapper[4903]: I0320 08:31:20.834691 4903 patch_prober.go:28] interesting pod/machine-config-daemon-2ndsj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 08:31:20 crc kubenswrapper[4903]: I0320 08:31:20.835672 4903 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2ndsj" podUID="0e67af70-4211-4077-8f2b-0a00b8069e5a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 08:31:50 crc kubenswrapper[4903]: I0320 08:31:50.834552 4903 patch_prober.go:28] interesting pod/machine-config-daemon-2ndsj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 08:31:50 crc kubenswrapper[4903]: I0320 08:31:50.835569 4903 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2ndsj" podUID="0e67af70-4211-4077-8f2b-0a00b8069e5a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 08:31:50 crc kubenswrapper[4903]: I0320 08:31:50.835654 4903 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2ndsj" Mar 20 08:31:50 crc 
kubenswrapper[4903]: I0320 08:31:50.836641 4903 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6fdfb30d5b87cf452b7cc070a07a38b8d79152d6ba5ebf1ea0c265e7f4d3d787"} pod="openshift-machine-config-operator/machine-config-daemon-2ndsj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 08:31:50 crc kubenswrapper[4903]: I0320 08:31:50.836727 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2ndsj" podUID="0e67af70-4211-4077-8f2b-0a00b8069e5a" containerName="machine-config-daemon" containerID="cri-o://6fdfb30d5b87cf452b7cc070a07a38b8d79152d6ba5ebf1ea0c265e7f4d3d787" gracePeriod=600 Mar 20 08:31:51 crc kubenswrapper[4903]: I0320 08:31:51.354197 4903 generic.go:334] "Generic (PLEG): container finished" podID="0e67af70-4211-4077-8f2b-0a00b8069e5a" containerID="6fdfb30d5b87cf452b7cc070a07a38b8d79152d6ba5ebf1ea0c265e7f4d3d787" exitCode=0 Mar 20 08:31:51 crc kubenswrapper[4903]: I0320 08:31:51.354268 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2ndsj" event={"ID":"0e67af70-4211-4077-8f2b-0a00b8069e5a","Type":"ContainerDied","Data":"6fdfb30d5b87cf452b7cc070a07a38b8d79152d6ba5ebf1ea0c265e7f4d3d787"} Mar 20 08:31:51 crc kubenswrapper[4903]: I0320 08:31:51.354717 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2ndsj" event={"ID":"0e67af70-4211-4077-8f2b-0a00b8069e5a","Type":"ContainerStarted","Data":"d4e808adab9ca608f4234405ae0edc403a12d33d968b79be08acd008cc246023"} Mar 20 08:31:51 crc kubenswrapper[4903]: I0320 08:31:51.354742 4903 scope.go:117] "RemoveContainer" containerID="97230b86d8abf05de23db14ac7a3f5d775800a1072bcb8f41fc0bb22c84b0942" Mar 20 08:31:56 crc kubenswrapper[4903]: I0320 08:31:56.318849 4903 scope.go:117] "RemoveContainer" containerID="bc6dd320320c485629531c527a64044601a8fab7d36498f5627d22725d511d21" Mar 20 08:32:00 crc kubenswrapper[4903]: I0320 08:32:00.151825 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566592-jvzjg"] Mar 20 08:32:00 crc kubenswrapper[4903]: E0320 08:32:00.152725 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5fd43bc1-7eee-4b42-b9bf-75765d8f28b5" containerName="oc" Mar 20 08:32:00 crc kubenswrapper[4903]: I0320 08:32:00.152752 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="5fd43bc1-7eee-4b42-b9bf-75765d8f28b5" containerName="oc" Mar 20 08:32:00 crc kubenswrapper[4903]: E0320 08:32:00.152787 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e7d925a-a144-49cd-a061-95a3041145b9" containerName="collect-profiles" Mar 20 08:32:00 crc kubenswrapper[4903]: I0320 08:32:00.152799 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e7d925a-a144-49cd-a061-95a3041145b9" containerName="collect-profiles" Mar 20 08:32:00 crc kubenswrapper[4903]: I0320 08:32:00.152960 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e7d925a-a144-49cd-a061-95a3041145b9" containerName="collect-profiles" Mar 20 08:32:00 crc kubenswrapper[4903]: I0320 08:32:00.152985 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="5fd43bc1-7eee-4b42-b9bf-75765d8f28b5" containerName="oc" Mar 20 08:32:00 crc kubenswrapper[4903]: I0320 08:32:00.153579 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566592-jvzjg" Mar 20 08:32:00 crc kubenswrapper[4903]: I0320 08:32:00.156203 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 08:32:00 crc kubenswrapper[4903]: I0320 08:32:00.156803 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 08:32:00 crc kubenswrapper[4903]: I0320 08:32:00.160206 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-smc2q" Mar 20 08:32:00 crc kubenswrapper[4903]: I0320 08:32:00.161561 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566592-jvzjg"] Mar 20 08:32:00 crc kubenswrapper[4903]: I0320 08:32:00.314277 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gxdgp\" (UniqueName: \"kubernetes.io/projected/a4fb82db-1d81-491c-8fb7-8a6072b3335a-kube-api-access-gxdgp\") pod \"auto-csr-approver-29566592-jvzjg\" (UID: \"a4fb82db-1d81-491c-8fb7-8a6072b3335a\") " pod="openshift-infra/auto-csr-approver-29566592-jvzjg" Mar 20 08:32:00 crc kubenswrapper[4903]: I0320 08:32:00.416493 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gxdgp\" (UniqueName: \"kubernetes.io/projected/a4fb82db-1d81-491c-8fb7-8a6072b3335a-kube-api-access-gxdgp\") pod \"auto-csr-approver-29566592-jvzjg\" (UID: \"a4fb82db-1d81-491c-8fb7-8a6072b3335a\") " pod="openshift-infra/auto-csr-approver-29566592-jvzjg" Mar 20 08:32:00 crc kubenswrapper[4903]: I0320 08:32:00.453558 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gxdgp\" (UniqueName: \"kubernetes.io/projected/a4fb82db-1d81-491c-8fb7-8a6072b3335a-kube-api-access-gxdgp\") pod \"auto-csr-approver-29566592-jvzjg\" (UID: \"a4fb82db-1d81-491c-8fb7-8a6072b3335a\") " pod="openshift-infra/auto-csr-approver-29566592-jvzjg" Mar 20 08:32:00 crc kubenswrapper[4903]: I0320 08:32:00.486986 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566592-jvzjg" Mar 20 08:32:00 crc kubenswrapper[4903]: I0320 08:32:00.776094 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566592-jvzjg"] Mar 20 08:32:00 crc kubenswrapper[4903]: I0320 08:32:00.788258 4903 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 08:32:01 crc kubenswrapper[4903]: I0320 08:32:01.439674 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566592-jvzjg" event={"ID":"a4fb82db-1d81-491c-8fb7-8a6072b3335a","Type":"ContainerStarted","Data":"a76886d52f338412d60e72a5d93f4a85b04336cbba64201ca8ecfb1fc809a923"} Mar 20 08:32:02 crc kubenswrapper[4903]: I0320 08:32:02.451850 4903 generic.go:334] "Generic (PLEG): container finished" podID="a4fb82db-1d81-491c-8fb7-8a6072b3335a" containerID="096b2f20a01d60e5013375f69c38f59a73a6f708def95ccb0592a5e2ce5f7afd" exitCode=0 Mar 20 08:32:02 crc kubenswrapper[4903]: I0320 08:32:02.451918 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566592-jvzjg" event={"ID":"a4fb82db-1d81-491c-8fb7-8a6072b3335a","Type":"ContainerDied","Data":"096b2f20a01d60e5013375f69c38f59a73a6f708def95ccb0592a5e2ce5f7afd"} Mar 20 08:32:03 crc kubenswrapper[4903]: I0320 08:32:03.776690 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566592-jvzjg" Mar 20 08:32:03 crc kubenswrapper[4903]: I0320 08:32:03.869989 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gxdgp\" (UniqueName: \"kubernetes.io/projected/a4fb82db-1d81-491c-8fb7-8a6072b3335a-kube-api-access-gxdgp\") pod \"a4fb82db-1d81-491c-8fb7-8a6072b3335a\" (UID: \"a4fb82db-1d81-491c-8fb7-8a6072b3335a\") " Mar 20 08:32:03 crc kubenswrapper[4903]: I0320 08:32:03.879649 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4fb82db-1d81-491c-8fb7-8a6072b3335a-kube-api-access-gxdgp" (OuterVolumeSpecName: "kube-api-access-gxdgp") pod "a4fb82db-1d81-491c-8fb7-8a6072b3335a" (UID: "a4fb82db-1d81-491c-8fb7-8a6072b3335a"). InnerVolumeSpecName "kube-api-access-gxdgp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:32:03 crc kubenswrapper[4903]: I0320 08:32:03.972335 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gxdgp\" (UniqueName: \"kubernetes.io/projected/a4fb82db-1d81-491c-8fb7-8a6072b3335a-kube-api-access-gxdgp\") on node \"crc\" DevicePath \"\"" Mar 20 08:32:04 crc kubenswrapper[4903]: I0320 08:32:04.471255 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566592-jvzjg" event={"ID":"a4fb82db-1d81-491c-8fb7-8a6072b3335a","Type":"ContainerDied","Data":"a76886d52f338412d60e72a5d93f4a85b04336cbba64201ca8ecfb1fc809a923"} Mar 20 08:32:04 crc kubenswrapper[4903]: I0320 08:32:04.471325 4903 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a76886d52f338412d60e72a5d93f4a85b04336cbba64201ca8ecfb1fc809a923" Mar 20 08:32:04 crc kubenswrapper[4903]: I0320 08:32:04.471393 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566592-jvzjg" Mar 20 08:32:04 crc kubenswrapper[4903]: I0320 08:32:04.860898 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566586-87jzv"] Mar 20 08:32:04 crc kubenswrapper[4903]: I0320 08:32:04.867778 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566586-87jzv"] Mar 20 08:32:05 crc kubenswrapper[4903]: I0320 08:32:05.501170 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a5f2446d-2562-46b0-9bdd-6d5bf42d1a7f" path="/var/lib/kubelet/pods/a5f2446d-2562-46b0-9bdd-6d5bf42d1a7f/volumes" Mar 20 08:32:56 crc kubenswrapper[4903]: I0320 08:32:56.368832 4903 scope.go:117] "RemoveContainer" containerID="781971432a2d34c14a048a7ba8a4fc1acb64225c8b94b7eebf0b09d61d19665b" Mar 20 08:34:00 crc kubenswrapper[4903]: I0320 08:34:00.152689 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566594-drppl"] Mar 20 08:34:00 crc kubenswrapper[4903]: E0320 08:34:00.154413 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4fb82db-1d81-491c-8fb7-8a6072b3335a" containerName="oc" Mar 20 08:34:00 crc kubenswrapper[4903]: I0320 08:34:00.154435 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4fb82db-1d81-491c-8fb7-8a6072b3335a" containerName="oc" Mar 20 08:34:00 crc kubenswrapper[4903]: I0320 08:34:00.154595 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4fb82db-1d81-491c-8fb7-8a6072b3335a" containerName="oc" Mar 20 08:34:00 crc kubenswrapper[4903]: I0320 08:34:00.155109 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566594-drppl" Mar 20 08:34:00 crc kubenswrapper[4903]: I0320 08:34:00.157168 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-smc2q" Mar 20 08:34:00 crc kubenswrapper[4903]: I0320 08:34:00.161138 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566594-drppl"] Mar 20 08:34:00 crc kubenswrapper[4903]: I0320 08:34:00.165241 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 08:34:00 crc kubenswrapper[4903]: I0320 08:34:00.165250 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 08:34:00 crc kubenswrapper[4903]: I0320 08:34:00.170917 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s697n\" (UniqueName: \"kubernetes.io/projected/2424a689-158b-42a5-805c-47dbf5dd3203-kube-api-access-s697n\") pod \"auto-csr-approver-29566594-drppl\" (UID: \"2424a689-158b-42a5-805c-47dbf5dd3203\") " pod="openshift-infra/auto-csr-approver-29566594-drppl" Mar 20 08:34:00 crc kubenswrapper[4903]: I0320 08:34:00.271538 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s697n\" (UniqueName: \"kubernetes.io/projected/2424a689-158b-42a5-805c-47dbf5dd3203-kube-api-access-s697n\") pod \"auto-csr-approver-29566594-drppl\" (UID: \"2424a689-158b-42a5-805c-47dbf5dd3203\") " pod="openshift-infra/auto-csr-approver-29566594-drppl" Mar 20 08:34:00 crc kubenswrapper[4903]: I0320 08:34:00.302576 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s697n\" (UniqueName: 
\"kubernetes.io/projected/2424a689-158b-42a5-805c-47dbf5dd3203-kube-api-access-s697n\") pod \"auto-csr-approver-29566594-drppl\" (UID: \"2424a689-158b-42a5-805c-47dbf5dd3203\") " pod="openshift-infra/auto-csr-approver-29566594-drppl" Mar 20 08:34:00 crc kubenswrapper[4903]: I0320 08:34:00.476628 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566594-drppl" Mar 20 08:34:00 crc kubenswrapper[4903]: I0320 08:34:00.751528 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566594-drppl"] Mar 20 08:34:01 crc kubenswrapper[4903]: I0320 08:34:01.372790 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566594-drppl" event={"ID":"2424a689-158b-42a5-805c-47dbf5dd3203","Type":"ContainerStarted","Data":"63ce07f8009cb80c48a71685a6cfec922479bb0fc36e585e7c2d5310f7762d5b"} Mar 20 08:34:02 crc kubenswrapper[4903]: I0320 08:34:02.381140 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566594-drppl" event={"ID":"2424a689-158b-42a5-805c-47dbf5dd3203","Type":"ContainerStarted","Data":"11a9fb4c401a7bb0b4717c57cc5525846cd3b28f869642be3c74e0b6a86eaf93"} Mar 20 08:34:02 crc kubenswrapper[4903]: I0320 08:34:02.411356 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29566594-drppl" podStartSLOduration=1.369158573 podStartE2EDuration="2.411321866s" podCreationTimestamp="2026-03-20 08:34:00 +0000 UTC" firstStartedPulling="2026-03-20 08:34:00.766922038 +0000 UTC m=+665.983822353" lastFinishedPulling="2026-03-20 08:34:01.809085331 +0000 UTC m=+667.025985646" observedRunningTime="2026-03-20 08:34:02.405751347 +0000 UTC m=+667.622651702" watchObservedRunningTime="2026-03-20 08:34:02.411321866 +0000 UTC m=+667.628222171" Mar 20 08:34:03 crc kubenswrapper[4903]: I0320 08:34:03.392620 4903 generic.go:334] "Generic (PLEG): container finished" podID="2424a689-158b-42a5-805c-47dbf5dd3203" containerID="11a9fb4c401a7bb0b4717c57cc5525846cd3b28f869642be3c74e0b6a86eaf93" exitCode=0 Mar 20 08:34:03 crc kubenswrapper[4903]: I0320 08:34:03.392763 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566594-drppl" event={"ID":"2424a689-158b-42a5-805c-47dbf5dd3203","Type":"ContainerDied","Data":"11a9fb4c401a7bb0b4717c57cc5525846cd3b28f869642be3c74e0b6a86eaf93"} Mar 20 08:34:04 crc kubenswrapper[4903]: I0320 08:34:04.708437 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566594-drppl" Mar 20 08:34:04 crc kubenswrapper[4903]: I0320 08:34:04.840954 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s697n\" (UniqueName: \"kubernetes.io/projected/2424a689-158b-42a5-805c-47dbf5dd3203-kube-api-access-s697n\") pod \"2424a689-158b-42a5-805c-47dbf5dd3203\" (UID: \"2424a689-158b-42a5-805c-47dbf5dd3203\") " Mar 20 08:34:04 crc kubenswrapper[4903]: I0320 08:34:04.848024 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2424a689-158b-42a5-805c-47dbf5dd3203-kube-api-access-s697n" (OuterVolumeSpecName: "kube-api-access-s697n") pod "2424a689-158b-42a5-805c-47dbf5dd3203" (UID: "2424a689-158b-42a5-805c-47dbf5dd3203"). InnerVolumeSpecName "kube-api-access-s697n". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:34:04 crc kubenswrapper[4903]: I0320 08:34:04.942827 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s697n\" (UniqueName: \"kubernetes.io/projected/2424a689-158b-42a5-805c-47dbf5dd3203-kube-api-access-s697n\") on node \"crc\" DevicePath \"\"" Mar 20 08:34:05 crc kubenswrapper[4903]: I0320 08:34:05.411071 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566594-drppl" event={"ID":"2424a689-158b-42a5-805c-47dbf5dd3203","Type":"ContainerDied","Data":"63ce07f8009cb80c48a71685a6cfec922479bb0fc36e585e7c2d5310f7762d5b"} Mar 20 08:34:05 crc kubenswrapper[4903]: I0320 08:34:05.411132 4903 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="63ce07f8009cb80c48a71685a6cfec922479bb0fc36e585e7c2d5310f7762d5b" Mar 20 08:34:05 crc kubenswrapper[4903]: I0320 08:34:05.411272 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566594-drppl" Mar 20 08:34:05 crc kubenswrapper[4903]: I0320 08:34:05.489607 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566588-4mjc5"] Mar 20 08:34:05 crc kubenswrapper[4903]: I0320 08:34:05.506924 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566588-4mjc5"] Mar 20 08:34:07 crc kubenswrapper[4903]: I0320 08:34:07.502737 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a63c6845-cfdb-46dc-8ab0-39c7d1a366d2" path="/var/lib/kubelet/pods/a63c6845-cfdb-46dc-8ab0-39c7d1a366d2/volumes" Mar 20 08:34:20 crc kubenswrapper[4903]: I0320 08:34:20.833808 4903 patch_prober.go:28] interesting pod/machine-config-daemon-2ndsj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 08:34:20 crc kubenswrapper[4903]: I0320 08:34:20.835229 4903 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2ndsj" podUID="0e67af70-4211-4077-8f2b-0a00b8069e5a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 08:34:50 crc kubenswrapper[4903]: I0320 08:34:50.833881 4903 patch_prober.go:28] interesting pod/machine-config-daemon-2ndsj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 08:34:50 crc kubenswrapper[4903]: I0320 08:34:50.834701 4903 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2ndsj" podUID="0e67af70-4211-4077-8f2b-0a00b8069e5a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 08:34:56 crc kubenswrapper[4903]: I0320 08:34:56.434172 4903 scope.go:117] "RemoveContainer" containerID="3a6045dca4e996436358344c9adc794f5c5a7b271aa6124aca1f8627ecba66d4" Mar 20 08:35:20 crc kubenswrapper[4903]: I0320 08:35:20.834023 4903 patch_prober.go:28] interesting pod/machine-config-daemon-2ndsj container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 08:35:20 crc kubenswrapper[4903]: I0320 08:35:20.835023 4903 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2ndsj" podUID="0e67af70-4211-4077-8f2b-0a00b8069e5a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 08:35:20 crc kubenswrapper[4903]: I0320 08:35:20.835129 4903 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2ndsj" Mar 20 08:35:20 crc kubenswrapper[4903]: I0320 08:35:20.836105 4903 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d4e808adab9ca608f4234405ae0edc403a12d33d968b79be08acd008cc246023"} pod="openshift-machine-config-operator/machine-config-daemon-2ndsj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 08:35:20 crc kubenswrapper[4903]: I0320 08:35:20.836203 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2ndsj" podUID="0e67af70-4211-4077-8f2b-0a00b8069e5a" containerName="machine-config-daemon" containerID="cri-o://d4e808adab9ca608f4234405ae0edc403a12d33d968b79be08acd008cc246023" gracePeriod=600 Mar 20 08:35:21 crc kubenswrapper[4903]: I0320 08:35:21.005195 4903 generic.go:334] "Generic (PLEG): container finished" podID="0e67af70-4211-4077-8f2b-0a00b8069e5a" containerID="d4e808adab9ca608f4234405ae0edc403a12d33d968b79be08acd008cc246023" exitCode=0 Mar 20 08:35:21 crc kubenswrapper[4903]: I0320 08:35:21.005287 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2ndsj" event={"ID":"0e67af70-4211-4077-8f2b-0a00b8069e5a","Type":"ContainerDied","Data":"d4e808adab9ca608f4234405ae0edc403a12d33d968b79be08acd008cc246023"} Mar 20 08:35:21 crc kubenswrapper[4903]: I0320 08:35:21.005353 4903 scope.go:117] "RemoveContainer" containerID="6fdfb30d5b87cf452b7cc070a07a38b8d79152d6ba5ebf1ea0c265e7f4d3d787" Mar 20 08:35:22 crc kubenswrapper[4903]: I0320 08:35:22.016190 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2ndsj" event={"ID":"0e67af70-4211-4077-8f2b-0a00b8069e5a","Type":"ContainerStarted","Data":"5d4eaa665a94ad4629d1882b24e933477ef11d2d020096b0cd7d0be400cb4301"} Mar 20 08:35:32 crc kubenswrapper[4903]: I0320 08:35:32.113554 4903 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 20 08:35:36 crc kubenswrapper[4903]: I0320 08:35:36.947023 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-sxt6c"] Mar 20 08:35:36 crc kubenswrapper[4903]: E0320 08:35:36.950383 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2424a689-158b-42a5-805c-47dbf5dd3203" containerName="oc" Mar 20 08:35:36 crc kubenswrapper[4903]: I0320 08:35:36.950422 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="2424a689-158b-42a5-805c-47dbf5dd3203" containerName="oc" Mar 20 08:35:36 crc kubenswrapper[4903]: I0320 08:35:36.950569 4903 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="2424a689-158b-42a5-805c-47dbf5dd3203" containerName="oc" Mar 20 08:35:36 crc kubenswrapper[4903]: I0320 08:35:36.951455 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-sxt6c" Mar 20 08:35:36 crc kubenswrapper[4903]: I0320 08:35:36.958561 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-sxt6c"] Mar 20 08:35:37 crc kubenswrapper[4903]: I0320 08:35:37.121802 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1b2a2f17-38ed-4d79-8981-98defd4e1133-catalog-content\") pod \"certified-operators-sxt6c\" (UID: \"1b2a2f17-38ed-4d79-8981-98defd4e1133\") " pod="openshift-marketplace/certified-operators-sxt6c" Mar 20 08:35:37 crc kubenswrapper[4903]: I0320 08:35:37.122311 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4jfrh\" (UniqueName: \"kubernetes.io/projected/1b2a2f17-38ed-4d79-8981-98defd4e1133-kube-api-access-4jfrh\") pod \"certified-operators-sxt6c\" (UID: \"1b2a2f17-38ed-4d79-8981-98defd4e1133\") " pod="openshift-marketplace/certified-operators-sxt6c" Mar 20 08:35:37 crc kubenswrapper[4903]: I0320 08:35:37.122381 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1b2a2f17-38ed-4d79-8981-98defd4e1133-utilities\") pod \"certified-operators-sxt6c\" (UID: \"1b2a2f17-38ed-4d79-8981-98defd4e1133\") " pod="openshift-marketplace/certified-operators-sxt6c" Mar 20 08:35:37 crc kubenswrapper[4903]: I0320 08:35:37.223763 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1b2a2f17-38ed-4d79-8981-98defd4e1133-utilities\") pod \"certified-operators-sxt6c\" (UID: \"1b2a2f17-38ed-4d79-8981-98defd4e1133\") " pod="openshift-marketplace/certified-operators-sxt6c" Mar 20 08:35:37 crc kubenswrapper[4903]: I0320 08:35:37.223845 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1b2a2f17-38ed-4d79-8981-98defd4e1133-catalog-content\") pod \"certified-operators-sxt6c\" (UID: \"1b2a2f17-38ed-4d79-8981-98defd4e1133\") " pod="openshift-marketplace/certified-operators-sxt6c" Mar 20 08:35:37 crc kubenswrapper[4903]: I0320 08:35:37.223889 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4jfrh\" (UniqueName: \"kubernetes.io/projected/1b2a2f17-38ed-4d79-8981-98defd4e1133-kube-api-access-4jfrh\") pod \"certified-operators-sxt6c\" (UID: \"1b2a2f17-38ed-4d79-8981-98defd4e1133\") " pod="openshift-marketplace/certified-operators-sxt6c" Mar 20 08:35:37 crc kubenswrapper[4903]: I0320 08:35:37.224710 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1b2a2f17-38ed-4d79-8981-98defd4e1133-utilities\") pod \"certified-operators-sxt6c\" (UID: \"1b2a2f17-38ed-4d79-8981-98defd4e1133\") " pod="openshift-marketplace/certified-operators-sxt6c" Mar 20 08:35:37 crc kubenswrapper[4903]: I0320 08:35:37.224780 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1b2a2f17-38ed-4d79-8981-98defd4e1133-catalog-content\") pod 
\"certified-operators-sxt6c\" (UID: \"1b2a2f17-38ed-4d79-8981-98defd4e1133\") " pod="openshift-marketplace/certified-operators-sxt6c" Mar 20 08:35:37 crc kubenswrapper[4903]: I0320 08:35:37.254412 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4jfrh\" (UniqueName: \"kubernetes.io/projected/1b2a2f17-38ed-4d79-8981-98defd4e1133-kube-api-access-4jfrh\") pod \"certified-operators-sxt6c\" (UID: \"1b2a2f17-38ed-4d79-8981-98defd4e1133\") " pod="openshift-marketplace/certified-operators-sxt6c" Mar 20 08:35:37 crc kubenswrapper[4903]: I0320 08:35:37.275208 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-sxt6c" Mar 20 08:35:37 crc kubenswrapper[4903]: I0320 08:35:37.538649 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-sxt6c"] Mar 20 08:35:38 crc kubenswrapper[4903]: I0320 08:35:38.134449 4903 generic.go:334] "Generic (PLEG): container finished" podID="1b2a2f17-38ed-4d79-8981-98defd4e1133" containerID="3edc0e9bf210204cf9042784a492fb243cd3327498de5a04e6521b2f65fa3837" exitCode=0 Mar 20 08:35:38 crc kubenswrapper[4903]: I0320 08:35:38.134518 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sxt6c" event={"ID":"1b2a2f17-38ed-4d79-8981-98defd4e1133","Type":"ContainerDied","Data":"3edc0e9bf210204cf9042784a492fb243cd3327498de5a04e6521b2f65fa3837"} Mar 20 08:35:38 crc kubenswrapper[4903]: I0320 08:35:38.134881 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sxt6c" event={"ID":"1b2a2f17-38ed-4d79-8981-98defd4e1133","Type":"ContainerStarted","Data":"15989df0f720a823a1ad08ff748ae728624ea8e4d3fc928f085f6000461e116c"} Mar 20 08:35:40 crc kubenswrapper[4903]: I0320 08:35:40.152677 4903 generic.go:334] "Generic (PLEG): container finished" podID="1b2a2f17-38ed-4d79-8981-98defd4e1133" containerID="103efdb9115faec4fb2695f489c582d554e24b5569cbd351b63c3b406ae1cdd7" exitCode=0 Mar 20 08:35:40 crc kubenswrapper[4903]: I0320 08:35:40.152767 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sxt6c" event={"ID":"1b2a2f17-38ed-4d79-8981-98defd4e1133","Type":"ContainerDied","Data":"103efdb9115faec4fb2695f489c582d554e24b5569cbd351b63c3b406ae1cdd7"} Mar 20 08:35:41 crc kubenswrapper[4903]: I0320 08:35:41.172460 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sxt6c" event={"ID":"1b2a2f17-38ed-4d79-8981-98defd4e1133","Type":"ContainerStarted","Data":"47a3faa0e2745e47551fccd171c6da5f2aeada2ce6a9ab56c03c89df7bcfb7ea"} Mar 20 08:35:41 crc kubenswrapper[4903]: I0320 08:35:41.204981 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-sxt6c" podStartSLOduration=2.749875933 podStartE2EDuration="5.204954713s" podCreationTimestamp="2026-03-20 08:35:36 +0000 UTC" firstStartedPulling="2026-03-20 08:35:38.136947004 +0000 UTC m=+763.353847319" lastFinishedPulling="2026-03-20 08:35:40.592025744 +0000 UTC m=+765.808926099" observedRunningTime="2026-03-20 08:35:41.201172342 +0000 UTC m=+766.418072737" watchObservedRunningTime="2026-03-20 08:35:41.204954713 +0000 UTC m=+766.421855038" Mar 20 08:35:47 crc kubenswrapper[4903]: I0320 08:35:47.276754 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-sxt6c" Mar 20 08:35:47 crc 
kubenswrapper[4903]: I0320 08:35:47.277820 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-sxt6c" Mar 20 08:35:47 crc kubenswrapper[4903]: I0320 08:35:47.353079 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-sxt6c" Mar 20 08:35:48 crc kubenswrapper[4903]: I0320 08:35:48.261483 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-sxt6c" Mar 20 08:35:48 crc kubenswrapper[4903]: I0320 08:35:48.308055 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-sxt6c"] Mar 20 08:35:50 crc kubenswrapper[4903]: I0320 08:35:50.231979 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-sxt6c" podUID="1b2a2f17-38ed-4d79-8981-98defd4e1133" containerName="registry-server" containerID="cri-o://47a3faa0e2745e47551fccd171c6da5f2aeada2ce6a9ab56c03c89df7bcfb7ea" gracePeriod=2 Mar 20 08:35:50 crc kubenswrapper[4903]: I0320 08:35:50.668164 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-sxt6c" Mar 20 08:35:50 crc kubenswrapper[4903]: I0320 08:35:50.845510 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1b2a2f17-38ed-4d79-8981-98defd4e1133-catalog-content\") pod \"1b2a2f17-38ed-4d79-8981-98defd4e1133\" (UID: \"1b2a2f17-38ed-4d79-8981-98defd4e1133\") " Mar 20 08:35:50 crc kubenswrapper[4903]: I0320 08:35:50.845709 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4jfrh\" (UniqueName: \"kubernetes.io/projected/1b2a2f17-38ed-4d79-8981-98defd4e1133-kube-api-access-4jfrh\") pod \"1b2a2f17-38ed-4d79-8981-98defd4e1133\" (UID: \"1b2a2f17-38ed-4d79-8981-98defd4e1133\") " Mar 20 08:35:50 crc kubenswrapper[4903]: I0320 08:35:50.845755 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1b2a2f17-38ed-4d79-8981-98defd4e1133-utilities\") pod \"1b2a2f17-38ed-4d79-8981-98defd4e1133\" (UID: \"1b2a2f17-38ed-4d79-8981-98defd4e1133\") " Mar 20 08:35:50 crc kubenswrapper[4903]: I0320 08:35:50.847684 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1b2a2f17-38ed-4d79-8981-98defd4e1133-utilities" (OuterVolumeSpecName: "utilities") pod "1b2a2f17-38ed-4d79-8981-98defd4e1133" (UID: "1b2a2f17-38ed-4d79-8981-98defd4e1133"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 08:35:50 crc kubenswrapper[4903]: I0320 08:35:50.860606 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b2a2f17-38ed-4d79-8981-98defd4e1133-kube-api-access-4jfrh" (OuterVolumeSpecName: "kube-api-access-4jfrh") pod "1b2a2f17-38ed-4d79-8981-98defd4e1133" (UID: "1b2a2f17-38ed-4d79-8981-98defd4e1133"). InnerVolumeSpecName "kube-api-access-4jfrh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:35:50 crc kubenswrapper[4903]: I0320 08:35:50.948161 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4jfrh\" (UniqueName: \"kubernetes.io/projected/1b2a2f17-38ed-4d79-8981-98defd4e1133-kube-api-access-4jfrh\") on node \"crc\" DevicePath \"\"" Mar 20 08:35:50 crc kubenswrapper[4903]: I0320 08:35:50.948849 4903 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1b2a2f17-38ed-4d79-8981-98defd4e1133-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 08:35:50 crc kubenswrapper[4903]: I0320 08:35:50.981071 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1b2a2f17-38ed-4d79-8981-98defd4e1133-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1b2a2f17-38ed-4d79-8981-98defd4e1133" (UID: "1b2a2f17-38ed-4d79-8981-98defd4e1133"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 08:35:51 crc kubenswrapper[4903]: I0320 08:35:51.050023 4903 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1b2a2f17-38ed-4d79-8981-98defd4e1133-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 08:35:51 crc kubenswrapper[4903]: I0320 08:35:51.242236 4903 generic.go:334] "Generic (PLEG): container finished" podID="1b2a2f17-38ed-4d79-8981-98defd4e1133" containerID="47a3faa0e2745e47551fccd171c6da5f2aeada2ce6a9ab56c03c89df7bcfb7ea" exitCode=0 Mar 20 08:35:51 crc kubenswrapper[4903]: I0320 08:35:51.242320 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sxt6c" event={"ID":"1b2a2f17-38ed-4d79-8981-98defd4e1133","Type":"ContainerDied","Data":"47a3faa0e2745e47551fccd171c6da5f2aeada2ce6a9ab56c03c89df7bcfb7ea"} Mar 20 08:35:51 crc kubenswrapper[4903]: I0320 08:35:51.242380 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sxt6c" event={"ID":"1b2a2f17-38ed-4d79-8981-98defd4e1133","Type":"ContainerDied","Data":"15989df0f720a823a1ad08ff748ae728624ea8e4d3fc928f085f6000461e116c"} Mar 20 08:35:51 crc kubenswrapper[4903]: I0320 08:35:51.242423 4903 scope.go:117] "RemoveContainer" containerID="47a3faa0e2745e47551fccd171c6da5f2aeada2ce6a9ab56c03c89df7bcfb7ea" Mar 20 08:35:51 crc kubenswrapper[4903]: I0320 08:35:51.242599 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-sxt6c" Mar 20 08:35:51 crc kubenswrapper[4903]: I0320 08:35:51.273368 4903 scope.go:117] "RemoveContainer" containerID="103efdb9115faec4fb2695f489c582d554e24b5569cbd351b63c3b406ae1cdd7" Mar 20 08:35:51 crc kubenswrapper[4903]: I0320 08:35:51.284352 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-sxt6c"] Mar 20 08:35:51 crc kubenswrapper[4903]: I0320 08:35:51.288711 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-sxt6c"] Mar 20 08:35:51 crc kubenswrapper[4903]: I0320 08:35:51.301860 4903 scope.go:117] "RemoveContainer" containerID="3edc0e9bf210204cf9042784a492fb243cd3327498de5a04e6521b2f65fa3837" Mar 20 08:35:51 crc kubenswrapper[4903]: I0320 08:35:51.318788 4903 scope.go:117] "RemoveContainer" containerID="47a3faa0e2745e47551fccd171c6da5f2aeada2ce6a9ab56c03c89df7bcfb7ea" Mar 20 08:35:51 crc kubenswrapper[4903]: E0320 08:35:51.319371 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"47a3faa0e2745e47551fccd171c6da5f2aeada2ce6a9ab56c03c89df7bcfb7ea\": container with ID starting with 47a3faa0e2745e47551fccd171c6da5f2aeada2ce6a9ab56c03c89df7bcfb7ea not found: ID does not exist" containerID="47a3faa0e2745e47551fccd171c6da5f2aeada2ce6a9ab56c03c89df7bcfb7ea" Mar 20 08:35:51 crc kubenswrapper[4903]: I0320 08:35:51.319401 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"47a3faa0e2745e47551fccd171c6da5f2aeada2ce6a9ab56c03c89df7bcfb7ea"} err="failed to get container status \"47a3faa0e2745e47551fccd171c6da5f2aeada2ce6a9ab56c03c89df7bcfb7ea\": rpc error: code = NotFound desc = could not find container \"47a3faa0e2745e47551fccd171c6da5f2aeada2ce6a9ab56c03c89df7bcfb7ea\": container with ID starting with 47a3faa0e2745e47551fccd171c6da5f2aeada2ce6a9ab56c03c89df7bcfb7ea not found: ID does not exist" Mar 20 08:35:51 crc kubenswrapper[4903]: I0320 08:35:51.319423 4903 scope.go:117] "RemoveContainer" containerID="103efdb9115faec4fb2695f489c582d554e24b5569cbd351b63c3b406ae1cdd7" Mar 20 08:35:51 crc kubenswrapper[4903]: E0320 08:35:51.319877 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"103efdb9115faec4fb2695f489c582d554e24b5569cbd351b63c3b406ae1cdd7\": container with ID starting with 103efdb9115faec4fb2695f489c582d554e24b5569cbd351b63c3b406ae1cdd7 not found: ID does not exist" containerID="103efdb9115faec4fb2695f489c582d554e24b5569cbd351b63c3b406ae1cdd7" Mar 20 08:35:51 crc kubenswrapper[4903]: I0320 08:35:51.319944 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"103efdb9115faec4fb2695f489c582d554e24b5569cbd351b63c3b406ae1cdd7"} err="failed to get container status \"103efdb9115faec4fb2695f489c582d554e24b5569cbd351b63c3b406ae1cdd7\": rpc error: code = NotFound desc = could not find container \"103efdb9115faec4fb2695f489c582d554e24b5569cbd351b63c3b406ae1cdd7\": container with ID starting with 103efdb9115faec4fb2695f489c582d554e24b5569cbd351b63c3b406ae1cdd7 not found: ID does not exist" Mar 20 08:35:51 crc kubenswrapper[4903]: I0320 08:35:51.319987 4903 scope.go:117] "RemoveContainer" containerID="3edc0e9bf210204cf9042784a492fb243cd3327498de5a04e6521b2f65fa3837" Mar 20 08:35:51 crc kubenswrapper[4903]: E0320 08:35:51.320391 4903 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"3edc0e9bf210204cf9042784a492fb243cd3327498de5a04e6521b2f65fa3837\": container with ID starting with 3edc0e9bf210204cf9042784a492fb243cd3327498de5a04e6521b2f65fa3837 not found: ID does not exist" containerID="3edc0e9bf210204cf9042784a492fb243cd3327498de5a04e6521b2f65fa3837" Mar 20 08:35:51 crc kubenswrapper[4903]: I0320 08:35:51.320421 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3edc0e9bf210204cf9042784a492fb243cd3327498de5a04e6521b2f65fa3837"} err="failed to get container status \"3edc0e9bf210204cf9042784a492fb243cd3327498de5a04e6521b2f65fa3837\": rpc error: code = NotFound desc = could not find container \"3edc0e9bf210204cf9042784a492fb243cd3327498de5a04e6521b2f65fa3837\": container with ID starting with 3edc0e9bf210204cf9042784a492fb243cd3327498de5a04e6521b2f65fa3837 not found: ID does not exist" Mar 20 08:35:51 crc kubenswrapper[4903]: I0320 08:35:51.498166 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b2a2f17-38ed-4d79-8981-98defd4e1133" path="/var/lib/kubelet/pods/1b2a2f17-38ed-4d79-8981-98defd4e1133/volumes" Mar 20 08:36:00 crc kubenswrapper[4903]: I0320 08:36:00.145449 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566596-wdwc5"] Mar 20 08:36:00 crc kubenswrapper[4903]: E0320 08:36:00.146302 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b2a2f17-38ed-4d79-8981-98defd4e1133" containerName="extract-utilities" Mar 20 08:36:00 crc kubenswrapper[4903]: I0320 08:36:00.146318 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b2a2f17-38ed-4d79-8981-98defd4e1133" containerName="extract-utilities" Mar 20 08:36:00 crc kubenswrapper[4903]: E0320 08:36:00.146333 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b2a2f17-38ed-4d79-8981-98defd4e1133" containerName="registry-server" Mar 20 08:36:00 crc kubenswrapper[4903]: I0320 08:36:00.146343 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b2a2f17-38ed-4d79-8981-98defd4e1133" containerName="registry-server" Mar 20 08:36:00 crc kubenswrapper[4903]: E0320 08:36:00.146369 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b2a2f17-38ed-4d79-8981-98defd4e1133" containerName="extract-content" Mar 20 08:36:00 crc kubenswrapper[4903]: I0320 08:36:00.146378 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b2a2f17-38ed-4d79-8981-98defd4e1133" containerName="extract-content" Mar 20 08:36:00 crc kubenswrapper[4903]: I0320 08:36:00.146502 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b2a2f17-38ed-4d79-8981-98defd4e1133" containerName="registry-server" Mar 20 08:36:00 crc kubenswrapper[4903]: I0320 08:36:00.146959 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566596-wdwc5" Mar 20 08:36:00 crc kubenswrapper[4903]: I0320 08:36:00.149059 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 08:36:00 crc kubenswrapper[4903]: I0320 08:36:00.149166 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-smc2q" Mar 20 08:36:00 crc kubenswrapper[4903]: I0320 08:36:00.152286 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 08:36:00 crc kubenswrapper[4903]: I0320 08:36:00.153992 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566596-wdwc5"] Mar 20 08:36:00 crc kubenswrapper[4903]: I0320 08:36:00.319405 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c66hh\" (UniqueName: \"kubernetes.io/projected/a45c6043-2cab-4100-9ba8-1942c427704c-kube-api-access-c66hh\") pod \"auto-csr-approver-29566596-wdwc5\" (UID: \"a45c6043-2cab-4100-9ba8-1942c427704c\") " pod="openshift-infra/auto-csr-approver-29566596-wdwc5" Mar 20 08:36:00 crc kubenswrapper[4903]: I0320 08:36:00.421105 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c66hh\" (UniqueName: \"kubernetes.io/projected/a45c6043-2cab-4100-9ba8-1942c427704c-kube-api-access-c66hh\") pod \"auto-csr-approver-29566596-wdwc5\" (UID: \"a45c6043-2cab-4100-9ba8-1942c427704c\") " pod="openshift-infra/auto-csr-approver-29566596-wdwc5" Mar 20 08:36:00 crc kubenswrapper[4903]: I0320 08:36:00.462529 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c66hh\" (UniqueName: \"kubernetes.io/projected/a45c6043-2cab-4100-9ba8-1942c427704c-kube-api-access-c66hh\") pod \"auto-csr-approver-29566596-wdwc5\" (UID: \"a45c6043-2cab-4100-9ba8-1942c427704c\") " pod="openshift-infra/auto-csr-approver-29566596-wdwc5" Mar 20 08:36:00 crc kubenswrapper[4903]: I0320 08:36:00.763387 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566596-wdwc5" Mar 20 08:36:01 crc kubenswrapper[4903]: I0320 08:36:01.005623 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566596-wdwc5"] Mar 20 08:36:01 crc kubenswrapper[4903]: I0320 08:36:01.317259 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566596-wdwc5" event={"ID":"a45c6043-2cab-4100-9ba8-1942c427704c","Type":"ContainerStarted","Data":"4c83be08a6c25d7c8550dad650d3e5fa73626f74212f6416133c017f2ed6a61d"} Mar 20 08:36:02 crc kubenswrapper[4903]: I0320 08:36:02.331466 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566596-wdwc5" event={"ID":"a45c6043-2cab-4100-9ba8-1942c427704c","Type":"ContainerStarted","Data":"6cc5f19e3cf9425022c6a178e54158ffb3d2d133ce12c751ac2a935b4699cd8a"} Mar 20 08:36:02 crc kubenswrapper[4903]: I0320 08:36:02.362068 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29566596-wdwc5" podStartSLOduration=1.428892161 podStartE2EDuration="2.362006063s" podCreationTimestamp="2026-03-20 08:36:00 +0000 UTC" firstStartedPulling="2026-03-20 08:36:01.012273433 +0000 UTC m=+786.229173758" lastFinishedPulling="2026-03-20 08:36:01.945387305 +0000 UTC m=+787.162287660" observedRunningTime="2026-03-20 08:36:02.353860125 +0000 UTC m=+787.570760470" watchObservedRunningTime="2026-03-20 08:36:02.362006063 +0000 UTC m=+787.578906418" Mar 20 08:36:03 crc kubenswrapper[4903]: I0320 08:36:03.338920 4903 generic.go:334] "Generic (PLEG): container finished" podID="a45c6043-2cab-4100-9ba8-1942c427704c" containerID="6cc5f19e3cf9425022c6a178e54158ffb3d2d133ce12c751ac2a935b4699cd8a" exitCode=0 Mar 20 08:36:03 crc kubenswrapper[4903]: I0320 08:36:03.339013 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566596-wdwc5" event={"ID":"a45c6043-2cab-4100-9ba8-1942c427704c","Type":"ContainerDied","Data":"6cc5f19e3cf9425022c6a178e54158ffb3d2d133ce12c751ac2a935b4699cd8a"} Mar 20 08:36:04 crc kubenswrapper[4903]: I0320 08:36:04.609663 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566596-wdwc5" Mar 20 08:36:04 crc kubenswrapper[4903]: I0320 08:36:04.792246 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c66hh\" (UniqueName: \"kubernetes.io/projected/a45c6043-2cab-4100-9ba8-1942c427704c-kube-api-access-c66hh\") pod \"a45c6043-2cab-4100-9ba8-1942c427704c\" (UID: \"a45c6043-2cab-4100-9ba8-1942c427704c\") " Mar 20 08:36:04 crc kubenswrapper[4903]: I0320 08:36:04.802144 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a45c6043-2cab-4100-9ba8-1942c427704c-kube-api-access-c66hh" (OuterVolumeSpecName: "kube-api-access-c66hh") pod "a45c6043-2cab-4100-9ba8-1942c427704c" (UID: "a45c6043-2cab-4100-9ba8-1942c427704c"). InnerVolumeSpecName "kube-api-access-c66hh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:36:04 crc kubenswrapper[4903]: I0320 08:36:04.895467 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c66hh\" (UniqueName: \"kubernetes.io/projected/a45c6043-2cab-4100-9ba8-1942c427704c-kube-api-access-c66hh\") on node \"crc\" DevicePath \"\"" Mar 20 08:36:05 crc kubenswrapper[4903]: I0320 08:36:05.359934 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566596-wdwc5" event={"ID":"a45c6043-2cab-4100-9ba8-1942c427704c","Type":"ContainerDied","Data":"4c83be08a6c25d7c8550dad650d3e5fa73626f74212f6416133c017f2ed6a61d"} Mar 20 08:36:05 crc kubenswrapper[4903]: I0320 08:36:05.360374 4903 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4c83be08a6c25d7c8550dad650d3e5fa73626f74212f6416133c017f2ed6a61d" Mar 20 08:36:05 crc kubenswrapper[4903]: I0320 08:36:05.360136 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566596-wdwc5" Mar 20 08:36:05 crc kubenswrapper[4903]: I0320 08:36:05.432556 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566590-ws8nx"] Mar 20 08:36:05 crc kubenswrapper[4903]: I0320 08:36:05.436213 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566590-ws8nx"] Mar 20 08:36:05 crc kubenswrapper[4903]: I0320 08:36:05.500858 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fd43bc1-7eee-4b42-b9bf-75765d8f28b5" path="/var/lib/kubelet/pods/5fd43bc1-7eee-4b42-b9bf-75765d8f28b5/volumes" Mar 20 08:36:56 crc kubenswrapper[4903]: I0320 08:36:56.561413 4903 scope.go:117] "RemoveContainer" containerID="6bcc71e401a297d75b4f6b8b25efdf4d606f0e2200ac1e8e2b2a19909a6c66d9" Mar 20 08:37:07 crc kubenswrapper[4903]: I0320 08:37:07.089060 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-5dkgq"] Mar 20 08:37:07 crc kubenswrapper[4903]: E0320 08:37:07.090656 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a45c6043-2cab-4100-9ba8-1942c427704c" containerName="oc" Mar 20 08:37:07 crc kubenswrapper[4903]: I0320 08:37:07.090676 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="a45c6043-2cab-4100-9ba8-1942c427704c" containerName="oc" Mar 20 08:37:07 crc kubenswrapper[4903]: I0320 08:37:07.090972 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="a45c6043-2cab-4100-9ba8-1942c427704c" containerName="oc" Mar 20 08:37:07 crc kubenswrapper[4903]: I0320 08:37:07.093021 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5dkgq" Mar 20 08:37:07 crc kubenswrapper[4903]: I0320 08:37:07.098530 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5dkgq"] Mar 20 08:37:07 crc kubenswrapper[4903]: I0320 08:37:07.262485 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d878da59-76f0-4401-a6f4-50d6448bff24-utilities\") pod \"redhat-marketplace-5dkgq\" (UID: \"d878da59-76f0-4401-a6f4-50d6448bff24\") " pod="openshift-marketplace/redhat-marketplace-5dkgq" Mar 20 08:37:07 crc kubenswrapper[4903]: I0320 08:37:07.262576 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wmgqh\" (UniqueName: \"kubernetes.io/projected/d878da59-76f0-4401-a6f4-50d6448bff24-kube-api-access-wmgqh\") pod \"redhat-marketplace-5dkgq\" (UID: \"d878da59-76f0-4401-a6f4-50d6448bff24\") " pod="openshift-marketplace/redhat-marketplace-5dkgq" Mar 20 08:37:07 crc kubenswrapper[4903]: I0320 08:37:07.262676 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d878da59-76f0-4401-a6f4-50d6448bff24-catalog-content\") pod \"redhat-marketplace-5dkgq\" (UID: \"d878da59-76f0-4401-a6f4-50d6448bff24\") " pod="openshift-marketplace/redhat-marketplace-5dkgq" Mar 20 08:37:07 crc kubenswrapper[4903]: I0320 08:37:07.363609 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d878da59-76f0-4401-a6f4-50d6448bff24-utilities\") pod \"redhat-marketplace-5dkgq\" (UID: \"d878da59-76f0-4401-a6f4-50d6448bff24\") " pod="openshift-marketplace/redhat-marketplace-5dkgq" Mar 20 08:37:07 crc kubenswrapper[4903]: I0320 08:37:07.363708 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wmgqh\" (UniqueName: \"kubernetes.io/projected/d878da59-76f0-4401-a6f4-50d6448bff24-kube-api-access-wmgqh\") pod \"redhat-marketplace-5dkgq\" (UID: \"d878da59-76f0-4401-a6f4-50d6448bff24\") " pod="openshift-marketplace/redhat-marketplace-5dkgq" Mar 20 08:37:07 crc kubenswrapper[4903]: I0320 08:37:07.363789 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d878da59-76f0-4401-a6f4-50d6448bff24-catalog-content\") pod \"redhat-marketplace-5dkgq\" (UID: \"d878da59-76f0-4401-a6f4-50d6448bff24\") " pod="openshift-marketplace/redhat-marketplace-5dkgq" Mar 20 08:37:07 crc kubenswrapper[4903]: I0320 08:37:07.364783 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d878da59-76f0-4401-a6f4-50d6448bff24-utilities\") pod \"redhat-marketplace-5dkgq\" (UID: \"d878da59-76f0-4401-a6f4-50d6448bff24\") " pod="openshift-marketplace/redhat-marketplace-5dkgq" Mar 20 08:37:07 crc kubenswrapper[4903]: I0320 08:37:07.364827 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d878da59-76f0-4401-a6f4-50d6448bff24-catalog-content\") pod \"redhat-marketplace-5dkgq\" (UID: \"d878da59-76f0-4401-a6f4-50d6448bff24\") " pod="openshift-marketplace/redhat-marketplace-5dkgq" Mar 20 08:37:07 crc kubenswrapper[4903]: I0320 08:37:07.396226 4903 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-wmgqh\" (UniqueName: \"kubernetes.io/projected/d878da59-76f0-4401-a6f4-50d6448bff24-kube-api-access-wmgqh\") pod \"redhat-marketplace-5dkgq\" (UID: \"d878da59-76f0-4401-a6f4-50d6448bff24\") " pod="openshift-marketplace/redhat-marketplace-5dkgq" Mar 20 08:37:07 crc kubenswrapper[4903]: I0320 08:37:07.472288 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5dkgq" Mar 20 08:37:07 crc kubenswrapper[4903]: I0320 08:37:07.962257 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5dkgq"] Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.254874 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-m6k77"] Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.255851 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-m6k77" podUID="157214e8-fbfe-4e9d-98f4-02680437b8b2" containerName="ovn-controller" containerID="cri-o://fe3d869acf63a0250c1b26e36a04855dc08643394a8d64c9d4c82054f3f400a0" gracePeriod=30 Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.256316 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-m6k77" podUID="157214e8-fbfe-4e9d-98f4-02680437b8b2" containerName="sbdb" containerID="cri-o://1003de0029b8b9db6cb94237a4b5ced815916ebff5f7727dd069073c68d64d9a" gracePeriod=30 Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.256372 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-m6k77" podUID="157214e8-fbfe-4e9d-98f4-02680437b8b2" containerName="nbdb" containerID="cri-o://7dc59852ff7f645c4eb5a214e361ac06060b83944e544934e183ea37408f3fc3" gracePeriod=30 Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.256411 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-m6k77" podUID="157214e8-fbfe-4e9d-98f4-02680437b8b2" containerName="northd" containerID="cri-o://10b64cc58e0fdc4a422a2afa460d435875a10e372dc8eb7834aec769450b10ea" gracePeriod=30 Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.256451 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-m6k77" podUID="157214e8-fbfe-4e9d-98f4-02680437b8b2" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://272edfbc487b1d01b2ca4f33197ab2c05e32fdb2f60e683ce8ff908f05fff378" gracePeriod=30 Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.256490 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-m6k77" podUID="157214e8-fbfe-4e9d-98f4-02680437b8b2" containerName="kube-rbac-proxy-node" containerID="cri-o://b2afe450471abfa96b8c77a0e261c47f541f330d529edd480e109298763ff7bb" gracePeriod=30 Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.256528 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-m6k77" podUID="157214e8-fbfe-4e9d-98f4-02680437b8b2" containerName="ovn-acl-logging" containerID="cri-o://691b8b593f7e45dee796765204e19ab9fe42adc904e33c692fb316dc994aea10" gracePeriod=30 Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.335941 4903 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-ovn-kubernetes/ovnkube-node-m6k77" podUID="157214e8-fbfe-4e9d-98f4-02680437b8b2" containerName="ovnkube-controller" containerID="cri-o://6327470934a7b6549e52613fa520fb09d780a1db12171a46876a07a1db70a996" gracePeriod=30 Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.616960 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-m6k77_157214e8-fbfe-4e9d-98f4-02680437b8b2/ovn-acl-logging/0.log" Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.617951 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-m6k77_157214e8-fbfe-4e9d-98f4-02680437b8b2/ovn-controller/0.log" Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.618721 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-m6k77" Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.702016 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-ljsfv"] Mar 20 08:37:08 crc kubenswrapper[4903]: E0320 08:37:08.702754 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="157214e8-fbfe-4e9d-98f4-02680437b8b2" containerName="northd" Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.702884 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="157214e8-fbfe-4e9d-98f4-02680437b8b2" containerName="northd" Mar 20 08:37:08 crc kubenswrapper[4903]: E0320 08:37:08.702970 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="157214e8-fbfe-4e9d-98f4-02680437b8b2" containerName="ovn-acl-logging" Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.703063 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="157214e8-fbfe-4e9d-98f4-02680437b8b2" containerName="ovn-acl-logging" Mar 20 08:37:08 crc kubenswrapper[4903]: E0320 08:37:08.703137 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="157214e8-fbfe-4e9d-98f4-02680437b8b2" containerName="kube-rbac-proxy-node" Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.703202 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="157214e8-fbfe-4e9d-98f4-02680437b8b2" containerName="kube-rbac-proxy-node" Mar 20 08:37:08 crc kubenswrapper[4903]: E0320 08:37:08.703275 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="157214e8-fbfe-4e9d-98f4-02680437b8b2" containerName="ovn-controller" Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.703368 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="157214e8-fbfe-4e9d-98f4-02680437b8b2" containerName="ovn-controller" Mar 20 08:37:08 crc kubenswrapper[4903]: E0320 08:37:08.703465 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="157214e8-fbfe-4e9d-98f4-02680437b8b2" containerName="sbdb" Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.703533 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="157214e8-fbfe-4e9d-98f4-02680437b8b2" containerName="sbdb" Mar 20 08:37:08 crc kubenswrapper[4903]: E0320 08:37:08.703604 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="157214e8-fbfe-4e9d-98f4-02680437b8b2" containerName="nbdb" Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.703681 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="157214e8-fbfe-4e9d-98f4-02680437b8b2" containerName="nbdb" Mar 20 08:37:08 crc kubenswrapper[4903]: E0320 08:37:08.703756 4903 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="157214e8-fbfe-4e9d-98f4-02680437b8b2" containerName="ovnkube-controller" Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.703824 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="157214e8-fbfe-4e9d-98f4-02680437b8b2" containerName="ovnkube-controller" Mar 20 08:37:08 crc kubenswrapper[4903]: E0320 08:37:08.703907 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="157214e8-fbfe-4e9d-98f4-02680437b8b2" containerName="kube-rbac-proxy-ovn-metrics" Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.703972 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="157214e8-fbfe-4e9d-98f4-02680437b8b2" containerName="kube-rbac-proxy-ovn-metrics" Mar 20 08:37:08 crc kubenswrapper[4903]: E0320 08:37:08.704066 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="157214e8-fbfe-4e9d-98f4-02680437b8b2" containerName="kubecfg-setup" Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.704137 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="157214e8-fbfe-4e9d-98f4-02680437b8b2" containerName="kubecfg-setup" Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.704335 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="157214e8-fbfe-4e9d-98f4-02680437b8b2" containerName="ovn-controller" Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.704415 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="157214e8-fbfe-4e9d-98f4-02680437b8b2" containerName="ovn-acl-logging" Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.704499 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="157214e8-fbfe-4e9d-98f4-02680437b8b2" containerName="kube-rbac-proxy-node" Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.704570 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="157214e8-fbfe-4e9d-98f4-02680437b8b2" containerName="sbdb" Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.704635 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="157214e8-fbfe-4e9d-98f4-02680437b8b2" containerName="ovnkube-controller" Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.704705 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="157214e8-fbfe-4e9d-98f4-02680437b8b2" containerName="northd" Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.704783 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="157214e8-fbfe-4e9d-98f4-02680437b8b2" containerName="nbdb" Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.704858 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="157214e8-fbfe-4e9d-98f4-02680437b8b2" containerName="kube-rbac-proxy-ovn-metrics" Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.707350 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-ljsfv" Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.782759 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/157214e8-fbfe-4e9d-98f4-02680437b8b2-env-overrides\") pod \"157214e8-fbfe-4e9d-98f4-02680437b8b2\" (UID: \"157214e8-fbfe-4e9d-98f4-02680437b8b2\") " Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.783260 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/157214e8-fbfe-4e9d-98f4-02680437b8b2-host-run-ovn-kubernetes\") pod \"157214e8-fbfe-4e9d-98f4-02680437b8b2\" (UID: \"157214e8-fbfe-4e9d-98f4-02680437b8b2\") " Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.783297 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/157214e8-fbfe-4e9d-98f4-02680437b8b2-ovn-node-metrics-cert\") pod \"157214e8-fbfe-4e9d-98f4-02680437b8b2\" (UID: \"157214e8-fbfe-4e9d-98f4-02680437b8b2\") " Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.783329 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/157214e8-fbfe-4e9d-98f4-02680437b8b2-etc-openvswitch\") pod \"157214e8-fbfe-4e9d-98f4-02680437b8b2\" (UID: \"157214e8-fbfe-4e9d-98f4-02680437b8b2\") " Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.783381 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/157214e8-fbfe-4e9d-98f4-02680437b8b2-host-cni-netd\") pod \"157214e8-fbfe-4e9d-98f4-02680437b8b2\" (UID: \"157214e8-fbfe-4e9d-98f4-02680437b8b2\") " Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.783399 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/157214e8-fbfe-4e9d-98f4-02680437b8b2-ovnkube-script-lib\") pod \"157214e8-fbfe-4e9d-98f4-02680437b8b2\" (UID: \"157214e8-fbfe-4e9d-98f4-02680437b8b2\") " Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.783393 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/157214e8-fbfe-4e9d-98f4-02680437b8b2-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "157214e8-fbfe-4e9d-98f4-02680437b8b2" (UID: "157214e8-fbfe-4e9d-98f4-02680437b8b2"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.783420 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/157214e8-fbfe-4e9d-98f4-02680437b8b2-host-run-netns\") pod \"157214e8-fbfe-4e9d-98f4-02680437b8b2\" (UID: \"157214e8-fbfe-4e9d-98f4-02680437b8b2\") " Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.783476 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/157214e8-fbfe-4e9d-98f4-02680437b8b2-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "157214e8-fbfe-4e9d-98f4-02680437b8b2" (UID: "157214e8-fbfe-4e9d-98f4-02680437b8b2"). InnerVolumeSpecName "host-run-netns". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.783534 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/157214e8-fbfe-4e9d-98f4-02680437b8b2-systemd-units\") pod \"157214e8-fbfe-4e9d-98f4-02680437b8b2\" (UID: \"157214e8-fbfe-4e9d-98f4-02680437b8b2\") " Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.783563 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/157214e8-fbfe-4e9d-98f4-02680437b8b2-var-lib-openvswitch\") pod \"157214e8-fbfe-4e9d-98f4-02680437b8b2\" (UID: \"157214e8-fbfe-4e9d-98f4-02680437b8b2\") " Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.783594 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/157214e8-fbfe-4e9d-98f4-02680437b8b2-host-cni-bin\") pod \"157214e8-fbfe-4e9d-98f4-02680437b8b2\" (UID: \"157214e8-fbfe-4e9d-98f4-02680437b8b2\") " Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.783617 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/157214e8-fbfe-4e9d-98f4-02680437b8b2-run-systemd\") pod \"157214e8-fbfe-4e9d-98f4-02680437b8b2\" (UID: \"157214e8-fbfe-4e9d-98f4-02680437b8b2\") " Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.783608 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/157214e8-fbfe-4e9d-98f4-02680437b8b2-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "157214e8-fbfe-4e9d-98f4-02680437b8b2" (UID: "157214e8-fbfe-4e9d-98f4-02680437b8b2"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.783653 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/157214e8-fbfe-4e9d-98f4-02680437b8b2-run-ovn\") pod \"157214e8-fbfe-4e9d-98f4-02680437b8b2\" (UID: \"157214e8-fbfe-4e9d-98f4-02680437b8b2\") " Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.783698 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/157214e8-fbfe-4e9d-98f4-02680437b8b2-ovnkube-config\") pod \"157214e8-fbfe-4e9d-98f4-02680437b8b2\" (UID: \"157214e8-fbfe-4e9d-98f4-02680437b8b2\") " Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.783712 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/157214e8-fbfe-4e9d-98f4-02680437b8b2-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "157214e8-fbfe-4e9d-98f4-02680437b8b2" (UID: "157214e8-fbfe-4e9d-98f4-02680437b8b2"). InnerVolumeSpecName "var-lib-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.783738 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/157214e8-fbfe-4e9d-98f4-02680437b8b2-log-socket\") pod \"157214e8-fbfe-4e9d-98f4-02680437b8b2\" (UID: \"157214e8-fbfe-4e9d-98f4-02680437b8b2\") " Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.783752 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/157214e8-fbfe-4e9d-98f4-02680437b8b2-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "157214e8-fbfe-4e9d-98f4-02680437b8b2" (UID: "157214e8-fbfe-4e9d-98f4-02680437b8b2"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.783764 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/157214e8-fbfe-4e9d-98f4-02680437b8b2-host-kubelet\") pod \"157214e8-fbfe-4e9d-98f4-02680437b8b2\" (UID: \"157214e8-fbfe-4e9d-98f4-02680437b8b2\") " Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.783787 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/157214e8-fbfe-4e9d-98f4-02680437b8b2-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "157214e8-fbfe-4e9d-98f4-02680437b8b2" (UID: "157214e8-fbfe-4e9d-98f4-02680437b8b2"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.783817 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/157214e8-fbfe-4e9d-98f4-02680437b8b2-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "157214e8-fbfe-4e9d-98f4-02680437b8b2" (UID: "157214e8-fbfe-4e9d-98f4-02680437b8b2"). InnerVolumeSpecName "etc-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.783841 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/157214e8-fbfe-4e9d-98f4-02680437b8b2-run-openvswitch\") pod \"157214e8-fbfe-4e9d-98f4-02680437b8b2\" (UID: \"157214e8-fbfe-4e9d-98f4-02680437b8b2\") " Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.783870 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/157214e8-fbfe-4e9d-98f4-02680437b8b2-host-slash\") pod \"157214e8-fbfe-4e9d-98f4-02680437b8b2\" (UID: \"157214e8-fbfe-4e9d-98f4-02680437b8b2\") " Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.783907 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/157214e8-fbfe-4e9d-98f4-02680437b8b2-node-log\") pod \"157214e8-fbfe-4e9d-98f4-02680437b8b2\" (UID: \"157214e8-fbfe-4e9d-98f4-02680437b8b2\") " Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.783943 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/157214e8-fbfe-4e9d-98f4-02680437b8b2-host-var-lib-cni-networks-ovn-kubernetes\") pod \"157214e8-fbfe-4e9d-98f4-02680437b8b2\" (UID: \"157214e8-fbfe-4e9d-98f4-02680437b8b2\") " Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.783999 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9z897\" (UniqueName: \"kubernetes.io/projected/157214e8-fbfe-4e9d-98f4-02680437b8b2-kube-api-access-9z897\") pod \"157214e8-fbfe-4e9d-98f4-02680437b8b2\" (UID: \"157214e8-fbfe-4e9d-98f4-02680437b8b2\") " Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.784008 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/157214e8-fbfe-4e9d-98f4-02680437b8b2-log-socket" (OuterVolumeSpecName: "log-socket") pod "157214e8-fbfe-4e9d-98f4-02680437b8b2" (UID: "157214e8-fbfe-4e9d-98f4-02680437b8b2"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.784082 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/157214e8-fbfe-4e9d-98f4-02680437b8b2-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "157214e8-fbfe-4e9d-98f4-02680437b8b2" (UID: "157214e8-fbfe-4e9d-98f4-02680437b8b2"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.784363 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/157214e8-fbfe-4e9d-98f4-02680437b8b2-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "157214e8-fbfe-4e9d-98f4-02680437b8b2" (UID: "157214e8-fbfe-4e9d-98f4-02680437b8b2"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.784428 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/157214e8-fbfe-4e9d-98f4-02680437b8b2-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "157214e8-fbfe-4e9d-98f4-02680437b8b2" (UID: "157214e8-fbfe-4e9d-98f4-02680437b8b2"). InnerVolumeSpecName "host-kubelet". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.784436 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/157214e8-fbfe-4e9d-98f4-02680437b8b2-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "157214e8-fbfe-4e9d-98f4-02680437b8b2" (UID: "157214e8-fbfe-4e9d-98f4-02680437b8b2"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.784499 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/157214e8-fbfe-4e9d-98f4-02680437b8b2-node-log" (OuterVolumeSpecName: "node-log") pod "157214e8-fbfe-4e9d-98f4-02680437b8b2" (UID: "157214e8-fbfe-4e9d-98f4-02680437b8b2"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.784535 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/157214e8-fbfe-4e9d-98f4-02680437b8b2-host-slash" (OuterVolumeSpecName: "host-slash") pod "157214e8-fbfe-4e9d-98f4-02680437b8b2" (UID: "157214e8-fbfe-4e9d-98f4-02680437b8b2"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.784380 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/157214e8-fbfe-4e9d-98f4-02680437b8b2-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "157214e8-fbfe-4e9d-98f4-02680437b8b2" (UID: "157214e8-fbfe-4e9d-98f4-02680437b8b2"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.784613 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/157214e8-fbfe-4e9d-98f4-02680437b8b2-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "157214e8-fbfe-4e9d-98f4-02680437b8b2" (UID: "157214e8-fbfe-4e9d-98f4-02680437b8b2"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.784852 4903 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/157214e8-fbfe-4e9d-98f4-02680437b8b2-host-run-netns\") on node \"crc\" DevicePath \"\"" Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.784928 4903 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/157214e8-fbfe-4e9d-98f4-02680437b8b2-host-cni-bin\") on node \"crc\" DevicePath \"\"" Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.784983 4903 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/157214e8-fbfe-4e9d-98f4-02680437b8b2-systemd-units\") on node \"crc\" DevicePath \"\"" Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.785049 4903 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/157214e8-fbfe-4e9d-98f4-02680437b8b2-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.785118 4903 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/157214e8-fbfe-4e9d-98f4-02680437b8b2-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.785193 4903 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/157214e8-fbfe-4e9d-98f4-02680437b8b2-host-kubelet\") on node \"crc\" DevicePath \"\"" Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.785256 4903 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/157214e8-fbfe-4e9d-98f4-02680437b8b2-log-socket\") on node \"crc\" DevicePath \"\"" Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.785319 4903 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/157214e8-fbfe-4e9d-98f4-02680437b8b2-run-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.785378 4903 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/157214e8-fbfe-4e9d-98f4-02680437b8b2-host-slash\") on node \"crc\" DevicePath \"\"" Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.785439 4903 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/157214e8-fbfe-4e9d-98f4-02680437b8b2-node-log\") on node \"crc\" DevicePath \"\"" Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.785499 4903 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/157214e8-fbfe-4e9d-98f4-02680437b8b2-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.785552 4903 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/157214e8-fbfe-4e9d-98f4-02680437b8b2-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.785627 4903 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/157214e8-fbfe-4e9d-98f4-02680437b8b2-host-run-ovn-kubernetes\") on node \"crc\" 
DevicePath \"\"" Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.785698 4903 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/157214e8-fbfe-4e9d-98f4-02680437b8b2-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.785756 4903 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/157214e8-fbfe-4e9d-98f4-02680437b8b2-host-cni-netd\") on node \"crc\" DevicePath \"\"" Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.785869 4903 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/157214e8-fbfe-4e9d-98f4-02680437b8b2-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.785079 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/157214e8-fbfe-4e9d-98f4-02680437b8b2-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "157214e8-fbfe-4e9d-98f4-02680437b8b2" (UID: "157214e8-fbfe-4e9d-98f4-02680437b8b2"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.791889 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/157214e8-fbfe-4e9d-98f4-02680437b8b2-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "157214e8-fbfe-4e9d-98f4-02680437b8b2" (UID: "157214e8-fbfe-4e9d-98f4-02680437b8b2"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.793347 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/157214e8-fbfe-4e9d-98f4-02680437b8b2-kube-api-access-9z897" (OuterVolumeSpecName: "kube-api-access-9z897") pod "157214e8-fbfe-4e9d-98f4-02680437b8b2" (UID: "157214e8-fbfe-4e9d-98f4-02680437b8b2"). InnerVolumeSpecName "kube-api-access-9z897". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.812606 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/157214e8-fbfe-4e9d-98f4-02680437b8b2-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "157214e8-fbfe-4e9d-98f4-02680437b8b2" (UID: "157214e8-fbfe-4e9d-98f4-02680437b8b2"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.887132 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3896626d-a3e8-440a-8adb-6330bfccef44-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-ljsfv\" (UID: \"3896626d-a3e8-440a-8adb-6330bfccef44\") " pod="openshift-ovn-kubernetes/ovnkube-node-ljsfv" Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.887223 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3896626d-a3e8-440a-8adb-6330bfccef44-run-openvswitch\") pod \"ovnkube-node-ljsfv\" (UID: \"3896626d-a3e8-440a-8adb-6330bfccef44\") " pod="openshift-ovn-kubernetes/ovnkube-node-ljsfv" Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.887247 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/3896626d-a3e8-440a-8adb-6330bfccef44-host-cni-netd\") pod \"ovnkube-node-ljsfv\" (UID: \"3896626d-a3e8-440a-8adb-6330bfccef44\") " pod="openshift-ovn-kubernetes/ovnkube-node-ljsfv" Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.887288 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/3896626d-a3e8-440a-8adb-6330bfccef44-systemd-units\") pod \"ovnkube-node-ljsfv\" (UID: \"3896626d-a3e8-440a-8adb-6330bfccef44\") " pod="openshift-ovn-kubernetes/ovnkube-node-ljsfv" Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.887313 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3896626d-a3e8-440a-8adb-6330bfccef44-etc-openvswitch\") pod \"ovnkube-node-ljsfv\" (UID: \"3896626d-a3e8-440a-8adb-6330bfccef44\") " pod="openshift-ovn-kubernetes/ovnkube-node-ljsfv" Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.887336 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3896626d-a3e8-440a-8adb-6330bfccef44-host-run-netns\") pod \"ovnkube-node-ljsfv\" (UID: \"3896626d-a3e8-440a-8adb-6330bfccef44\") " pod="openshift-ovn-kubernetes/ovnkube-node-ljsfv" Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.887385 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/3896626d-a3e8-440a-8adb-6330bfccef44-run-ovn\") pod \"ovnkube-node-ljsfv\" (UID: \"3896626d-a3e8-440a-8adb-6330bfccef44\") " pod="openshift-ovn-kubernetes/ovnkube-node-ljsfv" Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.887459 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/3896626d-a3e8-440a-8adb-6330bfccef44-run-systemd\") pod \"ovnkube-node-ljsfv\" (UID: \"3896626d-a3e8-440a-8adb-6330bfccef44\") " pod="openshift-ovn-kubernetes/ovnkube-node-ljsfv" Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.887494 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/3896626d-a3e8-440a-8adb-6330bfccef44-host-run-ovn-kubernetes\") pod \"ovnkube-node-ljsfv\" (UID: \"3896626d-a3e8-440a-8adb-6330bfccef44\") " pod="openshift-ovn-kubernetes/ovnkube-node-ljsfv" Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.887542 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/3896626d-a3e8-440a-8adb-6330bfccef44-log-socket\") pod \"ovnkube-node-ljsfv\" (UID: \"3896626d-a3e8-440a-8adb-6330bfccef44\") " pod="openshift-ovn-kubernetes/ovnkube-node-ljsfv" Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.887563 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/3896626d-a3e8-440a-8adb-6330bfccef44-env-overrides\") pod \"ovnkube-node-ljsfv\" (UID: \"3896626d-a3e8-440a-8adb-6330bfccef44\") " pod="openshift-ovn-kubernetes/ovnkube-node-ljsfv" Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.887712 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/3896626d-a3e8-440a-8adb-6330bfccef44-ovnkube-script-lib\") pod \"ovnkube-node-ljsfv\" (UID: \"3896626d-a3e8-440a-8adb-6330bfccef44\") " pod="openshift-ovn-kubernetes/ovnkube-node-ljsfv" Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.887863 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/3896626d-a3e8-440a-8adb-6330bfccef44-ovn-node-metrics-cert\") pod \"ovnkube-node-ljsfv\" (UID: \"3896626d-a3e8-440a-8adb-6330bfccef44\") " pod="openshift-ovn-kubernetes/ovnkube-node-ljsfv" Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.887916 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bqwc9\" (UniqueName: \"kubernetes.io/projected/3896626d-a3e8-440a-8adb-6330bfccef44-kube-api-access-bqwc9\") pod \"ovnkube-node-ljsfv\" (UID: \"3896626d-a3e8-440a-8adb-6330bfccef44\") " pod="openshift-ovn-kubernetes/ovnkube-node-ljsfv" Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.887963 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/3896626d-a3e8-440a-8adb-6330bfccef44-host-slash\") pod \"ovnkube-node-ljsfv\" (UID: \"3896626d-a3e8-440a-8adb-6330bfccef44\") " pod="openshift-ovn-kubernetes/ovnkube-node-ljsfv" Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.888069 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3896626d-a3e8-440a-8adb-6330bfccef44-var-lib-openvswitch\") pod \"ovnkube-node-ljsfv\" (UID: \"3896626d-a3e8-440a-8adb-6330bfccef44\") " pod="openshift-ovn-kubernetes/ovnkube-node-ljsfv" Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.888203 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3896626d-a3e8-440a-8adb-6330bfccef44-host-cni-bin\") pod \"ovnkube-node-ljsfv\" (UID: \"3896626d-a3e8-440a-8adb-6330bfccef44\") " pod="openshift-ovn-kubernetes/ovnkube-node-ljsfv" Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.888268 4903 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/3896626d-a3e8-440a-8adb-6330bfccef44-host-kubelet\") pod \"ovnkube-node-ljsfv\" (UID: \"3896626d-a3e8-440a-8adb-6330bfccef44\") " pod="openshift-ovn-kubernetes/ovnkube-node-ljsfv" Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.888346 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/3896626d-a3e8-440a-8adb-6330bfccef44-node-log\") pod \"ovnkube-node-ljsfv\" (UID: \"3896626d-a3e8-440a-8adb-6330bfccef44\") " pod="openshift-ovn-kubernetes/ovnkube-node-ljsfv" Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.888396 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/3896626d-a3e8-440a-8adb-6330bfccef44-ovnkube-config\") pod \"ovnkube-node-ljsfv\" (UID: \"3896626d-a3e8-440a-8adb-6330bfccef44\") " pod="openshift-ovn-kubernetes/ovnkube-node-ljsfv" Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.888582 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9z897\" (UniqueName: \"kubernetes.io/projected/157214e8-fbfe-4e9d-98f4-02680437b8b2-kube-api-access-9z897\") on node \"crc\" DevicePath \"\"" Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.888615 4903 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/157214e8-fbfe-4e9d-98f4-02680437b8b2-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.888642 4903 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/157214e8-fbfe-4e9d-98f4-02680437b8b2-run-systemd\") on node \"crc\" DevicePath \"\"" Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.888663 4903 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/157214e8-fbfe-4e9d-98f4-02680437b8b2-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.898921 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-m6k77_157214e8-fbfe-4e9d-98f4-02680437b8b2/ovn-acl-logging/0.log" Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.899999 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-m6k77_157214e8-fbfe-4e9d-98f4-02680437b8b2/ovn-controller/0.log" Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.900472 4903 generic.go:334] "Generic (PLEG): container finished" podID="157214e8-fbfe-4e9d-98f4-02680437b8b2" containerID="6327470934a7b6549e52613fa520fb09d780a1db12171a46876a07a1db70a996" exitCode=0 Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.900511 4903 generic.go:334] "Generic (PLEG): container finished" podID="157214e8-fbfe-4e9d-98f4-02680437b8b2" containerID="1003de0029b8b9db6cb94237a4b5ced815916ebff5f7727dd069073c68d64d9a" exitCode=0 Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.900524 4903 generic.go:334] "Generic (PLEG): container finished" podID="157214e8-fbfe-4e9d-98f4-02680437b8b2" containerID="7dc59852ff7f645c4eb5a214e361ac06060b83944e544934e183ea37408f3fc3" exitCode=0 Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.900533 4903 generic.go:334] "Generic (PLEG): 
container finished" podID="157214e8-fbfe-4e9d-98f4-02680437b8b2" containerID="10b64cc58e0fdc4a422a2afa460d435875a10e372dc8eb7834aec769450b10ea" exitCode=0 Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.900542 4903 generic.go:334] "Generic (PLEG): container finished" podID="157214e8-fbfe-4e9d-98f4-02680437b8b2" containerID="272edfbc487b1d01b2ca4f33197ab2c05e32fdb2f60e683ce8ff908f05fff378" exitCode=0 Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.900552 4903 generic.go:334] "Generic (PLEG): container finished" podID="157214e8-fbfe-4e9d-98f4-02680437b8b2" containerID="b2afe450471abfa96b8c77a0e261c47f541f330d529edd480e109298763ff7bb" exitCode=0 Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.900563 4903 generic.go:334] "Generic (PLEG): container finished" podID="157214e8-fbfe-4e9d-98f4-02680437b8b2" containerID="691b8b593f7e45dee796765204e19ab9fe42adc904e33c692fb316dc994aea10" exitCode=143 Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.900572 4903 generic.go:334] "Generic (PLEG): container finished" podID="157214e8-fbfe-4e9d-98f4-02680437b8b2" containerID="fe3d869acf63a0250c1b26e36a04855dc08643394a8d64c9d4c82054f3f400a0" exitCode=143 Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.900629 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m6k77" event={"ID":"157214e8-fbfe-4e9d-98f4-02680437b8b2","Type":"ContainerDied","Data":"6327470934a7b6549e52613fa520fb09d780a1db12171a46876a07a1db70a996"} Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.900685 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m6k77" event={"ID":"157214e8-fbfe-4e9d-98f4-02680437b8b2","Type":"ContainerDied","Data":"1003de0029b8b9db6cb94237a4b5ced815916ebff5f7727dd069073c68d64d9a"} Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.900710 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m6k77" event={"ID":"157214e8-fbfe-4e9d-98f4-02680437b8b2","Type":"ContainerDied","Data":"7dc59852ff7f645c4eb5a214e361ac06060b83944e544934e183ea37408f3fc3"} Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.900722 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m6k77" event={"ID":"157214e8-fbfe-4e9d-98f4-02680437b8b2","Type":"ContainerDied","Data":"10b64cc58e0fdc4a422a2afa460d435875a10e372dc8eb7834aec769450b10ea"} Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.900735 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m6k77" event={"ID":"157214e8-fbfe-4e9d-98f4-02680437b8b2","Type":"ContainerDied","Data":"272edfbc487b1d01b2ca4f33197ab2c05e32fdb2f60e683ce8ff908f05fff378"} Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.900748 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m6k77" event={"ID":"157214e8-fbfe-4e9d-98f4-02680437b8b2","Type":"ContainerDied","Data":"b2afe450471abfa96b8c77a0e261c47f541f330d529edd480e109298763ff7bb"} Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.900762 4903 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"691b8b593f7e45dee796765204e19ab9fe42adc904e33c692fb316dc994aea10"} Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.900776 4903 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"fe3d869acf63a0250c1b26e36a04855dc08643394a8d64c9d4c82054f3f400a0"} Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.900784 4903 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9f370eaf015ef59b0dc8b779f3532cb49e20dd28f4e8c95f7276722b4ed82362"} Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.900793 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m6k77" event={"ID":"157214e8-fbfe-4e9d-98f4-02680437b8b2","Type":"ContainerDied","Data":"691b8b593f7e45dee796765204e19ab9fe42adc904e33c692fb316dc994aea10"} Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.900819 4903 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6327470934a7b6549e52613fa520fb09d780a1db12171a46876a07a1db70a996"} Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.900827 4903 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1003de0029b8b9db6cb94237a4b5ced815916ebff5f7727dd069073c68d64d9a"} Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.900835 4903 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7dc59852ff7f645c4eb5a214e361ac06060b83944e544934e183ea37408f3fc3"} Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.900843 4903 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"10b64cc58e0fdc4a422a2afa460d435875a10e372dc8eb7834aec769450b10ea"} Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.900850 4903 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"272edfbc487b1d01b2ca4f33197ab2c05e32fdb2f60e683ce8ff908f05fff378"} Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.900858 4903 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b2afe450471abfa96b8c77a0e261c47f541f330d529edd480e109298763ff7bb"} Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.900866 4903 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"691b8b593f7e45dee796765204e19ab9fe42adc904e33c692fb316dc994aea10"} Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.900873 4903 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"fe3d869acf63a0250c1b26e36a04855dc08643394a8d64c9d4c82054f3f400a0"} Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.900881 4903 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9f370eaf015ef59b0dc8b779f3532cb49e20dd28f4e8c95f7276722b4ed82362"} Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.900890 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m6k77" event={"ID":"157214e8-fbfe-4e9d-98f4-02680437b8b2","Type":"ContainerDied","Data":"fe3d869acf63a0250c1b26e36a04855dc08643394a8d64c9d4c82054f3f400a0"} Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.900902 4903 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6327470934a7b6549e52613fa520fb09d780a1db12171a46876a07a1db70a996"} Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.900911 4903 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1003de0029b8b9db6cb94237a4b5ced815916ebff5f7727dd069073c68d64d9a"} Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.900919 4903 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7dc59852ff7f645c4eb5a214e361ac06060b83944e544934e183ea37408f3fc3"} Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.900926 4903 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"10b64cc58e0fdc4a422a2afa460d435875a10e372dc8eb7834aec769450b10ea"} Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.900934 4903 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"272edfbc487b1d01b2ca4f33197ab2c05e32fdb2f60e683ce8ff908f05fff378"} Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.900941 4903 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b2afe450471abfa96b8c77a0e261c47f541f330d529edd480e109298763ff7bb"} Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.900950 4903 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"691b8b593f7e45dee796765204e19ab9fe42adc904e33c692fb316dc994aea10"} Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.900959 4903 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"fe3d869acf63a0250c1b26e36a04855dc08643394a8d64c9d4c82054f3f400a0"} Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.900967 4903 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9f370eaf015ef59b0dc8b779f3532cb49e20dd28f4e8c95f7276722b4ed82362"} Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.900978 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m6k77" event={"ID":"157214e8-fbfe-4e9d-98f4-02680437b8b2","Type":"ContainerDied","Data":"00d95d19399b9744b191ea3eaf5558e95d28488a5dae61846ff9ca9ee4eaf4fc"} Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.900989 4903 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6327470934a7b6549e52613fa520fb09d780a1db12171a46876a07a1db70a996"} Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.900998 4903 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1003de0029b8b9db6cb94237a4b5ced815916ebff5f7727dd069073c68d64d9a"} Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.901005 4903 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7dc59852ff7f645c4eb5a214e361ac06060b83944e544934e183ea37408f3fc3"} Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.901013 4903 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"10b64cc58e0fdc4a422a2afa460d435875a10e372dc8eb7834aec769450b10ea"} Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.901020 4903 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"272edfbc487b1d01b2ca4f33197ab2c05e32fdb2f60e683ce8ff908f05fff378"} Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.901045 4903 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b2afe450471abfa96b8c77a0e261c47f541f330d529edd480e109298763ff7bb"} Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.901053 4903 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"691b8b593f7e45dee796765204e19ab9fe42adc904e33c692fb316dc994aea10"} Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.901060 4903 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"fe3d869acf63a0250c1b26e36a04855dc08643394a8d64c9d4c82054f3f400a0"} Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.901068 4903 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9f370eaf015ef59b0dc8b779f3532cb49e20dd28f4e8c95f7276722b4ed82362"} Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.901086 4903 scope.go:117] "RemoveContainer" containerID="6327470934a7b6549e52613fa520fb09d780a1db12171a46876a07a1db70a996" Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.901317 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-m6k77" Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.906249 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-nzq6s_4f2f8d10-bf3a-48a4-9e71-2d3b5dc2743e/kube-multus/0.log" Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.906320 4903 generic.go:334] "Generic (PLEG): container finished" podID="4f2f8d10-bf3a-48a4-9e71-2d3b5dc2743e" containerID="dc3018f573232eb4d02452fff1c433c9afac6c9c55d49063e1db014d696efef1" exitCode=2 Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.906407 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-nzq6s" event={"ID":"4f2f8d10-bf3a-48a4-9e71-2d3b5dc2743e","Type":"ContainerDied","Data":"dc3018f573232eb4d02452fff1c433c9afac6c9c55d49063e1db014d696efef1"} Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.906873 4903 scope.go:117] "RemoveContainer" containerID="dc3018f573232eb4d02452fff1c433c9afac6c9c55d49063e1db014d696efef1" Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.909019 4903 generic.go:334] "Generic (PLEG): container finished" podID="d878da59-76f0-4401-a6f4-50d6448bff24" containerID="1530cc22c00a95bbbec3a8a8406509740cea1743d34236fd7138d980304f56e0" exitCode=0 Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.909105 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5dkgq" event={"ID":"d878da59-76f0-4401-a6f4-50d6448bff24","Type":"ContainerDied","Data":"1530cc22c00a95bbbec3a8a8406509740cea1743d34236fd7138d980304f56e0"} Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.909171 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5dkgq" event={"ID":"d878da59-76f0-4401-a6f4-50d6448bff24","Type":"ContainerStarted","Data":"11842dfc31e7d337ab9e9626da45ca61b258c125f22d9d84d023958a6b9a6f5b"} Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.912069 4903 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.952026 4903 scope.go:117] "RemoveContainer" containerID="1003de0029b8b9db6cb94237a4b5ced815916ebff5f7727dd069073c68d64d9a" Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 
08:37:08.977916 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-m6k77"] Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.982715 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-m6k77"] Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.984813 4903 scope.go:117] "RemoveContainer" containerID="7dc59852ff7f645c4eb5a214e361ac06060b83944e544934e183ea37408f3fc3" Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.990227 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/3896626d-a3e8-440a-8adb-6330bfccef44-run-systemd\") pod \"ovnkube-node-ljsfv\" (UID: \"3896626d-a3e8-440a-8adb-6330bfccef44\") " pod="openshift-ovn-kubernetes/ovnkube-node-ljsfv" Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.990274 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3896626d-a3e8-440a-8adb-6330bfccef44-host-run-ovn-kubernetes\") pod \"ovnkube-node-ljsfv\" (UID: \"3896626d-a3e8-440a-8adb-6330bfccef44\") " pod="openshift-ovn-kubernetes/ovnkube-node-ljsfv" Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.990304 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/3896626d-a3e8-440a-8adb-6330bfccef44-log-socket\") pod \"ovnkube-node-ljsfv\" (UID: \"3896626d-a3e8-440a-8adb-6330bfccef44\") " pod="openshift-ovn-kubernetes/ovnkube-node-ljsfv" Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.990325 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/3896626d-a3e8-440a-8adb-6330bfccef44-env-overrides\") pod \"ovnkube-node-ljsfv\" (UID: \"3896626d-a3e8-440a-8adb-6330bfccef44\") " pod="openshift-ovn-kubernetes/ovnkube-node-ljsfv" Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.990347 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/3896626d-a3e8-440a-8adb-6330bfccef44-ovnkube-script-lib\") pod \"ovnkube-node-ljsfv\" (UID: \"3896626d-a3e8-440a-8adb-6330bfccef44\") " pod="openshift-ovn-kubernetes/ovnkube-node-ljsfv" Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.990368 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/3896626d-a3e8-440a-8adb-6330bfccef44-ovn-node-metrics-cert\") pod \"ovnkube-node-ljsfv\" (UID: \"3896626d-a3e8-440a-8adb-6330bfccef44\") " pod="openshift-ovn-kubernetes/ovnkube-node-ljsfv" Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.990388 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bqwc9\" (UniqueName: \"kubernetes.io/projected/3896626d-a3e8-440a-8adb-6330bfccef44-kube-api-access-bqwc9\") pod \"ovnkube-node-ljsfv\" (UID: \"3896626d-a3e8-440a-8adb-6330bfccef44\") " pod="openshift-ovn-kubernetes/ovnkube-node-ljsfv" Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.990400 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/3896626d-a3e8-440a-8adb-6330bfccef44-run-systemd\") pod \"ovnkube-node-ljsfv\" (UID: \"3896626d-a3e8-440a-8adb-6330bfccef44\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-ljsfv" Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.990459 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3896626d-a3e8-440a-8adb-6330bfccef44-host-run-ovn-kubernetes\") pod \"ovnkube-node-ljsfv\" (UID: \"3896626d-a3e8-440a-8adb-6330bfccef44\") " pod="openshift-ovn-kubernetes/ovnkube-node-ljsfv" Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.990464 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/3896626d-a3e8-440a-8adb-6330bfccef44-host-slash\") pod \"ovnkube-node-ljsfv\" (UID: \"3896626d-a3e8-440a-8adb-6330bfccef44\") " pod="openshift-ovn-kubernetes/ovnkube-node-ljsfv" Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.990406 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/3896626d-a3e8-440a-8adb-6330bfccef44-host-slash\") pod \"ovnkube-node-ljsfv\" (UID: \"3896626d-a3e8-440a-8adb-6330bfccef44\") " pod="openshift-ovn-kubernetes/ovnkube-node-ljsfv" Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.990500 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/3896626d-a3e8-440a-8adb-6330bfccef44-log-socket\") pod \"ovnkube-node-ljsfv\" (UID: \"3896626d-a3e8-440a-8adb-6330bfccef44\") " pod="openshift-ovn-kubernetes/ovnkube-node-ljsfv" Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.990555 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3896626d-a3e8-440a-8adb-6330bfccef44-var-lib-openvswitch\") pod \"ovnkube-node-ljsfv\" (UID: \"3896626d-a3e8-440a-8adb-6330bfccef44\") " pod="openshift-ovn-kubernetes/ovnkube-node-ljsfv" Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.990686 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3896626d-a3e8-440a-8adb-6330bfccef44-host-cni-bin\") pod \"ovnkube-node-ljsfv\" (UID: \"3896626d-a3e8-440a-8adb-6330bfccef44\") " pod="openshift-ovn-kubernetes/ovnkube-node-ljsfv" Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.990754 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/3896626d-a3e8-440a-8adb-6330bfccef44-host-kubelet\") pod \"ovnkube-node-ljsfv\" (UID: \"3896626d-a3e8-440a-8adb-6330bfccef44\") " pod="openshift-ovn-kubernetes/ovnkube-node-ljsfv" Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.990835 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/3896626d-a3e8-440a-8adb-6330bfccef44-node-log\") pod \"ovnkube-node-ljsfv\" (UID: \"3896626d-a3e8-440a-8adb-6330bfccef44\") " pod="openshift-ovn-kubernetes/ovnkube-node-ljsfv" Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.990882 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/3896626d-a3e8-440a-8adb-6330bfccef44-ovnkube-config\") pod \"ovnkube-node-ljsfv\" (UID: \"3896626d-a3e8-440a-8adb-6330bfccef44\") " pod="openshift-ovn-kubernetes/ovnkube-node-ljsfv" Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.990941 4903 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3896626d-a3e8-440a-8adb-6330bfccef44-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-ljsfv\" (UID: \"3896626d-a3e8-440a-8adb-6330bfccef44\") " pod="openshift-ovn-kubernetes/ovnkube-node-ljsfv" Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.991020 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3896626d-a3e8-440a-8adb-6330bfccef44-run-openvswitch\") pod \"ovnkube-node-ljsfv\" (UID: \"3896626d-a3e8-440a-8adb-6330bfccef44\") " pod="openshift-ovn-kubernetes/ovnkube-node-ljsfv" Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.991096 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/3896626d-a3e8-440a-8adb-6330bfccef44-host-cni-netd\") pod \"ovnkube-node-ljsfv\" (UID: \"3896626d-a3e8-440a-8adb-6330bfccef44\") " pod="openshift-ovn-kubernetes/ovnkube-node-ljsfv" Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.991141 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/3896626d-a3e8-440a-8adb-6330bfccef44-systemd-units\") pod \"ovnkube-node-ljsfv\" (UID: \"3896626d-a3e8-440a-8adb-6330bfccef44\") " pod="openshift-ovn-kubernetes/ovnkube-node-ljsfv" Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.991178 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3896626d-a3e8-440a-8adb-6330bfccef44-etc-openvswitch\") pod \"ovnkube-node-ljsfv\" (UID: \"3896626d-a3e8-440a-8adb-6330bfccef44\") " pod="openshift-ovn-kubernetes/ovnkube-node-ljsfv" Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.991221 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3896626d-a3e8-440a-8adb-6330bfccef44-host-run-netns\") pod \"ovnkube-node-ljsfv\" (UID: \"3896626d-a3e8-440a-8adb-6330bfccef44\") " pod="openshift-ovn-kubernetes/ovnkube-node-ljsfv" Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.991255 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/3896626d-a3e8-440a-8adb-6330bfccef44-run-ovn\") pod \"ovnkube-node-ljsfv\" (UID: \"3896626d-a3e8-440a-8adb-6330bfccef44\") " pod="openshift-ovn-kubernetes/ovnkube-node-ljsfv" Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.991364 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/3896626d-a3e8-440a-8adb-6330bfccef44-run-ovn\") pod \"ovnkube-node-ljsfv\" (UID: \"3896626d-a3e8-440a-8adb-6330bfccef44\") " pod="openshift-ovn-kubernetes/ovnkube-node-ljsfv" Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.991406 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3896626d-a3e8-440a-8adb-6330bfccef44-var-lib-openvswitch\") pod \"ovnkube-node-ljsfv\" (UID: \"3896626d-a3e8-440a-8adb-6330bfccef44\") " pod="openshift-ovn-kubernetes/ovnkube-node-ljsfv" Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.991435 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3896626d-a3e8-440a-8adb-6330bfccef44-host-cni-bin\") pod \"ovnkube-node-ljsfv\" (UID: \"3896626d-a3e8-440a-8adb-6330bfccef44\") " pod="openshift-ovn-kubernetes/ovnkube-node-ljsfv" Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.991465 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/3896626d-a3e8-440a-8adb-6330bfccef44-host-kubelet\") pod \"ovnkube-node-ljsfv\" (UID: \"3896626d-a3e8-440a-8adb-6330bfccef44\") " pod="openshift-ovn-kubernetes/ovnkube-node-ljsfv" Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.991498 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/3896626d-a3e8-440a-8adb-6330bfccef44-node-log\") pod \"ovnkube-node-ljsfv\" (UID: \"3896626d-a3e8-440a-8adb-6330bfccef44\") " pod="openshift-ovn-kubernetes/ovnkube-node-ljsfv" Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.991614 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/3896626d-a3e8-440a-8adb-6330bfccef44-env-overrides\") pod \"ovnkube-node-ljsfv\" (UID: \"3896626d-a3e8-440a-8adb-6330bfccef44\") " pod="openshift-ovn-kubernetes/ovnkube-node-ljsfv" Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.992210 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/3896626d-a3e8-440a-8adb-6330bfccef44-ovnkube-config\") pod \"ovnkube-node-ljsfv\" (UID: \"3896626d-a3e8-440a-8adb-6330bfccef44\") " pod="openshift-ovn-kubernetes/ovnkube-node-ljsfv" Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.992271 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3896626d-a3e8-440a-8adb-6330bfccef44-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-ljsfv\" (UID: \"3896626d-a3e8-440a-8adb-6330bfccef44\") " pod="openshift-ovn-kubernetes/ovnkube-node-ljsfv" Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.992303 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3896626d-a3e8-440a-8adb-6330bfccef44-run-openvswitch\") pod \"ovnkube-node-ljsfv\" (UID: \"3896626d-a3e8-440a-8adb-6330bfccef44\") " pod="openshift-ovn-kubernetes/ovnkube-node-ljsfv" Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.992331 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/3896626d-a3e8-440a-8adb-6330bfccef44-host-cni-netd\") pod \"ovnkube-node-ljsfv\" (UID: \"3896626d-a3e8-440a-8adb-6330bfccef44\") " pod="openshift-ovn-kubernetes/ovnkube-node-ljsfv" Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.992359 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/3896626d-a3e8-440a-8adb-6330bfccef44-systemd-units\") pod \"ovnkube-node-ljsfv\" (UID: \"3896626d-a3e8-440a-8adb-6330bfccef44\") " pod="openshift-ovn-kubernetes/ovnkube-node-ljsfv" Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.992387 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3896626d-a3e8-440a-8adb-6330bfccef44-etc-openvswitch\") pod \"ovnkube-node-ljsfv\" 
(UID: \"3896626d-a3e8-440a-8adb-6330bfccef44\") " pod="openshift-ovn-kubernetes/ovnkube-node-ljsfv" Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.992418 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3896626d-a3e8-440a-8adb-6330bfccef44-host-run-netns\") pod \"ovnkube-node-ljsfv\" (UID: \"3896626d-a3e8-440a-8adb-6330bfccef44\") " pod="openshift-ovn-kubernetes/ovnkube-node-ljsfv" Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.994372 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/3896626d-a3e8-440a-8adb-6330bfccef44-ovnkube-script-lib\") pod \"ovnkube-node-ljsfv\" (UID: \"3896626d-a3e8-440a-8adb-6330bfccef44\") " pod="openshift-ovn-kubernetes/ovnkube-node-ljsfv" Mar 20 08:37:08 crc kubenswrapper[4903]: I0320 08:37:08.997494 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/3896626d-a3e8-440a-8adb-6330bfccef44-ovn-node-metrics-cert\") pod \"ovnkube-node-ljsfv\" (UID: \"3896626d-a3e8-440a-8adb-6330bfccef44\") " pod="openshift-ovn-kubernetes/ovnkube-node-ljsfv" Mar 20 08:37:09 crc kubenswrapper[4903]: I0320 08:37:09.011347 4903 scope.go:117] "RemoveContainer" containerID="10b64cc58e0fdc4a422a2afa460d435875a10e372dc8eb7834aec769450b10ea" Mar 20 08:37:09 crc kubenswrapper[4903]: I0320 08:37:09.012780 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bqwc9\" (UniqueName: \"kubernetes.io/projected/3896626d-a3e8-440a-8adb-6330bfccef44-kube-api-access-bqwc9\") pod \"ovnkube-node-ljsfv\" (UID: \"3896626d-a3e8-440a-8adb-6330bfccef44\") " pod="openshift-ovn-kubernetes/ovnkube-node-ljsfv" Mar 20 08:37:09 crc kubenswrapper[4903]: I0320 08:37:09.022411 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-ljsfv" Mar 20 08:37:09 crc kubenswrapper[4903]: I0320 08:37:09.046670 4903 scope.go:117] "RemoveContainer" containerID="272edfbc487b1d01b2ca4f33197ab2c05e32fdb2f60e683ce8ff908f05fff378" Mar 20 08:37:09 crc kubenswrapper[4903]: I0320 08:37:09.064068 4903 scope.go:117] "RemoveContainer" containerID="b2afe450471abfa96b8c77a0e261c47f541f330d529edd480e109298763ff7bb" Mar 20 08:37:09 crc kubenswrapper[4903]: I0320 08:37:09.082392 4903 scope.go:117] "RemoveContainer" containerID="691b8b593f7e45dee796765204e19ab9fe42adc904e33c692fb316dc994aea10" Mar 20 08:37:09 crc kubenswrapper[4903]: I0320 08:37:09.101664 4903 scope.go:117] "RemoveContainer" containerID="fe3d869acf63a0250c1b26e36a04855dc08643394a8d64c9d4c82054f3f400a0" Mar 20 08:37:09 crc kubenswrapper[4903]: I0320 08:37:09.131956 4903 scope.go:117] "RemoveContainer" containerID="9f370eaf015ef59b0dc8b779f3532cb49e20dd28f4e8c95f7276722b4ed82362" Mar 20 08:37:09 crc kubenswrapper[4903]: I0320 08:37:09.175739 4903 scope.go:117] "RemoveContainer" containerID="6327470934a7b6549e52613fa520fb09d780a1db12171a46876a07a1db70a996" Mar 20 08:37:09 crc kubenswrapper[4903]: E0320 08:37:09.176354 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6327470934a7b6549e52613fa520fb09d780a1db12171a46876a07a1db70a996\": container with ID starting with 6327470934a7b6549e52613fa520fb09d780a1db12171a46876a07a1db70a996 not found: ID does not exist" containerID="6327470934a7b6549e52613fa520fb09d780a1db12171a46876a07a1db70a996" Mar 20 08:37:09 crc kubenswrapper[4903]: I0320 08:37:09.176388 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6327470934a7b6549e52613fa520fb09d780a1db12171a46876a07a1db70a996"} err="failed to get container status \"6327470934a7b6549e52613fa520fb09d780a1db12171a46876a07a1db70a996\": rpc error: code = NotFound desc = could not find container \"6327470934a7b6549e52613fa520fb09d780a1db12171a46876a07a1db70a996\": container with ID starting with 6327470934a7b6549e52613fa520fb09d780a1db12171a46876a07a1db70a996 not found: ID does not exist" Mar 20 08:37:09 crc kubenswrapper[4903]: I0320 08:37:09.176408 4903 scope.go:117] "RemoveContainer" containerID="1003de0029b8b9db6cb94237a4b5ced815916ebff5f7727dd069073c68d64d9a" Mar 20 08:37:09 crc kubenswrapper[4903]: E0320 08:37:09.176766 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1003de0029b8b9db6cb94237a4b5ced815916ebff5f7727dd069073c68d64d9a\": container with ID starting with 1003de0029b8b9db6cb94237a4b5ced815916ebff5f7727dd069073c68d64d9a not found: ID does not exist" containerID="1003de0029b8b9db6cb94237a4b5ced815916ebff5f7727dd069073c68d64d9a" Mar 20 08:37:09 crc kubenswrapper[4903]: I0320 08:37:09.176792 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1003de0029b8b9db6cb94237a4b5ced815916ebff5f7727dd069073c68d64d9a"} err="failed to get container status \"1003de0029b8b9db6cb94237a4b5ced815916ebff5f7727dd069073c68d64d9a\": rpc error: code = NotFound desc = could not find container \"1003de0029b8b9db6cb94237a4b5ced815916ebff5f7727dd069073c68d64d9a\": container with ID starting with 1003de0029b8b9db6cb94237a4b5ced815916ebff5f7727dd069073c68d64d9a not found: ID does not exist" Mar 20 08:37:09 crc kubenswrapper[4903]: I0320 08:37:09.176806 4903 scope.go:117] "RemoveContainer" 
containerID="7dc59852ff7f645c4eb5a214e361ac06060b83944e544934e183ea37408f3fc3" Mar 20 08:37:09 crc kubenswrapper[4903]: E0320 08:37:09.177314 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7dc59852ff7f645c4eb5a214e361ac06060b83944e544934e183ea37408f3fc3\": container with ID starting with 7dc59852ff7f645c4eb5a214e361ac06060b83944e544934e183ea37408f3fc3 not found: ID does not exist" containerID="7dc59852ff7f645c4eb5a214e361ac06060b83944e544934e183ea37408f3fc3" Mar 20 08:37:09 crc kubenswrapper[4903]: I0320 08:37:09.177336 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7dc59852ff7f645c4eb5a214e361ac06060b83944e544934e183ea37408f3fc3"} err="failed to get container status \"7dc59852ff7f645c4eb5a214e361ac06060b83944e544934e183ea37408f3fc3\": rpc error: code = NotFound desc = could not find container \"7dc59852ff7f645c4eb5a214e361ac06060b83944e544934e183ea37408f3fc3\": container with ID starting with 7dc59852ff7f645c4eb5a214e361ac06060b83944e544934e183ea37408f3fc3 not found: ID does not exist" Mar 20 08:37:09 crc kubenswrapper[4903]: I0320 08:37:09.177349 4903 scope.go:117] "RemoveContainer" containerID="10b64cc58e0fdc4a422a2afa460d435875a10e372dc8eb7834aec769450b10ea" Mar 20 08:37:09 crc kubenswrapper[4903]: E0320 08:37:09.177588 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"10b64cc58e0fdc4a422a2afa460d435875a10e372dc8eb7834aec769450b10ea\": container with ID starting with 10b64cc58e0fdc4a422a2afa460d435875a10e372dc8eb7834aec769450b10ea not found: ID does not exist" containerID="10b64cc58e0fdc4a422a2afa460d435875a10e372dc8eb7834aec769450b10ea" Mar 20 08:37:09 crc kubenswrapper[4903]: I0320 08:37:09.177603 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10b64cc58e0fdc4a422a2afa460d435875a10e372dc8eb7834aec769450b10ea"} err="failed to get container status \"10b64cc58e0fdc4a422a2afa460d435875a10e372dc8eb7834aec769450b10ea\": rpc error: code = NotFound desc = could not find container \"10b64cc58e0fdc4a422a2afa460d435875a10e372dc8eb7834aec769450b10ea\": container with ID starting with 10b64cc58e0fdc4a422a2afa460d435875a10e372dc8eb7834aec769450b10ea not found: ID does not exist" Mar 20 08:37:09 crc kubenswrapper[4903]: I0320 08:37:09.177622 4903 scope.go:117] "RemoveContainer" containerID="272edfbc487b1d01b2ca4f33197ab2c05e32fdb2f60e683ce8ff908f05fff378" Mar 20 08:37:09 crc kubenswrapper[4903]: E0320 08:37:09.177903 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"272edfbc487b1d01b2ca4f33197ab2c05e32fdb2f60e683ce8ff908f05fff378\": container with ID starting with 272edfbc487b1d01b2ca4f33197ab2c05e32fdb2f60e683ce8ff908f05fff378 not found: ID does not exist" containerID="272edfbc487b1d01b2ca4f33197ab2c05e32fdb2f60e683ce8ff908f05fff378" Mar 20 08:37:09 crc kubenswrapper[4903]: I0320 08:37:09.177922 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"272edfbc487b1d01b2ca4f33197ab2c05e32fdb2f60e683ce8ff908f05fff378"} err="failed to get container status \"272edfbc487b1d01b2ca4f33197ab2c05e32fdb2f60e683ce8ff908f05fff378\": rpc error: code = NotFound desc = could not find container \"272edfbc487b1d01b2ca4f33197ab2c05e32fdb2f60e683ce8ff908f05fff378\": container with ID starting with 
272edfbc487b1d01b2ca4f33197ab2c05e32fdb2f60e683ce8ff908f05fff378 not found: ID does not exist" Mar 20 08:37:09 crc kubenswrapper[4903]: I0320 08:37:09.177934 4903 scope.go:117] "RemoveContainer" containerID="b2afe450471abfa96b8c77a0e261c47f541f330d529edd480e109298763ff7bb" Mar 20 08:37:09 crc kubenswrapper[4903]: E0320 08:37:09.178153 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b2afe450471abfa96b8c77a0e261c47f541f330d529edd480e109298763ff7bb\": container with ID starting with b2afe450471abfa96b8c77a0e261c47f541f330d529edd480e109298763ff7bb not found: ID does not exist" containerID="b2afe450471abfa96b8c77a0e261c47f541f330d529edd480e109298763ff7bb" Mar 20 08:37:09 crc kubenswrapper[4903]: I0320 08:37:09.178172 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b2afe450471abfa96b8c77a0e261c47f541f330d529edd480e109298763ff7bb"} err="failed to get container status \"b2afe450471abfa96b8c77a0e261c47f541f330d529edd480e109298763ff7bb\": rpc error: code = NotFound desc = could not find container \"b2afe450471abfa96b8c77a0e261c47f541f330d529edd480e109298763ff7bb\": container with ID starting with b2afe450471abfa96b8c77a0e261c47f541f330d529edd480e109298763ff7bb not found: ID does not exist" Mar 20 08:37:09 crc kubenswrapper[4903]: I0320 08:37:09.178184 4903 scope.go:117] "RemoveContainer" containerID="691b8b593f7e45dee796765204e19ab9fe42adc904e33c692fb316dc994aea10" Mar 20 08:37:09 crc kubenswrapper[4903]: E0320 08:37:09.178403 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"691b8b593f7e45dee796765204e19ab9fe42adc904e33c692fb316dc994aea10\": container with ID starting with 691b8b593f7e45dee796765204e19ab9fe42adc904e33c692fb316dc994aea10 not found: ID does not exist" containerID="691b8b593f7e45dee796765204e19ab9fe42adc904e33c692fb316dc994aea10" Mar 20 08:37:09 crc kubenswrapper[4903]: I0320 08:37:09.178421 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"691b8b593f7e45dee796765204e19ab9fe42adc904e33c692fb316dc994aea10"} err="failed to get container status \"691b8b593f7e45dee796765204e19ab9fe42adc904e33c692fb316dc994aea10\": rpc error: code = NotFound desc = could not find container \"691b8b593f7e45dee796765204e19ab9fe42adc904e33c692fb316dc994aea10\": container with ID starting with 691b8b593f7e45dee796765204e19ab9fe42adc904e33c692fb316dc994aea10 not found: ID does not exist" Mar 20 08:37:09 crc kubenswrapper[4903]: I0320 08:37:09.178434 4903 scope.go:117] "RemoveContainer" containerID="fe3d869acf63a0250c1b26e36a04855dc08643394a8d64c9d4c82054f3f400a0" Mar 20 08:37:09 crc kubenswrapper[4903]: E0320 08:37:09.178670 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fe3d869acf63a0250c1b26e36a04855dc08643394a8d64c9d4c82054f3f400a0\": container with ID starting with fe3d869acf63a0250c1b26e36a04855dc08643394a8d64c9d4c82054f3f400a0 not found: ID does not exist" containerID="fe3d869acf63a0250c1b26e36a04855dc08643394a8d64c9d4c82054f3f400a0" Mar 20 08:37:09 crc kubenswrapper[4903]: I0320 08:37:09.178690 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe3d869acf63a0250c1b26e36a04855dc08643394a8d64c9d4c82054f3f400a0"} err="failed to get container status \"fe3d869acf63a0250c1b26e36a04855dc08643394a8d64c9d4c82054f3f400a0\": rpc 
error: code = NotFound desc = could not find container \"fe3d869acf63a0250c1b26e36a04855dc08643394a8d64c9d4c82054f3f400a0\": container with ID starting with fe3d869acf63a0250c1b26e36a04855dc08643394a8d64c9d4c82054f3f400a0 not found: ID does not exist" Mar 20 08:37:09 crc kubenswrapper[4903]: I0320 08:37:09.178707 4903 scope.go:117] "RemoveContainer" containerID="9f370eaf015ef59b0dc8b779f3532cb49e20dd28f4e8c95f7276722b4ed82362" Mar 20 08:37:09 crc kubenswrapper[4903]: E0320 08:37:09.178973 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9f370eaf015ef59b0dc8b779f3532cb49e20dd28f4e8c95f7276722b4ed82362\": container with ID starting with 9f370eaf015ef59b0dc8b779f3532cb49e20dd28f4e8c95f7276722b4ed82362 not found: ID does not exist" containerID="9f370eaf015ef59b0dc8b779f3532cb49e20dd28f4e8c95f7276722b4ed82362" Mar 20 08:37:09 crc kubenswrapper[4903]: I0320 08:37:09.178991 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f370eaf015ef59b0dc8b779f3532cb49e20dd28f4e8c95f7276722b4ed82362"} err="failed to get container status \"9f370eaf015ef59b0dc8b779f3532cb49e20dd28f4e8c95f7276722b4ed82362\": rpc error: code = NotFound desc = could not find container \"9f370eaf015ef59b0dc8b779f3532cb49e20dd28f4e8c95f7276722b4ed82362\": container with ID starting with 9f370eaf015ef59b0dc8b779f3532cb49e20dd28f4e8c95f7276722b4ed82362 not found: ID does not exist" Mar 20 08:37:09 crc kubenswrapper[4903]: I0320 08:37:09.179003 4903 scope.go:117] "RemoveContainer" containerID="6327470934a7b6549e52613fa520fb09d780a1db12171a46876a07a1db70a996" Mar 20 08:37:09 crc kubenswrapper[4903]: I0320 08:37:09.179320 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6327470934a7b6549e52613fa520fb09d780a1db12171a46876a07a1db70a996"} err="failed to get container status \"6327470934a7b6549e52613fa520fb09d780a1db12171a46876a07a1db70a996\": rpc error: code = NotFound desc = could not find container \"6327470934a7b6549e52613fa520fb09d780a1db12171a46876a07a1db70a996\": container with ID starting with 6327470934a7b6549e52613fa520fb09d780a1db12171a46876a07a1db70a996 not found: ID does not exist" Mar 20 08:37:09 crc kubenswrapper[4903]: I0320 08:37:09.179337 4903 scope.go:117] "RemoveContainer" containerID="1003de0029b8b9db6cb94237a4b5ced815916ebff5f7727dd069073c68d64d9a" Mar 20 08:37:09 crc kubenswrapper[4903]: I0320 08:37:09.180465 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1003de0029b8b9db6cb94237a4b5ced815916ebff5f7727dd069073c68d64d9a"} err="failed to get container status \"1003de0029b8b9db6cb94237a4b5ced815916ebff5f7727dd069073c68d64d9a\": rpc error: code = NotFound desc = could not find container \"1003de0029b8b9db6cb94237a4b5ced815916ebff5f7727dd069073c68d64d9a\": container with ID starting with 1003de0029b8b9db6cb94237a4b5ced815916ebff5f7727dd069073c68d64d9a not found: ID does not exist" Mar 20 08:37:09 crc kubenswrapper[4903]: I0320 08:37:09.180486 4903 scope.go:117] "RemoveContainer" containerID="7dc59852ff7f645c4eb5a214e361ac06060b83944e544934e183ea37408f3fc3" Mar 20 08:37:09 crc kubenswrapper[4903]: I0320 08:37:09.180823 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7dc59852ff7f645c4eb5a214e361ac06060b83944e544934e183ea37408f3fc3"} err="failed to get container status \"7dc59852ff7f645c4eb5a214e361ac06060b83944e544934e183ea37408f3fc3\": rpc 
error: code = NotFound desc = could not find container \"7dc59852ff7f645c4eb5a214e361ac06060b83944e544934e183ea37408f3fc3\": container with ID starting with 7dc59852ff7f645c4eb5a214e361ac06060b83944e544934e183ea37408f3fc3 not found: ID does not exist" Mar 20 08:37:09 crc kubenswrapper[4903]: I0320 08:37:09.180839 4903 scope.go:117] "RemoveContainer" containerID="10b64cc58e0fdc4a422a2afa460d435875a10e372dc8eb7834aec769450b10ea" Mar 20 08:37:09 crc kubenswrapper[4903]: I0320 08:37:09.181068 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10b64cc58e0fdc4a422a2afa460d435875a10e372dc8eb7834aec769450b10ea"} err="failed to get container status \"10b64cc58e0fdc4a422a2afa460d435875a10e372dc8eb7834aec769450b10ea\": rpc error: code = NotFound desc = could not find container \"10b64cc58e0fdc4a422a2afa460d435875a10e372dc8eb7834aec769450b10ea\": container with ID starting with 10b64cc58e0fdc4a422a2afa460d435875a10e372dc8eb7834aec769450b10ea not found: ID does not exist" Mar 20 08:37:09 crc kubenswrapper[4903]: I0320 08:37:09.181107 4903 scope.go:117] "RemoveContainer" containerID="272edfbc487b1d01b2ca4f33197ab2c05e32fdb2f60e683ce8ff908f05fff378" Mar 20 08:37:09 crc kubenswrapper[4903]: I0320 08:37:09.181322 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"272edfbc487b1d01b2ca4f33197ab2c05e32fdb2f60e683ce8ff908f05fff378"} err="failed to get container status \"272edfbc487b1d01b2ca4f33197ab2c05e32fdb2f60e683ce8ff908f05fff378\": rpc error: code = NotFound desc = could not find container \"272edfbc487b1d01b2ca4f33197ab2c05e32fdb2f60e683ce8ff908f05fff378\": container with ID starting with 272edfbc487b1d01b2ca4f33197ab2c05e32fdb2f60e683ce8ff908f05fff378 not found: ID does not exist" Mar 20 08:37:09 crc kubenswrapper[4903]: I0320 08:37:09.181338 4903 scope.go:117] "RemoveContainer" containerID="b2afe450471abfa96b8c77a0e261c47f541f330d529edd480e109298763ff7bb" Mar 20 08:37:09 crc kubenswrapper[4903]: I0320 08:37:09.181496 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b2afe450471abfa96b8c77a0e261c47f541f330d529edd480e109298763ff7bb"} err="failed to get container status \"b2afe450471abfa96b8c77a0e261c47f541f330d529edd480e109298763ff7bb\": rpc error: code = NotFound desc = could not find container \"b2afe450471abfa96b8c77a0e261c47f541f330d529edd480e109298763ff7bb\": container with ID starting with b2afe450471abfa96b8c77a0e261c47f541f330d529edd480e109298763ff7bb not found: ID does not exist" Mar 20 08:37:09 crc kubenswrapper[4903]: I0320 08:37:09.181513 4903 scope.go:117] "RemoveContainer" containerID="691b8b593f7e45dee796765204e19ab9fe42adc904e33c692fb316dc994aea10" Mar 20 08:37:09 crc kubenswrapper[4903]: I0320 08:37:09.181875 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"691b8b593f7e45dee796765204e19ab9fe42adc904e33c692fb316dc994aea10"} err="failed to get container status \"691b8b593f7e45dee796765204e19ab9fe42adc904e33c692fb316dc994aea10\": rpc error: code = NotFound desc = could not find container \"691b8b593f7e45dee796765204e19ab9fe42adc904e33c692fb316dc994aea10\": container with ID starting with 691b8b593f7e45dee796765204e19ab9fe42adc904e33c692fb316dc994aea10 not found: ID does not exist" Mar 20 08:37:09 crc kubenswrapper[4903]: I0320 08:37:09.181898 4903 scope.go:117] "RemoveContainer" containerID="fe3d869acf63a0250c1b26e36a04855dc08643394a8d64c9d4c82054f3f400a0" Mar 20 08:37:09 crc 
kubenswrapper[4903]: I0320 08:37:09.182091 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe3d869acf63a0250c1b26e36a04855dc08643394a8d64c9d4c82054f3f400a0"} err="failed to get container status \"fe3d869acf63a0250c1b26e36a04855dc08643394a8d64c9d4c82054f3f400a0\": rpc error: code = NotFound desc = could not find container \"fe3d869acf63a0250c1b26e36a04855dc08643394a8d64c9d4c82054f3f400a0\": container with ID starting with fe3d869acf63a0250c1b26e36a04855dc08643394a8d64c9d4c82054f3f400a0 not found: ID does not exist" Mar 20 08:37:09 crc kubenswrapper[4903]: I0320 08:37:09.182107 4903 scope.go:117] "RemoveContainer" containerID="9f370eaf015ef59b0dc8b779f3532cb49e20dd28f4e8c95f7276722b4ed82362" Mar 20 08:37:09 crc kubenswrapper[4903]: I0320 08:37:09.182269 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f370eaf015ef59b0dc8b779f3532cb49e20dd28f4e8c95f7276722b4ed82362"} err="failed to get container status \"9f370eaf015ef59b0dc8b779f3532cb49e20dd28f4e8c95f7276722b4ed82362\": rpc error: code = NotFound desc = could not find container \"9f370eaf015ef59b0dc8b779f3532cb49e20dd28f4e8c95f7276722b4ed82362\": container with ID starting with 9f370eaf015ef59b0dc8b779f3532cb49e20dd28f4e8c95f7276722b4ed82362 not found: ID does not exist" Mar 20 08:37:09 crc kubenswrapper[4903]: I0320 08:37:09.182287 4903 scope.go:117] "RemoveContainer" containerID="6327470934a7b6549e52613fa520fb09d780a1db12171a46876a07a1db70a996" Mar 20 08:37:09 crc kubenswrapper[4903]: I0320 08:37:09.182455 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6327470934a7b6549e52613fa520fb09d780a1db12171a46876a07a1db70a996"} err="failed to get container status \"6327470934a7b6549e52613fa520fb09d780a1db12171a46876a07a1db70a996\": rpc error: code = NotFound desc = could not find container \"6327470934a7b6549e52613fa520fb09d780a1db12171a46876a07a1db70a996\": container with ID starting with 6327470934a7b6549e52613fa520fb09d780a1db12171a46876a07a1db70a996 not found: ID does not exist" Mar 20 08:37:09 crc kubenswrapper[4903]: I0320 08:37:09.182470 4903 scope.go:117] "RemoveContainer" containerID="1003de0029b8b9db6cb94237a4b5ced815916ebff5f7727dd069073c68d64d9a" Mar 20 08:37:09 crc kubenswrapper[4903]: I0320 08:37:09.182630 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1003de0029b8b9db6cb94237a4b5ced815916ebff5f7727dd069073c68d64d9a"} err="failed to get container status \"1003de0029b8b9db6cb94237a4b5ced815916ebff5f7727dd069073c68d64d9a\": rpc error: code = NotFound desc = could not find container \"1003de0029b8b9db6cb94237a4b5ced815916ebff5f7727dd069073c68d64d9a\": container with ID starting with 1003de0029b8b9db6cb94237a4b5ced815916ebff5f7727dd069073c68d64d9a not found: ID does not exist" Mar 20 08:37:09 crc kubenswrapper[4903]: I0320 08:37:09.182647 4903 scope.go:117] "RemoveContainer" containerID="7dc59852ff7f645c4eb5a214e361ac06060b83944e544934e183ea37408f3fc3" Mar 20 08:37:09 crc kubenswrapper[4903]: I0320 08:37:09.182808 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7dc59852ff7f645c4eb5a214e361ac06060b83944e544934e183ea37408f3fc3"} err="failed to get container status \"7dc59852ff7f645c4eb5a214e361ac06060b83944e544934e183ea37408f3fc3\": rpc error: code = NotFound desc = could not find container \"7dc59852ff7f645c4eb5a214e361ac06060b83944e544934e183ea37408f3fc3\": container with ID 
starting with 7dc59852ff7f645c4eb5a214e361ac06060b83944e544934e183ea37408f3fc3 not found: ID does not exist" Mar 20 08:37:09 crc kubenswrapper[4903]: I0320 08:37:09.182824 4903 scope.go:117] "RemoveContainer" containerID="10b64cc58e0fdc4a422a2afa460d435875a10e372dc8eb7834aec769450b10ea" Mar 20 08:37:09 crc kubenswrapper[4903]: I0320 08:37:09.182997 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10b64cc58e0fdc4a422a2afa460d435875a10e372dc8eb7834aec769450b10ea"} err="failed to get container status \"10b64cc58e0fdc4a422a2afa460d435875a10e372dc8eb7834aec769450b10ea\": rpc error: code = NotFound desc = could not find container \"10b64cc58e0fdc4a422a2afa460d435875a10e372dc8eb7834aec769450b10ea\": container with ID starting with 10b64cc58e0fdc4a422a2afa460d435875a10e372dc8eb7834aec769450b10ea not found: ID does not exist" Mar 20 08:37:09 crc kubenswrapper[4903]: I0320 08:37:09.183012 4903 scope.go:117] "RemoveContainer" containerID="272edfbc487b1d01b2ca4f33197ab2c05e32fdb2f60e683ce8ff908f05fff378" Mar 20 08:37:09 crc kubenswrapper[4903]: I0320 08:37:09.183201 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"272edfbc487b1d01b2ca4f33197ab2c05e32fdb2f60e683ce8ff908f05fff378"} err="failed to get container status \"272edfbc487b1d01b2ca4f33197ab2c05e32fdb2f60e683ce8ff908f05fff378\": rpc error: code = NotFound desc = could not find container \"272edfbc487b1d01b2ca4f33197ab2c05e32fdb2f60e683ce8ff908f05fff378\": container with ID starting with 272edfbc487b1d01b2ca4f33197ab2c05e32fdb2f60e683ce8ff908f05fff378 not found: ID does not exist" Mar 20 08:37:09 crc kubenswrapper[4903]: I0320 08:37:09.183218 4903 scope.go:117] "RemoveContainer" containerID="b2afe450471abfa96b8c77a0e261c47f541f330d529edd480e109298763ff7bb" Mar 20 08:37:09 crc kubenswrapper[4903]: I0320 08:37:09.183371 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b2afe450471abfa96b8c77a0e261c47f541f330d529edd480e109298763ff7bb"} err="failed to get container status \"b2afe450471abfa96b8c77a0e261c47f541f330d529edd480e109298763ff7bb\": rpc error: code = NotFound desc = could not find container \"b2afe450471abfa96b8c77a0e261c47f541f330d529edd480e109298763ff7bb\": container with ID starting with b2afe450471abfa96b8c77a0e261c47f541f330d529edd480e109298763ff7bb not found: ID does not exist" Mar 20 08:37:09 crc kubenswrapper[4903]: I0320 08:37:09.183387 4903 scope.go:117] "RemoveContainer" containerID="691b8b593f7e45dee796765204e19ab9fe42adc904e33c692fb316dc994aea10" Mar 20 08:37:09 crc kubenswrapper[4903]: I0320 08:37:09.183571 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"691b8b593f7e45dee796765204e19ab9fe42adc904e33c692fb316dc994aea10"} err="failed to get container status \"691b8b593f7e45dee796765204e19ab9fe42adc904e33c692fb316dc994aea10\": rpc error: code = NotFound desc = could not find container \"691b8b593f7e45dee796765204e19ab9fe42adc904e33c692fb316dc994aea10\": container with ID starting with 691b8b593f7e45dee796765204e19ab9fe42adc904e33c692fb316dc994aea10 not found: ID does not exist" Mar 20 08:37:09 crc kubenswrapper[4903]: I0320 08:37:09.183591 4903 scope.go:117] "RemoveContainer" containerID="fe3d869acf63a0250c1b26e36a04855dc08643394a8d64c9d4c82054f3f400a0" Mar 20 08:37:09 crc kubenswrapper[4903]: I0320 08:37:09.183757 4903 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"fe3d869acf63a0250c1b26e36a04855dc08643394a8d64c9d4c82054f3f400a0"} err="failed to get container status \"fe3d869acf63a0250c1b26e36a04855dc08643394a8d64c9d4c82054f3f400a0\": rpc error: code = NotFound desc = could not find container \"fe3d869acf63a0250c1b26e36a04855dc08643394a8d64c9d4c82054f3f400a0\": container with ID starting with fe3d869acf63a0250c1b26e36a04855dc08643394a8d64c9d4c82054f3f400a0 not found: ID does not exist" Mar 20 08:37:09 crc kubenswrapper[4903]: I0320 08:37:09.183774 4903 scope.go:117] "RemoveContainer" containerID="9f370eaf015ef59b0dc8b779f3532cb49e20dd28f4e8c95f7276722b4ed82362" Mar 20 08:37:09 crc kubenswrapper[4903]: I0320 08:37:09.183923 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f370eaf015ef59b0dc8b779f3532cb49e20dd28f4e8c95f7276722b4ed82362"} err="failed to get container status \"9f370eaf015ef59b0dc8b779f3532cb49e20dd28f4e8c95f7276722b4ed82362\": rpc error: code = NotFound desc = could not find container \"9f370eaf015ef59b0dc8b779f3532cb49e20dd28f4e8c95f7276722b4ed82362\": container with ID starting with 9f370eaf015ef59b0dc8b779f3532cb49e20dd28f4e8c95f7276722b4ed82362 not found: ID does not exist" Mar 20 08:37:09 crc kubenswrapper[4903]: I0320 08:37:09.183939 4903 scope.go:117] "RemoveContainer" containerID="6327470934a7b6549e52613fa520fb09d780a1db12171a46876a07a1db70a996" Mar 20 08:37:09 crc kubenswrapper[4903]: I0320 08:37:09.184110 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6327470934a7b6549e52613fa520fb09d780a1db12171a46876a07a1db70a996"} err="failed to get container status \"6327470934a7b6549e52613fa520fb09d780a1db12171a46876a07a1db70a996\": rpc error: code = NotFound desc = could not find container \"6327470934a7b6549e52613fa520fb09d780a1db12171a46876a07a1db70a996\": container with ID starting with 6327470934a7b6549e52613fa520fb09d780a1db12171a46876a07a1db70a996 not found: ID does not exist" Mar 20 08:37:09 crc kubenswrapper[4903]: I0320 08:37:09.184125 4903 scope.go:117] "RemoveContainer" containerID="1003de0029b8b9db6cb94237a4b5ced815916ebff5f7727dd069073c68d64d9a" Mar 20 08:37:09 crc kubenswrapper[4903]: I0320 08:37:09.185567 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1003de0029b8b9db6cb94237a4b5ced815916ebff5f7727dd069073c68d64d9a"} err="failed to get container status \"1003de0029b8b9db6cb94237a4b5ced815916ebff5f7727dd069073c68d64d9a\": rpc error: code = NotFound desc = could not find container \"1003de0029b8b9db6cb94237a4b5ced815916ebff5f7727dd069073c68d64d9a\": container with ID starting with 1003de0029b8b9db6cb94237a4b5ced815916ebff5f7727dd069073c68d64d9a not found: ID does not exist" Mar 20 08:37:09 crc kubenswrapper[4903]: I0320 08:37:09.185585 4903 scope.go:117] "RemoveContainer" containerID="7dc59852ff7f645c4eb5a214e361ac06060b83944e544934e183ea37408f3fc3" Mar 20 08:37:09 crc kubenswrapper[4903]: I0320 08:37:09.186018 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7dc59852ff7f645c4eb5a214e361ac06060b83944e544934e183ea37408f3fc3"} err="failed to get container status \"7dc59852ff7f645c4eb5a214e361ac06060b83944e544934e183ea37408f3fc3\": rpc error: code = NotFound desc = could not find container \"7dc59852ff7f645c4eb5a214e361ac06060b83944e544934e183ea37408f3fc3\": container with ID starting with 7dc59852ff7f645c4eb5a214e361ac06060b83944e544934e183ea37408f3fc3 not found: ID does not exist" Mar 
20 08:37:09 crc kubenswrapper[4903]: I0320 08:37:09.186055 4903 scope.go:117] "RemoveContainer" containerID="10b64cc58e0fdc4a422a2afa460d435875a10e372dc8eb7834aec769450b10ea" Mar 20 08:37:09 crc kubenswrapper[4903]: I0320 08:37:09.186368 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10b64cc58e0fdc4a422a2afa460d435875a10e372dc8eb7834aec769450b10ea"} err="failed to get container status \"10b64cc58e0fdc4a422a2afa460d435875a10e372dc8eb7834aec769450b10ea\": rpc error: code = NotFound desc = could not find container \"10b64cc58e0fdc4a422a2afa460d435875a10e372dc8eb7834aec769450b10ea\": container with ID starting with 10b64cc58e0fdc4a422a2afa460d435875a10e372dc8eb7834aec769450b10ea not found: ID does not exist" Mar 20 08:37:09 crc kubenswrapper[4903]: I0320 08:37:09.186384 4903 scope.go:117] "RemoveContainer" containerID="272edfbc487b1d01b2ca4f33197ab2c05e32fdb2f60e683ce8ff908f05fff378" Mar 20 08:37:09 crc kubenswrapper[4903]: I0320 08:37:09.186655 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"272edfbc487b1d01b2ca4f33197ab2c05e32fdb2f60e683ce8ff908f05fff378"} err="failed to get container status \"272edfbc487b1d01b2ca4f33197ab2c05e32fdb2f60e683ce8ff908f05fff378\": rpc error: code = NotFound desc = could not find container \"272edfbc487b1d01b2ca4f33197ab2c05e32fdb2f60e683ce8ff908f05fff378\": container with ID starting with 272edfbc487b1d01b2ca4f33197ab2c05e32fdb2f60e683ce8ff908f05fff378 not found: ID does not exist" Mar 20 08:37:09 crc kubenswrapper[4903]: I0320 08:37:09.186671 4903 scope.go:117] "RemoveContainer" containerID="b2afe450471abfa96b8c77a0e261c47f541f330d529edd480e109298763ff7bb" Mar 20 08:37:09 crc kubenswrapper[4903]: I0320 08:37:09.186903 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b2afe450471abfa96b8c77a0e261c47f541f330d529edd480e109298763ff7bb"} err="failed to get container status \"b2afe450471abfa96b8c77a0e261c47f541f330d529edd480e109298763ff7bb\": rpc error: code = NotFound desc = could not find container \"b2afe450471abfa96b8c77a0e261c47f541f330d529edd480e109298763ff7bb\": container with ID starting with b2afe450471abfa96b8c77a0e261c47f541f330d529edd480e109298763ff7bb not found: ID does not exist" Mar 20 08:37:09 crc kubenswrapper[4903]: I0320 08:37:09.186920 4903 scope.go:117] "RemoveContainer" containerID="691b8b593f7e45dee796765204e19ab9fe42adc904e33c692fb316dc994aea10" Mar 20 08:37:09 crc kubenswrapper[4903]: I0320 08:37:09.187190 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"691b8b593f7e45dee796765204e19ab9fe42adc904e33c692fb316dc994aea10"} err="failed to get container status \"691b8b593f7e45dee796765204e19ab9fe42adc904e33c692fb316dc994aea10\": rpc error: code = NotFound desc = could not find container \"691b8b593f7e45dee796765204e19ab9fe42adc904e33c692fb316dc994aea10\": container with ID starting with 691b8b593f7e45dee796765204e19ab9fe42adc904e33c692fb316dc994aea10 not found: ID does not exist" Mar 20 08:37:09 crc kubenswrapper[4903]: I0320 08:37:09.187207 4903 scope.go:117] "RemoveContainer" containerID="fe3d869acf63a0250c1b26e36a04855dc08643394a8d64c9d4c82054f3f400a0" Mar 20 08:37:09 crc kubenswrapper[4903]: I0320 08:37:09.187439 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe3d869acf63a0250c1b26e36a04855dc08643394a8d64c9d4c82054f3f400a0"} err="failed to get container status 
\"fe3d869acf63a0250c1b26e36a04855dc08643394a8d64c9d4c82054f3f400a0\": rpc error: code = NotFound desc = could not find container \"fe3d869acf63a0250c1b26e36a04855dc08643394a8d64c9d4c82054f3f400a0\": container with ID starting with fe3d869acf63a0250c1b26e36a04855dc08643394a8d64c9d4c82054f3f400a0 not found: ID does not exist" Mar 20 08:37:09 crc kubenswrapper[4903]: I0320 08:37:09.187458 4903 scope.go:117] "RemoveContainer" containerID="9f370eaf015ef59b0dc8b779f3532cb49e20dd28f4e8c95f7276722b4ed82362" Mar 20 08:37:09 crc kubenswrapper[4903]: I0320 08:37:09.187659 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f370eaf015ef59b0dc8b779f3532cb49e20dd28f4e8c95f7276722b4ed82362"} err="failed to get container status \"9f370eaf015ef59b0dc8b779f3532cb49e20dd28f4e8c95f7276722b4ed82362\": rpc error: code = NotFound desc = could not find container \"9f370eaf015ef59b0dc8b779f3532cb49e20dd28f4e8c95f7276722b4ed82362\": container with ID starting with 9f370eaf015ef59b0dc8b779f3532cb49e20dd28f4e8c95f7276722b4ed82362 not found: ID does not exist" Mar 20 08:37:09 crc kubenswrapper[4903]: I0320 08:37:09.187726 4903 scope.go:117] "RemoveContainer" containerID="6327470934a7b6549e52613fa520fb09d780a1db12171a46876a07a1db70a996" Mar 20 08:37:09 crc kubenswrapper[4903]: I0320 08:37:09.188686 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6327470934a7b6549e52613fa520fb09d780a1db12171a46876a07a1db70a996"} err="failed to get container status \"6327470934a7b6549e52613fa520fb09d780a1db12171a46876a07a1db70a996\": rpc error: code = NotFound desc = could not find container \"6327470934a7b6549e52613fa520fb09d780a1db12171a46876a07a1db70a996\": container with ID starting with 6327470934a7b6549e52613fa520fb09d780a1db12171a46876a07a1db70a996 not found: ID does not exist" Mar 20 08:37:09 crc kubenswrapper[4903]: I0320 08:37:09.188708 4903 scope.go:117] "RemoveContainer" containerID="1003de0029b8b9db6cb94237a4b5ced815916ebff5f7727dd069073c68d64d9a" Mar 20 08:37:09 crc kubenswrapper[4903]: I0320 08:37:09.188974 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1003de0029b8b9db6cb94237a4b5ced815916ebff5f7727dd069073c68d64d9a"} err="failed to get container status \"1003de0029b8b9db6cb94237a4b5ced815916ebff5f7727dd069073c68d64d9a\": rpc error: code = NotFound desc = could not find container \"1003de0029b8b9db6cb94237a4b5ced815916ebff5f7727dd069073c68d64d9a\": container with ID starting with 1003de0029b8b9db6cb94237a4b5ced815916ebff5f7727dd069073c68d64d9a not found: ID does not exist" Mar 20 08:37:09 crc kubenswrapper[4903]: I0320 08:37:09.188990 4903 scope.go:117] "RemoveContainer" containerID="7dc59852ff7f645c4eb5a214e361ac06060b83944e544934e183ea37408f3fc3" Mar 20 08:37:09 crc kubenswrapper[4903]: I0320 08:37:09.189360 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7dc59852ff7f645c4eb5a214e361ac06060b83944e544934e183ea37408f3fc3"} err="failed to get container status \"7dc59852ff7f645c4eb5a214e361ac06060b83944e544934e183ea37408f3fc3\": rpc error: code = NotFound desc = could not find container \"7dc59852ff7f645c4eb5a214e361ac06060b83944e544934e183ea37408f3fc3\": container with ID starting with 7dc59852ff7f645c4eb5a214e361ac06060b83944e544934e183ea37408f3fc3 not found: ID does not exist" Mar 20 08:37:09 crc kubenswrapper[4903]: I0320 08:37:09.189380 4903 scope.go:117] "RemoveContainer" 
containerID="10b64cc58e0fdc4a422a2afa460d435875a10e372dc8eb7834aec769450b10ea" Mar 20 08:37:09 crc kubenswrapper[4903]: I0320 08:37:09.189607 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10b64cc58e0fdc4a422a2afa460d435875a10e372dc8eb7834aec769450b10ea"} err="failed to get container status \"10b64cc58e0fdc4a422a2afa460d435875a10e372dc8eb7834aec769450b10ea\": rpc error: code = NotFound desc = could not find container \"10b64cc58e0fdc4a422a2afa460d435875a10e372dc8eb7834aec769450b10ea\": container with ID starting with 10b64cc58e0fdc4a422a2afa460d435875a10e372dc8eb7834aec769450b10ea not found: ID does not exist" Mar 20 08:37:09 crc kubenswrapper[4903]: I0320 08:37:09.189623 4903 scope.go:117] "RemoveContainer" containerID="272edfbc487b1d01b2ca4f33197ab2c05e32fdb2f60e683ce8ff908f05fff378" Mar 20 08:37:09 crc kubenswrapper[4903]: I0320 08:37:09.189837 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"272edfbc487b1d01b2ca4f33197ab2c05e32fdb2f60e683ce8ff908f05fff378"} err="failed to get container status \"272edfbc487b1d01b2ca4f33197ab2c05e32fdb2f60e683ce8ff908f05fff378\": rpc error: code = NotFound desc = could not find container \"272edfbc487b1d01b2ca4f33197ab2c05e32fdb2f60e683ce8ff908f05fff378\": container with ID starting with 272edfbc487b1d01b2ca4f33197ab2c05e32fdb2f60e683ce8ff908f05fff378 not found: ID does not exist" Mar 20 08:37:09 crc kubenswrapper[4903]: I0320 08:37:09.189854 4903 scope.go:117] "RemoveContainer" containerID="b2afe450471abfa96b8c77a0e261c47f541f330d529edd480e109298763ff7bb" Mar 20 08:37:09 crc kubenswrapper[4903]: I0320 08:37:09.190214 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b2afe450471abfa96b8c77a0e261c47f541f330d529edd480e109298763ff7bb"} err="failed to get container status \"b2afe450471abfa96b8c77a0e261c47f541f330d529edd480e109298763ff7bb\": rpc error: code = NotFound desc = could not find container \"b2afe450471abfa96b8c77a0e261c47f541f330d529edd480e109298763ff7bb\": container with ID starting with b2afe450471abfa96b8c77a0e261c47f541f330d529edd480e109298763ff7bb not found: ID does not exist" Mar 20 08:37:09 crc kubenswrapper[4903]: I0320 08:37:09.512028 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="157214e8-fbfe-4e9d-98f4-02680437b8b2" path="/var/lib/kubelet/pods/157214e8-fbfe-4e9d-98f4-02680437b8b2/volumes" Mar 20 08:37:09 crc kubenswrapper[4903]: I0320 08:37:09.918353 4903 generic.go:334] "Generic (PLEG): container finished" podID="3896626d-a3e8-440a-8adb-6330bfccef44" containerID="8827e9fbe7f6d7acdadf3172a4223a65db963c5e915b904dfd37e4df354b25b5" exitCode=0 Mar 20 08:37:09 crc kubenswrapper[4903]: I0320 08:37:09.918446 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ljsfv" event={"ID":"3896626d-a3e8-440a-8adb-6330bfccef44","Type":"ContainerDied","Data":"8827e9fbe7f6d7acdadf3172a4223a65db963c5e915b904dfd37e4df354b25b5"} Mar 20 08:37:09 crc kubenswrapper[4903]: I0320 08:37:09.918782 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ljsfv" event={"ID":"3896626d-a3e8-440a-8adb-6330bfccef44","Type":"ContainerStarted","Data":"87d01a6e28683994f3a3f8067835727fd0f01da39ee209993ab89ab094bea956"} Mar 20 08:37:09 crc kubenswrapper[4903]: I0320 08:37:09.921781 4903 generic.go:334] "Generic (PLEG): container finished" podID="d878da59-76f0-4401-a6f4-50d6448bff24" 
containerID="4f9bfa6e345b28186acd92b50361ec579547e4cdbabd670cad54bdb087cccd95" exitCode=0 Mar 20 08:37:09 crc kubenswrapper[4903]: I0320 08:37:09.921847 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5dkgq" event={"ID":"d878da59-76f0-4401-a6f4-50d6448bff24","Type":"ContainerDied","Data":"4f9bfa6e345b28186acd92b50361ec579547e4cdbabd670cad54bdb087cccd95"} Mar 20 08:37:09 crc kubenswrapper[4903]: I0320 08:37:09.931525 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-nzq6s_4f2f8d10-bf3a-48a4-9e71-2d3b5dc2743e/kube-multus/0.log" Mar 20 08:37:09 crc kubenswrapper[4903]: I0320 08:37:09.931580 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-nzq6s" event={"ID":"4f2f8d10-bf3a-48a4-9e71-2d3b5dc2743e","Type":"ContainerStarted","Data":"0356f987ec7230f2bacad08e916f89c7541d954ab2f6e644e2b069d53960de3f"} Mar 20 08:37:10 crc kubenswrapper[4903]: I0320 08:37:10.940528 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ljsfv" event={"ID":"3896626d-a3e8-440a-8adb-6330bfccef44","Type":"ContainerStarted","Data":"d79ca506e5b6bf405c0b413489eb17e565318a9c0117ca1f9f0f8c9e1a275778"} Mar 20 08:37:10 crc kubenswrapper[4903]: I0320 08:37:10.941437 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ljsfv" event={"ID":"3896626d-a3e8-440a-8adb-6330bfccef44","Type":"ContainerStarted","Data":"ea07add2065b08967448bd019443d8c77ea26cdf5598d06e7b2edc8a304813f5"} Mar 20 08:37:10 crc kubenswrapper[4903]: I0320 08:37:10.941450 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ljsfv" event={"ID":"3896626d-a3e8-440a-8adb-6330bfccef44","Type":"ContainerStarted","Data":"ab162e562283fc3b5852389d590b532654b50e4a23db1cb0cceeb50b28d6d97c"} Mar 20 08:37:10 crc kubenswrapper[4903]: I0320 08:37:10.941462 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ljsfv" event={"ID":"3896626d-a3e8-440a-8adb-6330bfccef44","Type":"ContainerStarted","Data":"a9eaeff621cce694b5ce724766a670cec5c2474ffe717c5637a9bd57ad693739"} Mar 20 08:37:10 crc kubenswrapper[4903]: I0320 08:37:10.941475 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ljsfv" event={"ID":"3896626d-a3e8-440a-8adb-6330bfccef44","Type":"ContainerStarted","Data":"eebfb9c7b082bd1e3ff44cf6d5cf2d39065ac142563a2e876bd35854ea1bff4d"} Mar 20 08:37:10 crc kubenswrapper[4903]: I0320 08:37:10.941485 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ljsfv" event={"ID":"3896626d-a3e8-440a-8adb-6330bfccef44","Type":"ContainerStarted","Data":"432907291b2991446bd4c30364752998d91b1abde30f8ed310a402aad12297a4"} Mar 20 08:37:10 crc kubenswrapper[4903]: I0320 08:37:10.944439 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5dkgq" event={"ID":"d878da59-76f0-4401-a6f4-50d6448bff24","Type":"ContainerStarted","Data":"489db84a2197c9f18e8fb1d66747f479f5da926a7eddcaba5ede15f872e50a95"} Mar 20 08:37:10 crc kubenswrapper[4903]: I0320 08:37:10.968497 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-5dkgq" podStartSLOduration=2.439146545 podStartE2EDuration="3.968469124s" podCreationTimestamp="2026-03-20 08:37:07 +0000 UTC" firstStartedPulling="2026-03-20 08:37:08.91160814 +0000 UTC 
m=+854.128508465" lastFinishedPulling="2026-03-20 08:37:10.440930729 +0000 UTC m=+855.657831044" observedRunningTime="2026-03-20 08:37:10.964696808 +0000 UTC m=+856.181597123" watchObservedRunningTime="2026-03-20 08:37:10.968469124 +0000 UTC m=+856.185369439" Mar 20 08:37:12 crc kubenswrapper[4903]: I0320 08:37:12.442459 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-p94lk"] Mar 20 08:37:12 crc kubenswrapper[4903]: I0320 08:37:12.443613 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-p94lk" Mar 20 08:37:12 crc kubenswrapper[4903]: I0320 08:37:12.550027 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9xrgr\" (UniqueName: \"kubernetes.io/projected/6fe8ee60-f750-4170-932f-7dbf96f643e8-kube-api-access-9xrgr\") pod \"redhat-operators-p94lk\" (UID: \"6fe8ee60-f750-4170-932f-7dbf96f643e8\") " pod="openshift-marketplace/redhat-operators-p94lk" Mar 20 08:37:12 crc kubenswrapper[4903]: I0320 08:37:12.550593 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6fe8ee60-f750-4170-932f-7dbf96f643e8-utilities\") pod \"redhat-operators-p94lk\" (UID: \"6fe8ee60-f750-4170-932f-7dbf96f643e8\") " pod="openshift-marketplace/redhat-operators-p94lk" Mar 20 08:37:12 crc kubenswrapper[4903]: I0320 08:37:12.550616 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6fe8ee60-f750-4170-932f-7dbf96f643e8-catalog-content\") pod \"redhat-operators-p94lk\" (UID: \"6fe8ee60-f750-4170-932f-7dbf96f643e8\") " pod="openshift-marketplace/redhat-operators-p94lk" Mar 20 08:37:12 crc kubenswrapper[4903]: I0320 08:37:12.652264 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9xrgr\" (UniqueName: \"kubernetes.io/projected/6fe8ee60-f750-4170-932f-7dbf96f643e8-kube-api-access-9xrgr\") pod \"redhat-operators-p94lk\" (UID: \"6fe8ee60-f750-4170-932f-7dbf96f643e8\") " pod="openshift-marketplace/redhat-operators-p94lk" Mar 20 08:37:12 crc kubenswrapper[4903]: I0320 08:37:12.652388 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6fe8ee60-f750-4170-932f-7dbf96f643e8-utilities\") pod \"redhat-operators-p94lk\" (UID: \"6fe8ee60-f750-4170-932f-7dbf96f643e8\") " pod="openshift-marketplace/redhat-operators-p94lk" Mar 20 08:37:12 crc kubenswrapper[4903]: I0320 08:37:12.652412 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6fe8ee60-f750-4170-932f-7dbf96f643e8-catalog-content\") pod \"redhat-operators-p94lk\" (UID: \"6fe8ee60-f750-4170-932f-7dbf96f643e8\") " pod="openshift-marketplace/redhat-operators-p94lk" Mar 20 08:37:12 crc kubenswrapper[4903]: I0320 08:37:12.653729 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6fe8ee60-f750-4170-932f-7dbf96f643e8-catalog-content\") pod \"redhat-operators-p94lk\" (UID: \"6fe8ee60-f750-4170-932f-7dbf96f643e8\") " pod="openshift-marketplace/redhat-operators-p94lk" Mar 20 08:37:12 crc kubenswrapper[4903]: I0320 08:37:12.653784 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" 
(UniqueName: \"kubernetes.io/empty-dir/6fe8ee60-f750-4170-932f-7dbf96f643e8-utilities\") pod \"redhat-operators-p94lk\" (UID: \"6fe8ee60-f750-4170-932f-7dbf96f643e8\") " pod="openshift-marketplace/redhat-operators-p94lk" Mar 20 08:37:12 crc kubenswrapper[4903]: I0320 08:37:12.696321 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9xrgr\" (UniqueName: \"kubernetes.io/projected/6fe8ee60-f750-4170-932f-7dbf96f643e8-kube-api-access-9xrgr\") pod \"redhat-operators-p94lk\" (UID: \"6fe8ee60-f750-4170-932f-7dbf96f643e8\") " pod="openshift-marketplace/redhat-operators-p94lk" Mar 20 08:37:12 crc kubenswrapper[4903]: I0320 08:37:12.817878 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-p94lk" Mar 20 08:37:12 crc kubenswrapper[4903]: E0320 08:37:12.865389 4903 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_redhat-operators-p94lk_openshift-marketplace_6fe8ee60-f750-4170-932f-7dbf96f643e8_0(a175994ea1bbdae93b3c7542ab92aa9f2d35e64385c9389673000f3da3d145a0): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 08:37:12 crc kubenswrapper[4903]: E0320 08:37:12.865475 4903 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_redhat-operators-p94lk_openshift-marketplace_6fe8ee60-f750-4170-932f-7dbf96f643e8_0(a175994ea1bbdae93b3c7542ab92aa9f2d35e64385c9389673000f3da3d145a0): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-marketplace/redhat-operators-p94lk" Mar 20 08:37:12 crc kubenswrapper[4903]: E0320 08:37:12.865505 4903 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_redhat-operators-p94lk_openshift-marketplace_6fe8ee60-f750-4170-932f-7dbf96f643e8_0(a175994ea1bbdae93b3c7542ab92aa9f2d35e64385c9389673000f3da3d145a0): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-marketplace/redhat-operators-p94lk" Mar 20 08:37:12 crc kubenswrapper[4903]: E0320 08:37:12.865565 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"redhat-operators-p94lk_openshift-marketplace(6fe8ee60-f750-4170-932f-7dbf96f643e8)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"redhat-operators-p94lk_openshift-marketplace(6fe8ee60-f750-4170-932f-7dbf96f643e8)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_redhat-operators-p94lk_openshift-marketplace_6fe8ee60-f750-4170-932f-7dbf96f643e8_0(a175994ea1bbdae93b3c7542ab92aa9f2d35e64385c9389673000f3da3d145a0): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-marketplace/redhat-operators-p94lk" podUID="6fe8ee60-f750-4170-932f-7dbf96f643e8" Mar 20 08:37:13 crc kubenswrapper[4903]: I0320 08:37:13.967353 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ljsfv" event={"ID":"3896626d-a3e8-440a-8adb-6330bfccef44","Type":"ContainerStarted","Data":"5a2ee7ecdc384bed08d63ed599914f27fdb956a7b5179c29d364d5b75b371bea"} Mar 20 08:37:14 crc kubenswrapper[4903]: I0320 08:37:14.638644 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-w5jv2"] Mar 20 08:37:14 crc kubenswrapper[4903]: I0320 08:37:14.640652 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-w5jv2" Mar 20 08:37:14 crc kubenswrapper[4903]: I0320 08:37:14.683413 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/34abeb10-0f26-4b84-8c5d-1867ab464452-utilities\") pod \"community-operators-w5jv2\" (UID: \"34abeb10-0f26-4b84-8c5d-1867ab464452\") " pod="openshift-marketplace/community-operators-w5jv2" Mar 20 08:37:14 crc kubenswrapper[4903]: I0320 08:37:14.684145 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zdth4\" (UniqueName: \"kubernetes.io/projected/34abeb10-0f26-4b84-8c5d-1867ab464452-kube-api-access-zdth4\") pod \"community-operators-w5jv2\" (UID: \"34abeb10-0f26-4b84-8c5d-1867ab464452\") " pod="openshift-marketplace/community-operators-w5jv2" Mar 20 08:37:14 crc kubenswrapper[4903]: I0320 08:37:14.684407 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/34abeb10-0f26-4b84-8c5d-1867ab464452-catalog-content\") pod \"community-operators-w5jv2\" (UID: \"34abeb10-0f26-4b84-8c5d-1867ab464452\") " pod="openshift-marketplace/community-operators-w5jv2" Mar 20 08:37:14 crc kubenswrapper[4903]: I0320 08:37:14.785806 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/34abeb10-0f26-4b84-8c5d-1867ab464452-utilities\") pod \"community-operators-w5jv2\" (UID: \"34abeb10-0f26-4b84-8c5d-1867ab464452\") " pod="openshift-marketplace/community-operators-w5jv2" Mar 20 08:37:14 crc kubenswrapper[4903]: I0320 08:37:14.785905 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zdth4\" (UniqueName: \"kubernetes.io/projected/34abeb10-0f26-4b84-8c5d-1867ab464452-kube-api-access-zdth4\") pod \"community-operators-w5jv2\" (UID: \"34abeb10-0f26-4b84-8c5d-1867ab464452\") " pod="openshift-marketplace/community-operators-w5jv2" Mar 20 08:37:14 crc kubenswrapper[4903]: I0320 08:37:14.785935 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/34abeb10-0f26-4b84-8c5d-1867ab464452-catalog-content\") pod \"community-operators-w5jv2\" (UID: \"34abeb10-0f26-4b84-8c5d-1867ab464452\") " pod="openshift-marketplace/community-operators-w5jv2" Mar 20 08:37:14 crc kubenswrapper[4903]: I0320 08:37:14.786504 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/34abeb10-0f26-4b84-8c5d-1867ab464452-catalog-content\") pod \"community-operators-w5jv2\" (UID: 
\"34abeb10-0f26-4b84-8c5d-1867ab464452\") " pod="openshift-marketplace/community-operators-w5jv2" Mar 20 08:37:14 crc kubenswrapper[4903]: I0320 08:37:14.787267 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/34abeb10-0f26-4b84-8c5d-1867ab464452-utilities\") pod \"community-operators-w5jv2\" (UID: \"34abeb10-0f26-4b84-8c5d-1867ab464452\") " pod="openshift-marketplace/community-operators-w5jv2" Mar 20 08:37:14 crc kubenswrapper[4903]: I0320 08:37:14.824120 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zdth4\" (UniqueName: \"kubernetes.io/projected/34abeb10-0f26-4b84-8c5d-1867ab464452-kube-api-access-zdth4\") pod \"community-operators-w5jv2\" (UID: \"34abeb10-0f26-4b84-8c5d-1867ab464452\") " pod="openshift-marketplace/community-operators-w5jv2" Mar 20 08:37:14 crc kubenswrapper[4903]: I0320 08:37:14.963699 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-w5jv2" Mar 20 08:37:15 crc kubenswrapper[4903]: E0320 08:37:15.008069 4903 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_community-operators-w5jv2_openshift-marketplace_34abeb10-0f26-4b84-8c5d-1867ab464452_0(93e16e22bd0e136e7bd32e04d324a8a5c1900711b9ce111a9453f288bb3d9093): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 08:37:15 crc kubenswrapper[4903]: E0320 08:37:15.008158 4903 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_community-operators-w5jv2_openshift-marketplace_34abeb10-0f26-4b84-8c5d-1867ab464452_0(93e16e22bd0e136e7bd32e04d324a8a5c1900711b9ce111a9453f288bb3d9093): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-marketplace/community-operators-w5jv2" Mar 20 08:37:15 crc kubenswrapper[4903]: E0320 08:37:15.008189 4903 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_community-operators-w5jv2_openshift-marketplace_34abeb10-0f26-4b84-8c5d-1867ab464452_0(93e16e22bd0e136e7bd32e04d324a8a5c1900711b9ce111a9453f288bb3d9093): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-marketplace/community-operators-w5jv2" Mar 20 08:37:15 crc kubenswrapper[4903]: E0320 08:37:15.008250 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"community-operators-w5jv2_openshift-marketplace(34abeb10-0f26-4b84-8c5d-1867ab464452)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"community-operators-w5jv2_openshift-marketplace(34abeb10-0f26-4b84-8c5d-1867ab464452)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_community-operators-w5jv2_openshift-marketplace_34abeb10-0f26-4b84-8c5d-1867ab464452_0(93e16e22bd0e136e7bd32e04d324a8a5c1900711b9ce111a9453f288bb3d9093): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-marketplace/community-operators-w5jv2" podUID="34abeb10-0f26-4b84-8c5d-1867ab464452" Mar 20 08:37:16 crc kubenswrapper[4903]: I0320 08:37:16.548897 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ljsfv" event={"ID":"3896626d-a3e8-440a-8adb-6330bfccef44","Type":"ContainerStarted","Data":"49392c12b892064f2a8c2e4da20ebad3f57d85953fc9997747e02dc88780b9d7"} Mar 20 08:37:16 crc kubenswrapper[4903]: I0320 08:37:16.549430 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-ljsfv" Mar 20 08:37:16 crc kubenswrapper[4903]: I0320 08:37:16.566589 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-ndjr5"] Mar 20 08:37:16 crc kubenswrapper[4903]: I0320 08:37:16.567438 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-ndjr5" Mar 20 08:37:16 crc kubenswrapper[4903]: I0320 08:37:16.570212 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Mar 20 08:37:16 crc kubenswrapper[4903]: I0320 08:37:16.570249 4903 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-bpmcw" Mar 20 08:37:16 crc kubenswrapper[4903]: I0320 08:37:16.570509 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Mar 20 08:37:16 crc kubenswrapper[4903]: I0320 08:37:16.570623 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Mar 20 08:37:16 crc kubenswrapper[4903]: I0320 08:37:16.610829 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-ljsfv" podStartSLOduration=8.610808772 podStartE2EDuration="8.610808772s" podCreationTimestamp="2026-03-20 08:37:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:37:16.605211941 +0000 UTC m=+861.822112266" watchObservedRunningTime="2026-03-20 08:37:16.610808772 +0000 UTC m=+861.827709107" Mar 20 08:37:16 crc kubenswrapper[4903]: I0320 08:37:16.614387 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-ljsfv" Mar 20 08:37:16 crc kubenswrapper[4903]: I0320 08:37:16.719864 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g78f7\" (UniqueName: \"kubernetes.io/projected/6cc44780-3165-4335-9a85-4435d9b63ba1-kube-api-access-g78f7\") pod \"crc-storage-crc-ndjr5\" (UID: \"6cc44780-3165-4335-9a85-4435d9b63ba1\") " pod="crc-storage/crc-storage-crc-ndjr5" Mar 20 08:37:16 crc kubenswrapper[4903]: I0320 08:37:16.720120 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/6cc44780-3165-4335-9a85-4435d9b63ba1-node-mnt\") pod \"crc-storage-crc-ndjr5\" (UID: \"6cc44780-3165-4335-9a85-4435d9b63ba1\") " pod="crc-storage/crc-storage-crc-ndjr5" Mar 20 08:37:16 crc kubenswrapper[4903]: I0320 08:37:16.720273 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/6cc44780-3165-4335-9a85-4435d9b63ba1-crc-storage\") pod \"crc-storage-crc-ndjr5\" (UID: \"6cc44780-3165-4335-9a85-4435d9b63ba1\") " 
pod="crc-storage/crc-storage-crc-ndjr5" Mar 20 08:37:16 crc kubenswrapper[4903]: I0320 08:37:16.822012 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/6cc44780-3165-4335-9a85-4435d9b63ba1-crc-storage\") pod \"crc-storage-crc-ndjr5\" (UID: \"6cc44780-3165-4335-9a85-4435d9b63ba1\") " pod="crc-storage/crc-storage-crc-ndjr5" Mar 20 08:37:16 crc kubenswrapper[4903]: I0320 08:37:16.822252 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g78f7\" (UniqueName: \"kubernetes.io/projected/6cc44780-3165-4335-9a85-4435d9b63ba1-kube-api-access-g78f7\") pod \"crc-storage-crc-ndjr5\" (UID: \"6cc44780-3165-4335-9a85-4435d9b63ba1\") " pod="crc-storage/crc-storage-crc-ndjr5" Mar 20 08:37:16 crc kubenswrapper[4903]: I0320 08:37:16.822320 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/6cc44780-3165-4335-9a85-4435d9b63ba1-node-mnt\") pod \"crc-storage-crc-ndjr5\" (UID: \"6cc44780-3165-4335-9a85-4435d9b63ba1\") " pod="crc-storage/crc-storage-crc-ndjr5" Mar 20 08:37:16 crc kubenswrapper[4903]: I0320 08:37:16.822766 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/6cc44780-3165-4335-9a85-4435d9b63ba1-node-mnt\") pod \"crc-storage-crc-ndjr5\" (UID: \"6cc44780-3165-4335-9a85-4435d9b63ba1\") " pod="crc-storage/crc-storage-crc-ndjr5" Mar 20 08:37:16 crc kubenswrapper[4903]: I0320 08:37:16.823318 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/6cc44780-3165-4335-9a85-4435d9b63ba1-crc-storage\") pod \"crc-storage-crc-ndjr5\" (UID: \"6cc44780-3165-4335-9a85-4435d9b63ba1\") " pod="crc-storage/crc-storage-crc-ndjr5" Mar 20 08:37:16 crc kubenswrapper[4903]: I0320 08:37:16.862333 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g78f7\" (UniqueName: \"kubernetes.io/projected/6cc44780-3165-4335-9a85-4435d9b63ba1-kube-api-access-g78f7\") pod \"crc-storage-crc-ndjr5\" (UID: \"6cc44780-3165-4335-9a85-4435d9b63ba1\") " pod="crc-storage/crc-storage-crc-ndjr5" Mar 20 08:37:16 crc kubenswrapper[4903]: I0320 08:37:16.895735 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-ndjr5" Mar 20 08:37:16 crc kubenswrapper[4903]: E0320 08:37:16.941475 4903 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-ndjr5_crc-storage_6cc44780-3165-4335-9a85-4435d9b63ba1_0(1bde693b5d5bb97d9334320da8cc11df75eb36fbe0dfe8b5972c2b5d130b6e4c): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 08:37:16 crc kubenswrapper[4903]: E0320 08:37:16.942089 4903 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-ndjr5_crc-storage_6cc44780-3165-4335-9a85-4435d9b63ba1_0(1bde693b5d5bb97d9334320da8cc11df75eb36fbe0dfe8b5972c2b5d130b6e4c): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="crc-storage/crc-storage-crc-ndjr5" Mar 20 08:37:16 crc kubenswrapper[4903]: E0320 08:37:16.942198 4903 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-ndjr5_crc-storage_6cc44780-3165-4335-9a85-4435d9b63ba1_0(1bde693b5d5bb97d9334320da8cc11df75eb36fbe0dfe8b5972c2b5d130b6e4c): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="crc-storage/crc-storage-crc-ndjr5" Mar 20 08:37:16 crc kubenswrapper[4903]: E0320 08:37:16.942349 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"crc-storage-crc-ndjr5_crc-storage(6cc44780-3165-4335-9a85-4435d9b63ba1)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"crc-storage-crc-ndjr5_crc-storage(6cc44780-3165-4335-9a85-4435d9b63ba1)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-ndjr5_crc-storage_6cc44780-3165-4335-9a85-4435d9b63ba1_0(1bde693b5d5bb97d9334320da8cc11df75eb36fbe0dfe8b5972c2b5d130b6e4c): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="crc-storage/crc-storage-crc-ndjr5" podUID="6cc44780-3165-4335-9a85-4435d9b63ba1" Mar 20 08:37:17 crc kubenswrapper[4903]: I0320 08:37:17.473294 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-5dkgq" Mar 20 08:37:17 crc kubenswrapper[4903]: I0320 08:37:17.473391 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-5dkgq" Mar 20 08:37:17 crc kubenswrapper[4903]: I0320 08:37:17.529366 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-5dkgq" Mar 20 08:37:17 crc kubenswrapper[4903]: I0320 08:37:17.555539 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-ljsfv" Mar 20 08:37:17 crc kubenswrapper[4903]: I0320 08:37:17.555588 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-ljsfv" Mar 20 08:37:17 crc kubenswrapper[4903]: I0320 08:37:17.583411 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-ljsfv" Mar 20 08:37:17 crc kubenswrapper[4903]: I0320 08:37:17.600719 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-5dkgq" Mar 20 08:37:17 crc kubenswrapper[4903]: I0320 08:37:17.970413 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-w5jv2"] Mar 20 08:37:17 crc kubenswrapper[4903]: I0320 08:37:17.970579 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-w5jv2" Mar 20 08:37:17 crc kubenswrapper[4903]: I0320 08:37:17.971166 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-w5jv2" Mar 20 08:37:17 crc kubenswrapper[4903]: I0320 08:37:17.989277 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-p94lk"] Mar 20 08:37:17 crc kubenswrapper[4903]: I0320 08:37:17.989447 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-p94lk" Mar 20 08:37:17 crc kubenswrapper[4903]: I0320 08:37:17.989975 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-p94lk" Mar 20 08:37:17 crc kubenswrapper[4903]: I0320 08:37:17.998528 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-ndjr5"] Mar 20 08:37:17 crc kubenswrapper[4903]: I0320 08:37:17.998643 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-ndjr5" Mar 20 08:37:18 crc kubenswrapper[4903]: E0320 08:37:18.009221 4903 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_community-operators-w5jv2_openshift-marketplace_34abeb10-0f26-4b84-8c5d-1867ab464452_0(521b39af41834593333af46f944934a30621a0df91aeae31e418f1d07ed48826): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 08:37:18 crc kubenswrapper[4903]: E0320 08:37:18.009299 4903 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_community-operators-w5jv2_openshift-marketplace_34abeb10-0f26-4b84-8c5d-1867ab464452_0(521b39af41834593333af46f944934a30621a0df91aeae31e418f1d07ed48826): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-marketplace/community-operators-w5jv2" Mar 20 08:37:18 crc kubenswrapper[4903]: E0320 08:37:18.009327 4903 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_community-operators-w5jv2_openshift-marketplace_34abeb10-0f26-4b84-8c5d-1867ab464452_0(521b39af41834593333af46f944934a30621a0df91aeae31e418f1d07ed48826): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-marketplace/community-operators-w5jv2" Mar 20 08:37:18 crc kubenswrapper[4903]: E0320 08:37:18.009382 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"community-operators-w5jv2_openshift-marketplace(34abeb10-0f26-4b84-8c5d-1867ab464452)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"community-operators-w5jv2_openshift-marketplace(34abeb10-0f26-4b84-8c5d-1867ab464452)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_community-operators-w5jv2_openshift-marketplace_34abeb10-0f26-4b84-8c5d-1867ab464452_0(521b39af41834593333af46f944934a30621a0df91aeae31e418f1d07ed48826): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-marketplace/community-operators-w5jv2" podUID="34abeb10-0f26-4b84-8c5d-1867ab464452" Mar 20 08:37:18 crc kubenswrapper[4903]: I0320 08:37:18.010672 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-ndjr5" Mar 20 08:37:18 crc kubenswrapper[4903]: E0320 08:37:18.018303 4903 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_redhat-operators-p94lk_openshift-marketplace_6fe8ee60-f750-4170-932f-7dbf96f643e8_0(321403feb327288a32d299e52b1fc9ffc8552e97a0b989ce497592431eb2becc): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 20 08:37:18 crc kubenswrapper[4903]: E0320 08:37:18.018368 4903 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_redhat-operators-p94lk_openshift-marketplace_6fe8ee60-f750-4170-932f-7dbf96f643e8_0(321403feb327288a32d299e52b1fc9ffc8552e97a0b989ce497592431eb2becc): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-marketplace/redhat-operators-p94lk" Mar 20 08:37:18 crc kubenswrapper[4903]: E0320 08:37:18.018395 4903 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_redhat-operators-p94lk_openshift-marketplace_6fe8ee60-f750-4170-932f-7dbf96f643e8_0(321403feb327288a32d299e52b1fc9ffc8552e97a0b989ce497592431eb2becc): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-marketplace/redhat-operators-p94lk" Mar 20 08:37:18 crc kubenswrapper[4903]: E0320 08:37:18.018453 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"redhat-operators-p94lk_openshift-marketplace(6fe8ee60-f750-4170-932f-7dbf96f643e8)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"redhat-operators-p94lk_openshift-marketplace(6fe8ee60-f750-4170-932f-7dbf96f643e8)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_redhat-operators-p94lk_openshift-marketplace_6fe8ee60-f750-4170-932f-7dbf96f643e8_0(321403feb327288a32d299e52b1fc9ffc8552e97a0b989ce497592431eb2becc): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-marketplace/redhat-operators-p94lk" podUID="6fe8ee60-f750-4170-932f-7dbf96f643e8" Mar 20 08:37:18 crc kubenswrapper[4903]: E0320 08:37:18.097891 4903 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-ndjr5_crc-storage_6cc44780-3165-4335-9a85-4435d9b63ba1_0(ad086a427901e28984d19b70009528353a0a1557210f9ba14270d8a8776529c6): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 08:37:18 crc kubenswrapper[4903]: E0320 08:37:18.098343 4903 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-ndjr5_crc-storage_6cc44780-3165-4335-9a85-4435d9b63ba1_0(ad086a427901e28984d19b70009528353a0a1557210f9ba14270d8a8776529c6): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="crc-storage/crc-storage-crc-ndjr5" Mar 20 08:37:18 crc kubenswrapper[4903]: E0320 08:37:18.098373 4903 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-ndjr5_crc-storage_6cc44780-3165-4335-9a85-4435d9b63ba1_0(ad086a427901e28984d19b70009528353a0a1557210f9ba14270d8a8776529c6): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="crc-storage/crc-storage-crc-ndjr5" Mar 20 08:37:18 crc kubenswrapper[4903]: E0320 08:37:18.098426 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"crc-storage-crc-ndjr5_crc-storage(6cc44780-3165-4335-9a85-4435d9b63ba1)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"crc-storage-crc-ndjr5_crc-storage(6cc44780-3165-4335-9a85-4435d9b63ba1)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-ndjr5_crc-storage_6cc44780-3165-4335-9a85-4435d9b63ba1_0(ad086a427901e28984d19b70009528353a0a1557210f9ba14270d8a8776529c6): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="crc-storage/crc-storage-crc-ndjr5" podUID="6cc44780-3165-4335-9a85-4435d9b63ba1" Mar 20 08:37:19 crc kubenswrapper[4903]: I0320 08:37:19.631022 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5dkgq"] Mar 20 08:37:19 crc kubenswrapper[4903]: I0320 08:37:19.631376 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-5dkgq" podUID="d878da59-76f0-4401-a6f4-50d6448bff24" containerName="registry-server" containerID="cri-o://489db84a2197c9f18e8fb1d66747f479f5da926a7eddcaba5ede15f872e50a95" gracePeriod=2 Mar 20 08:37:19 crc kubenswrapper[4903]: I0320 08:37:19.844076 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5dkgq" Mar 20 08:37:19 crc kubenswrapper[4903]: I0320 08:37:19.967201 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d878da59-76f0-4401-a6f4-50d6448bff24-utilities\") pod \"d878da59-76f0-4401-a6f4-50d6448bff24\" (UID: \"d878da59-76f0-4401-a6f4-50d6448bff24\") " Mar 20 08:37:19 crc kubenswrapper[4903]: I0320 08:37:19.967434 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wmgqh\" (UniqueName: \"kubernetes.io/projected/d878da59-76f0-4401-a6f4-50d6448bff24-kube-api-access-wmgqh\") pod \"d878da59-76f0-4401-a6f4-50d6448bff24\" (UID: \"d878da59-76f0-4401-a6f4-50d6448bff24\") " Mar 20 08:37:19 crc kubenswrapper[4903]: I0320 08:37:19.967492 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d878da59-76f0-4401-a6f4-50d6448bff24-catalog-content\") pod \"d878da59-76f0-4401-a6f4-50d6448bff24\" (UID: \"d878da59-76f0-4401-a6f4-50d6448bff24\") " Mar 20 08:37:19 crc kubenswrapper[4903]: I0320 08:37:19.968829 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d878da59-76f0-4401-a6f4-50d6448bff24-utilities" (OuterVolumeSpecName: "utilities") pod "d878da59-76f0-4401-a6f4-50d6448bff24" (UID: "d878da59-76f0-4401-a6f4-50d6448bff24"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 08:37:19 crc kubenswrapper[4903]: I0320 08:37:19.977494 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d878da59-76f0-4401-a6f4-50d6448bff24-kube-api-access-wmgqh" (OuterVolumeSpecName: "kube-api-access-wmgqh") pod "d878da59-76f0-4401-a6f4-50d6448bff24" (UID: "d878da59-76f0-4401-a6f4-50d6448bff24"). InnerVolumeSpecName "kube-api-access-wmgqh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:37:20 crc kubenswrapper[4903]: I0320 08:37:20.004540 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d878da59-76f0-4401-a6f4-50d6448bff24-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d878da59-76f0-4401-a6f4-50d6448bff24" (UID: "d878da59-76f0-4401-a6f4-50d6448bff24"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 08:37:20 crc kubenswrapper[4903]: I0320 08:37:20.068829 4903 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d878da59-76f0-4401-a6f4-50d6448bff24-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 08:37:20 crc kubenswrapper[4903]: I0320 08:37:20.068867 4903 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d878da59-76f0-4401-a6f4-50d6448bff24-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 08:37:20 crc kubenswrapper[4903]: I0320 08:37:20.068881 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wmgqh\" (UniqueName: \"kubernetes.io/projected/d878da59-76f0-4401-a6f4-50d6448bff24-kube-api-access-wmgqh\") on node \"crc\" DevicePath \"\"" Mar 20 08:37:20 crc kubenswrapper[4903]: I0320 08:37:20.582909 4903 generic.go:334] "Generic (PLEG): container finished" podID="d878da59-76f0-4401-a6f4-50d6448bff24" containerID="489db84a2197c9f18e8fb1d66747f479f5da926a7eddcaba5ede15f872e50a95" exitCode=0 Mar 20 08:37:20 crc kubenswrapper[4903]: I0320 08:37:20.582987 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5dkgq" event={"ID":"d878da59-76f0-4401-a6f4-50d6448bff24","Type":"ContainerDied","Data":"489db84a2197c9f18e8fb1d66747f479f5da926a7eddcaba5ede15f872e50a95"} Mar 20 08:37:20 crc kubenswrapper[4903]: I0320 08:37:20.583305 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5dkgq" Mar 20 08:37:20 crc kubenswrapper[4903]: I0320 08:37:20.583026 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5dkgq" event={"ID":"d878da59-76f0-4401-a6f4-50d6448bff24","Type":"ContainerDied","Data":"11842dfc31e7d337ab9e9626da45ca61b258c125f22d9d84d023958a6b9a6f5b"} Mar 20 08:37:20 crc kubenswrapper[4903]: I0320 08:37:20.583488 4903 scope.go:117] "RemoveContainer" containerID="489db84a2197c9f18e8fb1d66747f479f5da926a7eddcaba5ede15f872e50a95" Mar 20 08:37:20 crc kubenswrapper[4903]: I0320 08:37:20.617631 4903 scope.go:117] "RemoveContainer" containerID="4f9bfa6e345b28186acd92b50361ec579547e4cdbabd670cad54bdb087cccd95" Mar 20 08:37:20 crc kubenswrapper[4903]: I0320 08:37:20.646021 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5dkgq"] Mar 20 08:37:20 crc kubenswrapper[4903]: I0320 08:37:20.649393 4903 scope.go:117] "RemoveContainer" containerID="1530cc22c00a95bbbec3a8a8406509740cea1743d34236fd7138d980304f56e0" Mar 20 08:37:20 crc kubenswrapper[4903]: I0320 08:37:20.654460 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-5dkgq"] Mar 20 08:37:20 crc kubenswrapper[4903]: I0320 08:37:20.672451 4903 scope.go:117] "RemoveContainer" containerID="489db84a2197c9f18e8fb1d66747f479f5da926a7eddcaba5ede15f872e50a95" Mar 20 08:37:20 crc kubenswrapper[4903]: E0320 08:37:20.672975 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"489db84a2197c9f18e8fb1d66747f479f5da926a7eddcaba5ede15f872e50a95\": container with ID starting with 489db84a2197c9f18e8fb1d66747f479f5da926a7eddcaba5ede15f872e50a95 not found: ID does not exist" containerID="489db84a2197c9f18e8fb1d66747f479f5da926a7eddcaba5ede15f872e50a95" Mar 20 08:37:20 crc kubenswrapper[4903]: I0320 08:37:20.673186 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"489db84a2197c9f18e8fb1d66747f479f5da926a7eddcaba5ede15f872e50a95"} err="failed to get container status \"489db84a2197c9f18e8fb1d66747f479f5da926a7eddcaba5ede15f872e50a95\": rpc error: code = NotFound desc = could not find container \"489db84a2197c9f18e8fb1d66747f479f5da926a7eddcaba5ede15f872e50a95\": container with ID starting with 489db84a2197c9f18e8fb1d66747f479f5da926a7eddcaba5ede15f872e50a95 not found: ID does not exist" Mar 20 08:37:20 crc kubenswrapper[4903]: I0320 08:37:20.673320 4903 scope.go:117] "RemoveContainer" containerID="4f9bfa6e345b28186acd92b50361ec579547e4cdbabd670cad54bdb087cccd95" Mar 20 08:37:20 crc kubenswrapper[4903]: E0320 08:37:20.673742 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4f9bfa6e345b28186acd92b50361ec579547e4cdbabd670cad54bdb087cccd95\": container with ID starting with 4f9bfa6e345b28186acd92b50361ec579547e4cdbabd670cad54bdb087cccd95 not found: ID does not exist" containerID="4f9bfa6e345b28186acd92b50361ec579547e4cdbabd670cad54bdb087cccd95" Mar 20 08:37:20 crc kubenswrapper[4903]: I0320 08:37:20.673780 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f9bfa6e345b28186acd92b50361ec579547e4cdbabd670cad54bdb087cccd95"} err="failed to get container status \"4f9bfa6e345b28186acd92b50361ec579547e4cdbabd670cad54bdb087cccd95\": rpc error: code = NotFound desc = could not find 
container \"4f9bfa6e345b28186acd92b50361ec579547e4cdbabd670cad54bdb087cccd95\": container with ID starting with 4f9bfa6e345b28186acd92b50361ec579547e4cdbabd670cad54bdb087cccd95 not found: ID does not exist" Mar 20 08:37:20 crc kubenswrapper[4903]: I0320 08:37:20.673806 4903 scope.go:117] "RemoveContainer" containerID="1530cc22c00a95bbbec3a8a8406509740cea1743d34236fd7138d980304f56e0" Mar 20 08:37:20 crc kubenswrapper[4903]: E0320 08:37:20.674081 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1530cc22c00a95bbbec3a8a8406509740cea1743d34236fd7138d980304f56e0\": container with ID starting with 1530cc22c00a95bbbec3a8a8406509740cea1743d34236fd7138d980304f56e0 not found: ID does not exist" containerID="1530cc22c00a95bbbec3a8a8406509740cea1743d34236fd7138d980304f56e0" Mar 20 08:37:20 crc kubenswrapper[4903]: I0320 08:37:20.674209 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1530cc22c00a95bbbec3a8a8406509740cea1743d34236fd7138d980304f56e0"} err="failed to get container status \"1530cc22c00a95bbbec3a8a8406509740cea1743d34236fd7138d980304f56e0\": rpc error: code = NotFound desc = could not find container \"1530cc22c00a95bbbec3a8a8406509740cea1743d34236fd7138d980304f56e0\": container with ID starting with 1530cc22c00a95bbbec3a8a8406509740cea1743d34236fd7138d980304f56e0 not found: ID does not exist" Mar 20 08:37:21 crc kubenswrapper[4903]: I0320 08:37:21.502483 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d878da59-76f0-4401-a6f4-50d6448bff24" path="/var/lib/kubelet/pods/d878da59-76f0-4401-a6f4-50d6448bff24/volumes" Mar 20 08:37:28 crc kubenswrapper[4903]: I0320 08:37:28.490569 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-p94lk" Mar 20 08:37:28 crc kubenswrapper[4903]: I0320 08:37:28.492257 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-p94lk" Mar 20 08:37:29 crc kubenswrapper[4903]: I0320 08:37:29.044297 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-p94lk"] Mar 20 08:37:29 crc kubenswrapper[4903]: I0320 08:37:29.490463 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-w5jv2" Mar 20 08:37:29 crc kubenswrapper[4903]: I0320 08:37:29.490789 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-ndjr5" Mar 20 08:37:29 crc kubenswrapper[4903]: I0320 08:37:29.491704 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-w5jv2" Mar 20 08:37:29 crc kubenswrapper[4903]: I0320 08:37:29.491774 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-ndjr5" Mar 20 08:37:29 crc kubenswrapper[4903]: I0320 08:37:29.666899 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p94lk" event={"ID":"6fe8ee60-f750-4170-932f-7dbf96f643e8","Type":"ContainerDied","Data":"7a5b5a5a80b2c5c88cb2af00851ef70deaccf6a756945ad4b7bc1418f1320819"} Mar 20 08:37:29 crc kubenswrapper[4903]: I0320 08:37:29.666734 4903 generic.go:334] "Generic (PLEG): container finished" podID="6fe8ee60-f750-4170-932f-7dbf96f643e8" containerID="7a5b5a5a80b2c5c88cb2af00851ef70deaccf6a756945ad4b7bc1418f1320819" exitCode=0 Mar 20 08:37:29 crc kubenswrapper[4903]: I0320 08:37:29.667178 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p94lk" event={"ID":"6fe8ee60-f750-4170-932f-7dbf96f643e8","Type":"ContainerStarted","Data":"cc4d3d136c3363edf7303931f0151222772aa5f2e814366b29ee536d3b84015a"} Mar 20 08:37:29 crc kubenswrapper[4903]: I0320 08:37:29.810381 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-w5jv2"] Mar 20 08:37:29 crc kubenswrapper[4903]: W0320 08:37:29.832897 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod34abeb10_0f26_4b84_8c5d_1867ab464452.slice/crio-6a46564e5fb7307db9408c45e5799f8b11c1c051c442401fad7bdf13f9c9856d WatchSource:0}: Error finding container 6a46564e5fb7307db9408c45e5799f8b11c1c051c442401fad7bdf13f9c9856d: Status 404 returned error can't find the container with id 6a46564e5fb7307db9408c45e5799f8b11c1c051c442401fad7bdf13f9c9856d Mar 20 08:37:29 crc kubenswrapper[4903]: I0320 08:37:29.846264 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-ndjr5"] Mar 20 08:37:29 crc kubenswrapper[4903]: W0320 08:37:29.856098 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6cc44780_3165_4335_9a85_4435d9b63ba1.slice/crio-b219df6c099925f660096e4a478a250240f10db8d58bb8435e445b94ee8e9343 WatchSource:0}: Error finding container b219df6c099925f660096e4a478a250240f10db8d58bb8435e445b94ee8e9343: Status 404 returned error can't find the container with id b219df6c099925f660096e4a478a250240f10db8d58bb8435e445b94ee8e9343 Mar 20 08:37:30 crc kubenswrapper[4903]: I0320 08:37:30.677924 4903 generic.go:334] "Generic (PLEG): container finished" podID="34abeb10-0f26-4b84-8c5d-1867ab464452" containerID="44e9c1825aa56d6573ead0de1f536ac3b261384812cfe7c88981a5f70d6cc644" exitCode=0 Mar 20 08:37:30 crc kubenswrapper[4903]: I0320 08:37:30.677993 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-w5jv2" event={"ID":"34abeb10-0f26-4b84-8c5d-1867ab464452","Type":"ContainerDied","Data":"44e9c1825aa56d6573ead0de1f536ac3b261384812cfe7c88981a5f70d6cc644"} Mar 20 08:37:30 crc kubenswrapper[4903]: I0320 08:37:30.679564 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-w5jv2" event={"ID":"34abeb10-0f26-4b84-8c5d-1867ab464452","Type":"ContainerStarted","Data":"6a46564e5fb7307db9408c45e5799f8b11c1c051c442401fad7bdf13f9c9856d"} Mar 20 08:37:30 crc kubenswrapper[4903]: I0320 08:37:30.684024 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-ndjr5" 
event={"ID":"6cc44780-3165-4335-9a85-4435d9b63ba1","Type":"ContainerStarted","Data":"b219df6c099925f660096e4a478a250240f10db8d58bb8435e445b94ee8e9343"} Mar 20 08:37:31 crc kubenswrapper[4903]: I0320 08:37:31.695960 4903 generic.go:334] "Generic (PLEG): container finished" podID="6cc44780-3165-4335-9a85-4435d9b63ba1" containerID="aa4274d0ae60291353fd62fefe5e5ec5c249f462114ca60bce326d73a4f0255e" exitCode=0 Mar 20 08:37:31 crc kubenswrapper[4903]: I0320 08:37:31.696207 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-ndjr5" event={"ID":"6cc44780-3165-4335-9a85-4435d9b63ba1","Type":"ContainerDied","Data":"aa4274d0ae60291353fd62fefe5e5ec5c249f462114ca60bce326d73a4f0255e"} Mar 20 08:37:31 crc kubenswrapper[4903]: I0320 08:37:31.699301 4903 generic.go:334] "Generic (PLEG): container finished" podID="6fe8ee60-f750-4170-932f-7dbf96f643e8" containerID="7d2e9925a2171c113bc0b0a8f8404c6fea89abe5843eabd4fc5fbbcae44e3540" exitCode=0 Mar 20 08:37:31 crc kubenswrapper[4903]: I0320 08:37:31.699391 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p94lk" event={"ID":"6fe8ee60-f750-4170-932f-7dbf96f643e8","Type":"ContainerDied","Data":"7d2e9925a2171c113bc0b0a8f8404c6fea89abe5843eabd4fc5fbbcae44e3540"} Mar 20 08:37:31 crc kubenswrapper[4903]: I0320 08:37:31.704979 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-w5jv2" event={"ID":"34abeb10-0f26-4b84-8c5d-1867ab464452","Type":"ContainerStarted","Data":"19f55c90addf693064884e324f012005f5aadc90baeaea9000af684b1485670d"} Mar 20 08:37:32 crc kubenswrapper[4903]: I0320 08:37:32.714312 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p94lk" event={"ID":"6fe8ee60-f750-4170-932f-7dbf96f643e8","Type":"ContainerStarted","Data":"8b3eb078de88fc5c3908aef53f22b32f359ee75402c5c7b56f4582f855ac9724"} Mar 20 08:37:32 crc kubenswrapper[4903]: I0320 08:37:32.717383 4903 generic.go:334] "Generic (PLEG): container finished" podID="34abeb10-0f26-4b84-8c5d-1867ab464452" containerID="19f55c90addf693064884e324f012005f5aadc90baeaea9000af684b1485670d" exitCode=0 Mar 20 08:37:32 crc kubenswrapper[4903]: I0320 08:37:32.717493 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-w5jv2" event={"ID":"34abeb10-0f26-4b84-8c5d-1867ab464452","Type":"ContainerDied","Data":"19f55c90addf693064884e324f012005f5aadc90baeaea9000af684b1485670d"} Mar 20 08:37:32 crc kubenswrapper[4903]: I0320 08:37:32.738999 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-p94lk" podStartSLOduration=18.265167049 podStartE2EDuration="20.738976726s" podCreationTimestamp="2026-03-20 08:37:12 +0000 UTC" firstStartedPulling="2026-03-20 08:37:29.673326632 +0000 UTC m=+874.890226967" lastFinishedPulling="2026-03-20 08:37:32.147136289 +0000 UTC m=+877.364036644" observedRunningTime="2026-03-20 08:37:32.738462263 +0000 UTC m=+877.955362578" watchObservedRunningTime="2026-03-20 08:37:32.738976726 +0000 UTC m=+877.955877041" Mar 20 08:37:32 crc kubenswrapper[4903]: I0320 08:37:32.819022 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-p94lk" Mar 20 08:37:32 crc kubenswrapper[4903]: I0320 08:37:32.819108 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-p94lk" Mar 20 08:37:33 crc kubenswrapper[4903]: 
I0320 08:37:33.004018 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-ndjr5" Mar 20 08:37:33 crc kubenswrapper[4903]: I0320 08:37:33.095132 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/6cc44780-3165-4335-9a85-4435d9b63ba1-crc-storage\") pod \"6cc44780-3165-4335-9a85-4435d9b63ba1\" (UID: \"6cc44780-3165-4335-9a85-4435d9b63ba1\") " Mar 20 08:37:33 crc kubenswrapper[4903]: I0320 08:37:33.095231 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g78f7\" (UniqueName: \"kubernetes.io/projected/6cc44780-3165-4335-9a85-4435d9b63ba1-kube-api-access-g78f7\") pod \"6cc44780-3165-4335-9a85-4435d9b63ba1\" (UID: \"6cc44780-3165-4335-9a85-4435d9b63ba1\") " Mar 20 08:37:33 crc kubenswrapper[4903]: I0320 08:37:33.095294 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/6cc44780-3165-4335-9a85-4435d9b63ba1-node-mnt\") pod \"6cc44780-3165-4335-9a85-4435d9b63ba1\" (UID: \"6cc44780-3165-4335-9a85-4435d9b63ba1\") " Mar 20 08:37:33 crc kubenswrapper[4903]: I0320 08:37:33.095493 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6cc44780-3165-4335-9a85-4435d9b63ba1-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "6cc44780-3165-4335-9a85-4435d9b63ba1" (UID: "6cc44780-3165-4335-9a85-4435d9b63ba1"). InnerVolumeSpecName "node-mnt". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 08:37:33 crc kubenswrapper[4903]: I0320 08:37:33.096022 4903 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/6cc44780-3165-4335-9a85-4435d9b63ba1-node-mnt\") on node \"crc\" DevicePath \"\"" Mar 20 08:37:33 crc kubenswrapper[4903]: I0320 08:37:33.103549 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6cc44780-3165-4335-9a85-4435d9b63ba1-kube-api-access-g78f7" (OuterVolumeSpecName: "kube-api-access-g78f7") pod "6cc44780-3165-4335-9a85-4435d9b63ba1" (UID: "6cc44780-3165-4335-9a85-4435d9b63ba1"). InnerVolumeSpecName "kube-api-access-g78f7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:37:33 crc kubenswrapper[4903]: I0320 08:37:33.125456 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6cc44780-3165-4335-9a85-4435d9b63ba1-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "6cc44780-3165-4335-9a85-4435d9b63ba1" (UID: "6cc44780-3165-4335-9a85-4435d9b63ba1"). InnerVolumeSpecName "crc-storage". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:37:33 crc kubenswrapper[4903]: I0320 08:37:33.198081 4903 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/6cc44780-3165-4335-9a85-4435d9b63ba1-crc-storage\") on node \"crc\" DevicePath \"\"" Mar 20 08:37:33 crc kubenswrapper[4903]: I0320 08:37:33.198153 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g78f7\" (UniqueName: \"kubernetes.io/projected/6cc44780-3165-4335-9a85-4435d9b63ba1-kube-api-access-g78f7\") on node \"crc\" DevicePath \"\"" Mar 20 08:37:33 crc kubenswrapper[4903]: I0320 08:37:33.732220 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-w5jv2" event={"ID":"34abeb10-0f26-4b84-8c5d-1867ab464452","Type":"ContainerStarted","Data":"8b908cbd4612dee9f3e34d412f4fa7fbc412ef192d6533baf78f769c28f94668"} Mar 20 08:37:33 crc kubenswrapper[4903]: I0320 08:37:33.734701 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-ndjr5" Mar 20 08:37:33 crc kubenswrapper[4903]: I0320 08:37:33.734771 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-ndjr5" event={"ID":"6cc44780-3165-4335-9a85-4435d9b63ba1","Type":"ContainerDied","Data":"b219df6c099925f660096e4a478a250240f10db8d58bb8435e445b94ee8e9343"} Mar 20 08:37:33 crc kubenswrapper[4903]: I0320 08:37:33.734806 4903 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b219df6c099925f660096e4a478a250240f10db8d58bb8435e445b94ee8e9343" Mar 20 08:37:33 crc kubenswrapper[4903]: I0320 08:37:33.759006 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-w5jv2" podStartSLOduration=17.217744952 podStartE2EDuration="19.758980901s" podCreationTimestamp="2026-03-20 08:37:14 +0000 UTC" firstStartedPulling="2026-03-20 08:37:30.681489747 +0000 UTC m=+875.898390062" lastFinishedPulling="2026-03-20 08:37:33.222725696 +0000 UTC m=+878.439626011" observedRunningTime="2026-03-20 08:37:33.755867763 +0000 UTC m=+878.972768088" watchObservedRunningTime="2026-03-20 08:37:33.758980901 +0000 UTC m=+878.975881226" Mar 20 08:37:33 crc kubenswrapper[4903]: I0320 08:37:33.894334 4903 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-p94lk" podUID="6fe8ee60-f750-4170-932f-7dbf96f643e8" containerName="registry-server" probeResult="failure" output=< Mar 20 08:37:33 crc kubenswrapper[4903]: timeout: failed to connect service ":50051" within 1s Mar 20 08:37:33 crc kubenswrapper[4903]: > Mar 20 08:37:34 crc kubenswrapper[4903]: I0320 08:37:34.963897 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-w5jv2" Mar 20 08:37:34 crc kubenswrapper[4903]: I0320 08:37:34.964353 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-w5jv2" Mar 20 08:37:36 crc kubenswrapper[4903]: I0320 08:37:36.036766 4903 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-w5jv2" podUID="34abeb10-0f26-4b84-8c5d-1867ab464452" containerName="registry-server" probeResult="failure" output=< Mar 20 08:37:36 crc kubenswrapper[4903]: timeout: failed to connect service ":50051" within 1s Mar 20 08:37:36 crc kubenswrapper[4903]: > Mar 20 08:37:39 crc kubenswrapper[4903]: I0320 08:37:39.067287 4903 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-ljsfv" Mar 20 08:37:41 crc kubenswrapper[4903]: I0320 08:37:41.625374 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874pv47v"] Mar 20 08:37:41 crc kubenswrapper[4903]: E0320 08:37:41.627328 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d878da59-76f0-4401-a6f4-50d6448bff24" containerName="extract-content" Mar 20 08:37:41 crc kubenswrapper[4903]: I0320 08:37:41.627532 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="d878da59-76f0-4401-a6f4-50d6448bff24" containerName="extract-content" Mar 20 08:37:41 crc kubenswrapper[4903]: E0320 08:37:41.627707 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d878da59-76f0-4401-a6f4-50d6448bff24" containerName="registry-server" Mar 20 08:37:41 crc kubenswrapper[4903]: I0320 08:37:41.629414 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="d878da59-76f0-4401-a6f4-50d6448bff24" containerName="registry-server" Mar 20 08:37:41 crc kubenswrapper[4903]: E0320 08:37:41.629604 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d878da59-76f0-4401-a6f4-50d6448bff24" containerName="extract-utilities" Mar 20 08:37:41 crc kubenswrapper[4903]: I0320 08:37:41.629732 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="d878da59-76f0-4401-a6f4-50d6448bff24" containerName="extract-utilities" Mar 20 08:37:41 crc kubenswrapper[4903]: E0320 08:37:41.629864 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6cc44780-3165-4335-9a85-4435d9b63ba1" containerName="storage" Mar 20 08:37:41 crc kubenswrapper[4903]: I0320 08:37:41.629966 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="6cc44780-3165-4335-9a85-4435d9b63ba1" containerName="storage" Mar 20 08:37:41 crc kubenswrapper[4903]: I0320 08:37:41.630293 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="6cc44780-3165-4335-9a85-4435d9b63ba1" containerName="storage" Mar 20 08:37:41 crc kubenswrapper[4903]: I0320 08:37:41.630442 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="d878da59-76f0-4401-a6f4-50d6448bff24" containerName="registry-server" Mar 20 08:37:41 crc kubenswrapper[4903]: I0320 08:37:41.632160 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874pv47v" Mar 20 08:37:41 crc kubenswrapper[4903]: I0320 08:37:41.635643 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 20 08:37:41 crc kubenswrapper[4903]: I0320 08:37:41.644714 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874pv47v"] Mar 20 08:37:41 crc kubenswrapper[4903]: I0320 08:37:41.826160 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b982e00f-fa44-456d-8551-79654a44bfce-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874pv47v\" (UID: \"b982e00f-fa44-456d-8551-79654a44bfce\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874pv47v" Mar 20 08:37:41 crc kubenswrapper[4903]: I0320 08:37:41.826314 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7jbn5\" (UniqueName: \"kubernetes.io/projected/b982e00f-fa44-456d-8551-79654a44bfce-kube-api-access-7jbn5\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874pv47v\" (UID: \"b982e00f-fa44-456d-8551-79654a44bfce\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874pv47v" Mar 20 08:37:41 crc kubenswrapper[4903]: I0320 08:37:41.826371 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b982e00f-fa44-456d-8551-79654a44bfce-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874pv47v\" (UID: \"b982e00f-fa44-456d-8551-79654a44bfce\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874pv47v" Mar 20 08:37:41 crc kubenswrapper[4903]: I0320 08:37:41.927915 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7jbn5\" (UniqueName: \"kubernetes.io/projected/b982e00f-fa44-456d-8551-79654a44bfce-kube-api-access-7jbn5\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874pv47v\" (UID: \"b982e00f-fa44-456d-8551-79654a44bfce\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874pv47v" Mar 20 08:37:41 crc kubenswrapper[4903]: I0320 08:37:41.928012 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b982e00f-fa44-456d-8551-79654a44bfce-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874pv47v\" (UID: \"b982e00f-fa44-456d-8551-79654a44bfce\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874pv47v" Mar 20 08:37:41 crc kubenswrapper[4903]: I0320 08:37:41.928277 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b982e00f-fa44-456d-8551-79654a44bfce-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874pv47v\" (UID: \"b982e00f-fa44-456d-8551-79654a44bfce\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874pv47v" Mar 20 08:37:41 crc kubenswrapper[4903]: I0320 08:37:41.929295 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/b982e00f-fa44-456d-8551-79654a44bfce-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874pv47v\" (UID: \"b982e00f-fa44-456d-8551-79654a44bfce\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874pv47v" Mar 20 08:37:41 crc kubenswrapper[4903]: I0320 08:37:41.929286 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b982e00f-fa44-456d-8551-79654a44bfce-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874pv47v\" (UID: \"b982e00f-fa44-456d-8551-79654a44bfce\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874pv47v" Mar 20 08:37:41 crc kubenswrapper[4903]: I0320 08:37:41.963239 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7jbn5\" (UniqueName: \"kubernetes.io/projected/b982e00f-fa44-456d-8551-79654a44bfce-kube-api-access-7jbn5\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874pv47v\" (UID: \"b982e00f-fa44-456d-8551-79654a44bfce\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874pv47v" Mar 20 08:37:42 crc kubenswrapper[4903]: I0320 08:37:42.258411 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874pv47v" Mar 20 08:37:42 crc kubenswrapper[4903]: I0320 08:37:42.559913 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874pv47v"] Mar 20 08:37:42 crc kubenswrapper[4903]: I0320 08:37:42.801619 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874pv47v" event={"ID":"b982e00f-fa44-456d-8551-79654a44bfce","Type":"ContainerStarted","Data":"8485e7227a744b754550cc261fb2f0b886c742dd70ebf51129bcafc8c4b41d9a"} Mar 20 08:37:42 crc kubenswrapper[4903]: I0320 08:37:42.801693 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874pv47v" event={"ID":"b982e00f-fa44-456d-8551-79654a44bfce","Type":"ContainerStarted","Data":"14f1217c1d9a2d29e966dd7b62a0967266ce01211267ef1e6aeafd72e5c10490"} Mar 20 08:37:42 crc kubenswrapper[4903]: I0320 08:37:42.865993 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-p94lk" Mar 20 08:37:42 crc kubenswrapper[4903]: I0320 08:37:42.915067 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-p94lk" Mar 20 08:37:43 crc kubenswrapper[4903]: I0320 08:37:43.634763 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-p94lk"] Mar 20 08:37:43 crc kubenswrapper[4903]: I0320 08:37:43.811452 4903 generic.go:334] "Generic (PLEG): container finished" podID="b982e00f-fa44-456d-8551-79654a44bfce" containerID="8485e7227a744b754550cc261fb2f0b886c742dd70ebf51129bcafc8c4b41d9a" exitCode=0 Mar 20 08:37:43 crc kubenswrapper[4903]: I0320 08:37:43.811523 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874pv47v" event={"ID":"b982e00f-fa44-456d-8551-79654a44bfce","Type":"ContainerDied","Data":"8485e7227a744b754550cc261fb2f0b886c742dd70ebf51129bcafc8c4b41d9a"} Mar 20 08:37:44 crc 
kubenswrapper[4903]: I0320 08:37:44.819762 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-p94lk" podUID="6fe8ee60-f750-4170-932f-7dbf96f643e8" containerName="registry-server" containerID="cri-o://8b3eb078de88fc5c3908aef53f22b32f359ee75402c5c7b56f4582f855ac9724" gracePeriod=2 Mar 20 08:37:45 crc kubenswrapper[4903]: I0320 08:37:45.036872 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-w5jv2" Mar 20 08:37:45 crc kubenswrapper[4903]: I0320 08:37:45.103556 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-w5jv2" Mar 20 08:37:45 crc kubenswrapper[4903]: I0320 08:37:45.328235 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-p94lk" Mar 20 08:37:45 crc kubenswrapper[4903]: I0320 08:37:45.483459 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6fe8ee60-f750-4170-932f-7dbf96f643e8-utilities\") pod \"6fe8ee60-f750-4170-932f-7dbf96f643e8\" (UID: \"6fe8ee60-f750-4170-932f-7dbf96f643e8\") " Mar 20 08:37:45 crc kubenswrapper[4903]: I0320 08:37:45.483970 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6fe8ee60-f750-4170-932f-7dbf96f643e8-catalog-content\") pod \"6fe8ee60-f750-4170-932f-7dbf96f643e8\" (UID: \"6fe8ee60-f750-4170-932f-7dbf96f643e8\") " Mar 20 08:37:45 crc kubenswrapper[4903]: I0320 08:37:45.484245 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xrgr\" (UniqueName: \"kubernetes.io/projected/6fe8ee60-f750-4170-932f-7dbf96f643e8-kube-api-access-9xrgr\") pod \"6fe8ee60-f750-4170-932f-7dbf96f643e8\" (UID: \"6fe8ee60-f750-4170-932f-7dbf96f643e8\") " Mar 20 08:37:45 crc kubenswrapper[4903]: I0320 08:37:45.484776 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6fe8ee60-f750-4170-932f-7dbf96f643e8-utilities" (OuterVolumeSpecName: "utilities") pod "6fe8ee60-f750-4170-932f-7dbf96f643e8" (UID: "6fe8ee60-f750-4170-932f-7dbf96f643e8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 08:37:45 crc kubenswrapper[4903]: I0320 08:37:45.493391 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6fe8ee60-f750-4170-932f-7dbf96f643e8-kube-api-access-9xrgr" (OuterVolumeSpecName: "kube-api-access-9xrgr") pod "6fe8ee60-f750-4170-932f-7dbf96f643e8" (UID: "6fe8ee60-f750-4170-932f-7dbf96f643e8"). InnerVolumeSpecName "kube-api-access-9xrgr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:37:45 crc kubenswrapper[4903]: I0320 08:37:45.586340 4903 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6fe8ee60-f750-4170-932f-7dbf96f643e8-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 08:37:45 crc kubenswrapper[4903]: I0320 08:37:45.586394 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xrgr\" (UniqueName: \"kubernetes.io/projected/6fe8ee60-f750-4170-932f-7dbf96f643e8-kube-api-access-9xrgr\") on node \"crc\" DevicePath \"\"" Mar 20 08:37:45 crc kubenswrapper[4903]: I0320 08:37:45.652193 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6fe8ee60-f750-4170-932f-7dbf96f643e8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6fe8ee60-f750-4170-932f-7dbf96f643e8" (UID: "6fe8ee60-f750-4170-932f-7dbf96f643e8"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 08:37:45 crc kubenswrapper[4903]: I0320 08:37:45.688362 4903 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6fe8ee60-f750-4170-932f-7dbf96f643e8-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 08:37:45 crc kubenswrapper[4903]: I0320 08:37:45.836780 4903 generic.go:334] "Generic (PLEG): container finished" podID="6fe8ee60-f750-4170-932f-7dbf96f643e8" containerID="8b3eb078de88fc5c3908aef53f22b32f359ee75402c5c7b56f4582f855ac9724" exitCode=0 Mar 20 08:37:45 crc kubenswrapper[4903]: I0320 08:37:45.836960 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-p94lk" Mar 20 08:37:45 crc kubenswrapper[4903]: I0320 08:37:45.837083 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p94lk" event={"ID":"6fe8ee60-f750-4170-932f-7dbf96f643e8","Type":"ContainerDied","Data":"8b3eb078de88fc5c3908aef53f22b32f359ee75402c5c7b56f4582f855ac9724"} Mar 20 08:37:45 crc kubenswrapper[4903]: I0320 08:37:45.837152 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p94lk" event={"ID":"6fe8ee60-f750-4170-932f-7dbf96f643e8","Type":"ContainerDied","Data":"cc4d3d136c3363edf7303931f0151222772aa5f2e814366b29ee536d3b84015a"} Mar 20 08:37:45 crc kubenswrapper[4903]: I0320 08:37:45.837201 4903 scope.go:117] "RemoveContainer" containerID="8b3eb078de88fc5c3908aef53f22b32f359ee75402c5c7b56f4582f855ac9724" Mar 20 08:37:45 crc kubenswrapper[4903]: I0320 08:37:45.845385 4903 generic.go:334] "Generic (PLEG): container finished" podID="b982e00f-fa44-456d-8551-79654a44bfce" containerID="8be0926cd1aa86b49c40757f1845b65ff38267bf289fcfdf61b50abf27d3ecc5" exitCode=0 Mar 20 08:37:45 crc kubenswrapper[4903]: I0320 08:37:45.846173 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874pv47v" event={"ID":"b982e00f-fa44-456d-8551-79654a44bfce","Type":"ContainerDied","Data":"8be0926cd1aa86b49c40757f1845b65ff38267bf289fcfdf61b50abf27d3ecc5"} Mar 20 08:37:45 crc kubenswrapper[4903]: I0320 08:37:45.867614 4903 scope.go:117] "RemoveContainer" containerID="7d2e9925a2171c113bc0b0a8f8404c6fea89abe5843eabd4fc5fbbcae44e3540" Mar 20 08:37:45 crc kubenswrapper[4903]: I0320 08:37:45.910479 4903 scope.go:117] "RemoveContainer" containerID="7a5b5a5a80b2c5c88cb2af00851ef70deaccf6a756945ad4b7bc1418f1320819" Mar 
20 08:37:45 crc kubenswrapper[4903]: I0320 08:37:45.917395 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-p94lk"] Mar 20 08:37:45 crc kubenswrapper[4903]: I0320 08:37:45.925136 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-p94lk"] Mar 20 08:37:45 crc kubenswrapper[4903]: I0320 08:37:45.930709 4903 scope.go:117] "RemoveContainer" containerID="8b3eb078de88fc5c3908aef53f22b32f359ee75402c5c7b56f4582f855ac9724" Mar 20 08:37:45 crc kubenswrapper[4903]: E0320 08:37:45.931252 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8b3eb078de88fc5c3908aef53f22b32f359ee75402c5c7b56f4582f855ac9724\": container with ID starting with 8b3eb078de88fc5c3908aef53f22b32f359ee75402c5c7b56f4582f855ac9724 not found: ID does not exist" containerID="8b3eb078de88fc5c3908aef53f22b32f359ee75402c5c7b56f4582f855ac9724" Mar 20 08:37:45 crc kubenswrapper[4903]: I0320 08:37:45.931293 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b3eb078de88fc5c3908aef53f22b32f359ee75402c5c7b56f4582f855ac9724"} err="failed to get container status \"8b3eb078de88fc5c3908aef53f22b32f359ee75402c5c7b56f4582f855ac9724\": rpc error: code = NotFound desc = could not find container \"8b3eb078de88fc5c3908aef53f22b32f359ee75402c5c7b56f4582f855ac9724\": container with ID starting with 8b3eb078de88fc5c3908aef53f22b32f359ee75402c5c7b56f4582f855ac9724 not found: ID does not exist" Mar 20 08:37:45 crc kubenswrapper[4903]: I0320 08:37:45.931323 4903 scope.go:117] "RemoveContainer" containerID="7d2e9925a2171c113bc0b0a8f8404c6fea89abe5843eabd4fc5fbbcae44e3540" Mar 20 08:37:45 crc kubenswrapper[4903]: E0320 08:37:45.931760 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7d2e9925a2171c113bc0b0a8f8404c6fea89abe5843eabd4fc5fbbcae44e3540\": container with ID starting with 7d2e9925a2171c113bc0b0a8f8404c6fea89abe5843eabd4fc5fbbcae44e3540 not found: ID does not exist" containerID="7d2e9925a2171c113bc0b0a8f8404c6fea89abe5843eabd4fc5fbbcae44e3540" Mar 20 08:37:45 crc kubenswrapper[4903]: I0320 08:37:45.931886 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d2e9925a2171c113bc0b0a8f8404c6fea89abe5843eabd4fc5fbbcae44e3540"} err="failed to get container status \"7d2e9925a2171c113bc0b0a8f8404c6fea89abe5843eabd4fc5fbbcae44e3540\": rpc error: code = NotFound desc = could not find container \"7d2e9925a2171c113bc0b0a8f8404c6fea89abe5843eabd4fc5fbbcae44e3540\": container with ID starting with 7d2e9925a2171c113bc0b0a8f8404c6fea89abe5843eabd4fc5fbbcae44e3540 not found: ID does not exist" Mar 20 08:37:45 crc kubenswrapper[4903]: I0320 08:37:45.931979 4903 scope.go:117] "RemoveContainer" containerID="7a5b5a5a80b2c5c88cb2af00851ef70deaccf6a756945ad4b7bc1418f1320819" Mar 20 08:37:45 crc kubenswrapper[4903]: E0320 08:37:45.932416 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7a5b5a5a80b2c5c88cb2af00851ef70deaccf6a756945ad4b7bc1418f1320819\": container with ID starting with 7a5b5a5a80b2c5c88cb2af00851ef70deaccf6a756945ad4b7bc1418f1320819 not found: ID does not exist" containerID="7a5b5a5a80b2c5c88cb2af00851ef70deaccf6a756945ad4b7bc1418f1320819" Mar 20 08:37:45 crc kubenswrapper[4903]: I0320 08:37:45.932451 4903 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7a5b5a5a80b2c5c88cb2af00851ef70deaccf6a756945ad4b7bc1418f1320819"} err="failed to get container status \"7a5b5a5a80b2c5c88cb2af00851ef70deaccf6a756945ad4b7bc1418f1320819\": rpc error: code = NotFound desc = could not find container \"7a5b5a5a80b2c5c88cb2af00851ef70deaccf6a756945ad4b7bc1418f1320819\": container with ID starting with 7a5b5a5a80b2c5c88cb2af00851ef70deaccf6a756945ad4b7bc1418f1320819 not found: ID does not exist" Mar 20 08:37:46 crc kubenswrapper[4903]: I0320 08:37:46.858591 4903 generic.go:334] "Generic (PLEG): container finished" podID="b982e00f-fa44-456d-8551-79654a44bfce" containerID="fdd467257516736a3db02971863e747b3a69216f87a2e35c59e7ff5b1d0b5f44" exitCode=0 Mar 20 08:37:46 crc kubenswrapper[4903]: I0320 08:37:46.858650 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874pv47v" event={"ID":"b982e00f-fa44-456d-8551-79654a44bfce","Type":"ContainerDied","Data":"fdd467257516736a3db02971863e747b3a69216f87a2e35c59e7ff5b1d0b5f44"} Mar 20 08:37:47 crc kubenswrapper[4903]: I0320 08:37:47.499677 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6fe8ee60-f750-4170-932f-7dbf96f643e8" path="/var/lib/kubelet/pods/6fe8ee60-f750-4170-932f-7dbf96f643e8/volumes" Mar 20 08:37:48 crc kubenswrapper[4903]: I0320 08:37:48.039704 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-w5jv2"] Mar 20 08:37:48 crc kubenswrapper[4903]: I0320 08:37:48.040437 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-w5jv2" podUID="34abeb10-0f26-4b84-8c5d-1867ab464452" containerName="registry-server" containerID="cri-o://8b908cbd4612dee9f3e34d412f4fa7fbc412ef192d6533baf78f769c28f94668" gracePeriod=2 Mar 20 08:37:48 crc kubenswrapper[4903]: I0320 08:37:48.210026 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874pv47v" Mar 20 08:37:48 crc kubenswrapper[4903]: I0320 08:37:48.333840 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b982e00f-fa44-456d-8551-79654a44bfce-bundle\") pod \"b982e00f-fa44-456d-8551-79654a44bfce\" (UID: \"b982e00f-fa44-456d-8551-79654a44bfce\") " Mar 20 08:37:48 crc kubenswrapper[4903]: I0320 08:37:48.333957 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b982e00f-fa44-456d-8551-79654a44bfce-util\") pod \"b982e00f-fa44-456d-8551-79654a44bfce\" (UID: \"b982e00f-fa44-456d-8551-79654a44bfce\") " Mar 20 08:37:48 crc kubenswrapper[4903]: I0320 08:37:48.333980 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7jbn5\" (UniqueName: \"kubernetes.io/projected/b982e00f-fa44-456d-8551-79654a44bfce-kube-api-access-7jbn5\") pod \"b982e00f-fa44-456d-8551-79654a44bfce\" (UID: \"b982e00f-fa44-456d-8551-79654a44bfce\") " Mar 20 08:37:48 crc kubenswrapper[4903]: I0320 08:37:48.334989 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b982e00f-fa44-456d-8551-79654a44bfce-bundle" (OuterVolumeSpecName: "bundle") pod "b982e00f-fa44-456d-8551-79654a44bfce" (UID: "b982e00f-fa44-456d-8551-79654a44bfce"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 08:37:48 crc kubenswrapper[4903]: I0320 08:37:48.352765 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b982e00f-fa44-456d-8551-79654a44bfce-kube-api-access-7jbn5" (OuterVolumeSpecName: "kube-api-access-7jbn5") pod "b982e00f-fa44-456d-8551-79654a44bfce" (UID: "b982e00f-fa44-456d-8551-79654a44bfce"). InnerVolumeSpecName "kube-api-access-7jbn5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:37:48 crc kubenswrapper[4903]: I0320 08:37:48.389526 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-w5jv2" Mar 20 08:37:48 crc kubenswrapper[4903]: I0320 08:37:48.435924 4903 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b982e00f-fa44-456d-8551-79654a44bfce-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 08:37:48 crc kubenswrapper[4903]: I0320 08:37:48.436008 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7jbn5\" (UniqueName: \"kubernetes.io/projected/b982e00f-fa44-456d-8551-79654a44bfce-kube-api-access-7jbn5\") on node \"crc\" DevicePath \"\"" Mar 20 08:37:48 crc kubenswrapper[4903]: I0320 08:37:48.536933 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/34abeb10-0f26-4b84-8c5d-1867ab464452-utilities\") pod \"34abeb10-0f26-4b84-8c5d-1867ab464452\" (UID: \"34abeb10-0f26-4b84-8c5d-1867ab464452\") " Mar 20 08:37:48 crc kubenswrapper[4903]: I0320 08:37:48.537059 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zdth4\" (UniqueName: \"kubernetes.io/projected/34abeb10-0f26-4b84-8c5d-1867ab464452-kube-api-access-zdth4\") pod \"34abeb10-0f26-4b84-8c5d-1867ab464452\" (UID: \"34abeb10-0f26-4b84-8c5d-1867ab464452\") " Mar 20 08:37:48 crc kubenswrapper[4903]: I0320 08:37:48.537170 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/34abeb10-0f26-4b84-8c5d-1867ab464452-catalog-content\") pod \"34abeb10-0f26-4b84-8c5d-1867ab464452\" (UID: \"34abeb10-0f26-4b84-8c5d-1867ab464452\") " Mar 20 08:37:48 crc kubenswrapper[4903]: I0320 08:37:48.538773 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/34abeb10-0f26-4b84-8c5d-1867ab464452-utilities" (OuterVolumeSpecName: "utilities") pod "34abeb10-0f26-4b84-8c5d-1867ab464452" (UID: "34abeb10-0f26-4b84-8c5d-1867ab464452"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 08:37:48 crc kubenswrapper[4903]: I0320 08:37:48.544924 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34abeb10-0f26-4b84-8c5d-1867ab464452-kube-api-access-zdth4" (OuterVolumeSpecName: "kube-api-access-zdth4") pod "34abeb10-0f26-4b84-8c5d-1867ab464452" (UID: "34abeb10-0f26-4b84-8c5d-1867ab464452"). InnerVolumeSpecName "kube-api-access-zdth4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:37:48 crc kubenswrapper[4903]: I0320 08:37:48.626467 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b982e00f-fa44-456d-8551-79654a44bfce-util" (OuterVolumeSpecName: "util") pod "b982e00f-fa44-456d-8551-79654a44bfce" (UID: "b982e00f-fa44-456d-8551-79654a44bfce"). 
InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 08:37:48 crc kubenswrapper[4903]: I0320 08:37:48.635155 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/34abeb10-0f26-4b84-8c5d-1867ab464452-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "34abeb10-0f26-4b84-8c5d-1867ab464452" (UID: "34abeb10-0f26-4b84-8c5d-1867ab464452"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 08:37:48 crc kubenswrapper[4903]: I0320 08:37:48.639021 4903 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b982e00f-fa44-456d-8551-79654a44bfce-util\") on node \"crc\" DevicePath \"\"" Mar 20 08:37:48 crc kubenswrapper[4903]: I0320 08:37:48.639172 4903 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/34abeb10-0f26-4b84-8c5d-1867ab464452-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 08:37:48 crc kubenswrapper[4903]: I0320 08:37:48.639275 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zdth4\" (UniqueName: \"kubernetes.io/projected/34abeb10-0f26-4b84-8c5d-1867ab464452-kube-api-access-zdth4\") on node \"crc\" DevicePath \"\"" Mar 20 08:37:48 crc kubenswrapper[4903]: I0320 08:37:48.639363 4903 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/34abeb10-0f26-4b84-8c5d-1867ab464452-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 08:37:48 crc kubenswrapper[4903]: I0320 08:37:48.876196 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874pv47v" Mar 20 08:37:48 crc kubenswrapper[4903]: I0320 08:37:48.876195 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874pv47v" event={"ID":"b982e00f-fa44-456d-8551-79654a44bfce","Type":"ContainerDied","Data":"14f1217c1d9a2d29e966dd7b62a0967266ce01211267ef1e6aeafd72e5c10490"} Mar 20 08:37:48 crc kubenswrapper[4903]: I0320 08:37:48.876415 4903 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="14f1217c1d9a2d29e966dd7b62a0967266ce01211267ef1e6aeafd72e5c10490" Mar 20 08:37:48 crc kubenswrapper[4903]: I0320 08:37:48.879793 4903 generic.go:334] "Generic (PLEG): container finished" podID="34abeb10-0f26-4b84-8c5d-1867ab464452" containerID="8b908cbd4612dee9f3e34d412f4fa7fbc412ef192d6533baf78f769c28f94668" exitCode=0 Mar 20 08:37:48 crc kubenswrapper[4903]: I0320 08:37:48.879863 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-w5jv2" event={"ID":"34abeb10-0f26-4b84-8c5d-1867ab464452","Type":"ContainerDied","Data":"8b908cbd4612dee9f3e34d412f4fa7fbc412ef192d6533baf78f769c28f94668"} Mar 20 08:37:48 crc kubenswrapper[4903]: I0320 08:37:48.879900 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-w5jv2" event={"ID":"34abeb10-0f26-4b84-8c5d-1867ab464452","Type":"ContainerDied","Data":"6a46564e5fb7307db9408c45e5799f8b11c1c051c442401fad7bdf13f9c9856d"} Mar 20 08:37:48 crc kubenswrapper[4903]: I0320 08:37:48.879932 4903 scope.go:117] "RemoveContainer" containerID="8b908cbd4612dee9f3e34d412f4fa7fbc412ef192d6533baf78f769c28f94668" Mar 20 08:37:48 crc kubenswrapper[4903]: I0320 08:37:48.880172 4903 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-w5jv2" Mar 20 08:37:48 crc kubenswrapper[4903]: I0320 08:37:48.905016 4903 scope.go:117] "RemoveContainer" containerID="19f55c90addf693064884e324f012005f5aadc90baeaea9000af684b1485670d" Mar 20 08:37:48 crc kubenswrapper[4903]: I0320 08:37:48.928937 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-w5jv2"] Mar 20 08:37:48 crc kubenswrapper[4903]: I0320 08:37:48.936065 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-w5jv2"] Mar 20 08:37:48 crc kubenswrapper[4903]: I0320 08:37:48.951178 4903 scope.go:117] "RemoveContainer" containerID="44e9c1825aa56d6573ead0de1f536ac3b261384812cfe7c88981a5f70d6cc644" Mar 20 08:37:48 crc kubenswrapper[4903]: I0320 08:37:48.966368 4903 scope.go:117] "RemoveContainer" containerID="8b908cbd4612dee9f3e34d412f4fa7fbc412ef192d6533baf78f769c28f94668" Mar 20 08:37:48 crc kubenswrapper[4903]: E0320 08:37:48.966925 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8b908cbd4612dee9f3e34d412f4fa7fbc412ef192d6533baf78f769c28f94668\": container with ID starting with 8b908cbd4612dee9f3e34d412f4fa7fbc412ef192d6533baf78f769c28f94668 not found: ID does not exist" containerID="8b908cbd4612dee9f3e34d412f4fa7fbc412ef192d6533baf78f769c28f94668" Mar 20 08:37:48 crc kubenswrapper[4903]: I0320 08:37:48.966973 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b908cbd4612dee9f3e34d412f4fa7fbc412ef192d6533baf78f769c28f94668"} err="failed to get container status \"8b908cbd4612dee9f3e34d412f4fa7fbc412ef192d6533baf78f769c28f94668\": rpc error: code = NotFound desc = could not find container \"8b908cbd4612dee9f3e34d412f4fa7fbc412ef192d6533baf78f769c28f94668\": container with ID starting with 8b908cbd4612dee9f3e34d412f4fa7fbc412ef192d6533baf78f769c28f94668 not found: ID does not exist" Mar 20 08:37:48 crc kubenswrapper[4903]: I0320 08:37:48.967004 4903 scope.go:117] "RemoveContainer" containerID="19f55c90addf693064884e324f012005f5aadc90baeaea9000af684b1485670d" Mar 20 08:37:48 crc kubenswrapper[4903]: E0320 08:37:48.967657 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"19f55c90addf693064884e324f012005f5aadc90baeaea9000af684b1485670d\": container with ID starting with 19f55c90addf693064884e324f012005f5aadc90baeaea9000af684b1485670d not found: ID does not exist" containerID="19f55c90addf693064884e324f012005f5aadc90baeaea9000af684b1485670d" Mar 20 08:37:48 crc kubenswrapper[4903]: I0320 08:37:48.967712 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"19f55c90addf693064884e324f012005f5aadc90baeaea9000af684b1485670d"} err="failed to get container status \"19f55c90addf693064884e324f012005f5aadc90baeaea9000af684b1485670d\": rpc error: code = NotFound desc = could not find container \"19f55c90addf693064884e324f012005f5aadc90baeaea9000af684b1485670d\": container with ID starting with 19f55c90addf693064884e324f012005f5aadc90baeaea9000af684b1485670d not found: ID does not exist" Mar 20 08:37:48 crc kubenswrapper[4903]: I0320 08:37:48.967749 4903 scope.go:117] "RemoveContainer" containerID="44e9c1825aa56d6573ead0de1f536ac3b261384812cfe7c88981a5f70d6cc644" Mar 20 08:37:48 crc kubenswrapper[4903]: E0320 08:37:48.968199 
4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"44e9c1825aa56d6573ead0de1f536ac3b261384812cfe7c88981a5f70d6cc644\": container with ID starting with 44e9c1825aa56d6573ead0de1f536ac3b261384812cfe7c88981a5f70d6cc644 not found: ID does not exist" containerID="44e9c1825aa56d6573ead0de1f536ac3b261384812cfe7c88981a5f70d6cc644" Mar 20 08:37:48 crc kubenswrapper[4903]: I0320 08:37:48.968240 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"44e9c1825aa56d6573ead0de1f536ac3b261384812cfe7c88981a5f70d6cc644"} err="failed to get container status \"44e9c1825aa56d6573ead0de1f536ac3b261384812cfe7c88981a5f70d6cc644\": rpc error: code = NotFound desc = could not find container \"44e9c1825aa56d6573ead0de1f536ac3b261384812cfe7c88981a5f70d6cc644\": container with ID starting with 44e9c1825aa56d6573ead0de1f536ac3b261384812cfe7c88981a5f70d6cc644 not found: ID does not exist" Mar 20 08:37:49 crc kubenswrapper[4903]: I0320 08:37:49.500676 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="34abeb10-0f26-4b84-8c5d-1867ab464452" path="/var/lib/kubelet/pods/34abeb10-0f26-4b84-8c5d-1867ab464452/volumes" Mar 20 08:37:50 crc kubenswrapper[4903]: I0320 08:37:50.384208 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-h2c5r"] Mar 20 08:37:50 crc kubenswrapper[4903]: E0320 08:37:50.384502 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34abeb10-0f26-4b84-8c5d-1867ab464452" containerName="extract-utilities" Mar 20 08:37:50 crc kubenswrapper[4903]: I0320 08:37:50.384521 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="34abeb10-0f26-4b84-8c5d-1867ab464452" containerName="extract-utilities" Mar 20 08:37:50 crc kubenswrapper[4903]: E0320 08:37:50.384537 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b982e00f-fa44-456d-8551-79654a44bfce" containerName="extract" Mar 20 08:37:50 crc kubenswrapper[4903]: I0320 08:37:50.384546 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="b982e00f-fa44-456d-8551-79654a44bfce" containerName="extract" Mar 20 08:37:50 crc kubenswrapper[4903]: E0320 08:37:50.384559 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b982e00f-fa44-456d-8551-79654a44bfce" containerName="pull" Mar 20 08:37:50 crc kubenswrapper[4903]: I0320 08:37:50.384567 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="b982e00f-fa44-456d-8551-79654a44bfce" containerName="pull" Mar 20 08:37:50 crc kubenswrapper[4903]: E0320 08:37:50.384581 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6fe8ee60-f750-4170-932f-7dbf96f643e8" containerName="extract-content" Mar 20 08:37:50 crc kubenswrapper[4903]: I0320 08:37:50.384589 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fe8ee60-f750-4170-932f-7dbf96f643e8" containerName="extract-content" Mar 20 08:37:50 crc kubenswrapper[4903]: E0320 08:37:50.384603 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6fe8ee60-f750-4170-932f-7dbf96f643e8" containerName="registry-server" Mar 20 08:37:50 crc kubenswrapper[4903]: I0320 08:37:50.384611 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fe8ee60-f750-4170-932f-7dbf96f643e8" containerName="registry-server" Mar 20 08:37:50 crc kubenswrapper[4903]: E0320 08:37:50.384624 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b982e00f-fa44-456d-8551-79654a44bfce" 
containerName="util" Mar 20 08:37:50 crc kubenswrapper[4903]: I0320 08:37:50.384632 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="b982e00f-fa44-456d-8551-79654a44bfce" containerName="util" Mar 20 08:37:50 crc kubenswrapper[4903]: E0320 08:37:50.384644 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34abeb10-0f26-4b84-8c5d-1867ab464452" containerName="extract-content" Mar 20 08:37:50 crc kubenswrapper[4903]: I0320 08:37:50.384652 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="34abeb10-0f26-4b84-8c5d-1867ab464452" containerName="extract-content" Mar 20 08:37:50 crc kubenswrapper[4903]: E0320 08:37:50.384665 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6fe8ee60-f750-4170-932f-7dbf96f643e8" containerName="extract-utilities" Mar 20 08:37:50 crc kubenswrapper[4903]: I0320 08:37:50.384674 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fe8ee60-f750-4170-932f-7dbf96f643e8" containerName="extract-utilities" Mar 20 08:37:50 crc kubenswrapper[4903]: E0320 08:37:50.384692 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34abeb10-0f26-4b84-8c5d-1867ab464452" containerName="registry-server" Mar 20 08:37:50 crc kubenswrapper[4903]: I0320 08:37:50.384700 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="34abeb10-0f26-4b84-8c5d-1867ab464452" containerName="registry-server" Mar 20 08:37:50 crc kubenswrapper[4903]: I0320 08:37:50.384816 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="6fe8ee60-f750-4170-932f-7dbf96f643e8" containerName="registry-server" Mar 20 08:37:50 crc kubenswrapper[4903]: I0320 08:37:50.384827 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="b982e00f-fa44-456d-8551-79654a44bfce" containerName="extract" Mar 20 08:37:50 crc kubenswrapper[4903]: I0320 08:37:50.384842 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="34abeb10-0f26-4b84-8c5d-1867ab464452" containerName="registry-server" Mar 20 08:37:50 crc kubenswrapper[4903]: I0320 08:37:50.385332 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-796d4cfff4-h2c5r" Mar 20 08:37:50 crc kubenswrapper[4903]: I0320 08:37:50.387820 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Mar 20 08:37:50 crc kubenswrapper[4903]: I0320 08:37:50.387834 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-rfgbx" Mar 20 08:37:50 crc kubenswrapper[4903]: I0320 08:37:50.387910 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Mar 20 08:37:50 crc kubenswrapper[4903]: I0320 08:37:50.399416 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-h2c5r"] Mar 20 08:37:50 crc kubenswrapper[4903]: I0320 08:37:50.570822 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vp6gc\" (UniqueName: \"kubernetes.io/projected/737e090b-af32-421c-badd-b1decc1ace3c-kube-api-access-vp6gc\") pod \"nmstate-operator-796d4cfff4-h2c5r\" (UID: \"737e090b-af32-421c-badd-b1decc1ace3c\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-h2c5r" Mar 20 08:37:50 crc kubenswrapper[4903]: I0320 08:37:50.672815 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vp6gc\" (UniqueName: \"kubernetes.io/projected/737e090b-af32-421c-badd-b1decc1ace3c-kube-api-access-vp6gc\") pod \"nmstate-operator-796d4cfff4-h2c5r\" (UID: \"737e090b-af32-421c-badd-b1decc1ace3c\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-h2c5r" Mar 20 08:37:50 crc kubenswrapper[4903]: I0320 08:37:50.708455 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vp6gc\" (UniqueName: \"kubernetes.io/projected/737e090b-af32-421c-badd-b1decc1ace3c-kube-api-access-vp6gc\") pod \"nmstate-operator-796d4cfff4-h2c5r\" (UID: \"737e090b-af32-421c-badd-b1decc1ace3c\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-h2c5r" Mar 20 08:37:50 crc kubenswrapper[4903]: I0320 08:37:50.834129 4903 patch_prober.go:28] interesting pod/machine-config-daemon-2ndsj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 08:37:50 crc kubenswrapper[4903]: I0320 08:37:50.834318 4903 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2ndsj" podUID="0e67af70-4211-4077-8f2b-0a00b8069e5a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 08:37:51 crc kubenswrapper[4903]: I0320 08:37:51.007457 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-796d4cfff4-h2c5r" Mar 20 08:37:51 crc kubenswrapper[4903]: I0320 08:37:51.271608 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-h2c5r"] Mar 20 08:37:51 crc kubenswrapper[4903]: I0320 08:37:51.905043 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-796d4cfff4-h2c5r" event={"ID":"737e090b-af32-421c-badd-b1decc1ace3c","Type":"ContainerStarted","Data":"b4ba5c6f907b09f828d922958a8a867e5a4306d22bace49b5cb1181f92e041ae"} Mar 20 08:37:53 crc kubenswrapper[4903]: I0320 08:37:53.934261 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-796d4cfff4-h2c5r" event={"ID":"737e090b-af32-421c-badd-b1decc1ace3c","Type":"ContainerStarted","Data":"87a19191aeac136df33d24471852d2de3421783c02783507f74761834d9b187e"} Mar 20 08:37:53 crc kubenswrapper[4903]: I0320 08:37:53.955158 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-796d4cfff4-h2c5r" podStartSLOduration=1.854032738 podStartE2EDuration="3.955133268s" podCreationTimestamp="2026-03-20 08:37:50 +0000 UTC" firstStartedPulling="2026-03-20 08:37:51.27287723 +0000 UTC m=+896.489777545" lastFinishedPulling="2026-03-20 08:37:53.37397776 +0000 UTC m=+898.590878075" observedRunningTime="2026-03-20 08:37:53.951922077 +0000 UTC m=+899.168822402" watchObservedRunningTime="2026-03-20 08:37:53.955133268 +0000 UTC m=+899.172033593" Mar 20 08:38:00 crc kubenswrapper[4903]: I0320 08:38:00.147574 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566598-fjtx9"] Mar 20 08:38:00 crc kubenswrapper[4903]: I0320 08:38:00.149685 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566598-fjtx9" Mar 20 08:38:00 crc kubenswrapper[4903]: I0320 08:38:00.152479 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 08:38:00 crc kubenswrapper[4903]: I0320 08:38:00.152514 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-smc2q" Mar 20 08:38:00 crc kubenswrapper[4903]: I0320 08:38:00.154069 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 08:38:00 crc kubenswrapper[4903]: I0320 08:38:00.156754 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566598-fjtx9"] Mar 20 08:38:00 crc kubenswrapper[4903]: I0320 08:38:00.224713 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q8ck9\" (UniqueName: \"kubernetes.io/projected/04aba9e4-217f-489a-9c45-dde90c447aa6-kube-api-access-q8ck9\") pod \"auto-csr-approver-29566598-fjtx9\" (UID: \"04aba9e4-217f-489a-9c45-dde90c447aa6\") " pod="openshift-infra/auto-csr-approver-29566598-fjtx9" Mar 20 08:38:00 crc kubenswrapper[4903]: I0320 08:38:00.326572 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q8ck9\" (UniqueName: \"kubernetes.io/projected/04aba9e4-217f-489a-9c45-dde90c447aa6-kube-api-access-q8ck9\") pod \"auto-csr-approver-29566598-fjtx9\" (UID: \"04aba9e4-217f-489a-9c45-dde90c447aa6\") " pod="openshift-infra/auto-csr-approver-29566598-fjtx9" Mar 20 08:38:00 crc kubenswrapper[4903]: I0320 08:38:00.356073 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q8ck9\" (UniqueName: \"kubernetes.io/projected/04aba9e4-217f-489a-9c45-dde90c447aa6-kube-api-access-q8ck9\") pod \"auto-csr-approver-29566598-fjtx9\" (UID: \"04aba9e4-217f-489a-9c45-dde90c447aa6\") " pod="openshift-infra/auto-csr-approver-29566598-fjtx9" Mar 20 08:38:00 crc kubenswrapper[4903]: I0320 08:38:00.486578 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566598-fjtx9" Mar 20 08:38:00 crc kubenswrapper[4903]: I0320 08:38:00.754286 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566598-fjtx9"] Mar 20 08:38:00 crc kubenswrapper[4903]: I0320 08:38:00.993099 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566598-fjtx9" event={"ID":"04aba9e4-217f-489a-9c45-dde90c447aa6","Type":"ContainerStarted","Data":"7f36470fac23a22d85a53a554b6d564864d7c00b0decb2b5138fb9a624007f9e"} Mar 20 08:38:01 crc kubenswrapper[4903]: I0320 08:38:01.717390 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-df94z"] Mar 20 08:38:01 crc kubenswrapper[4903]: I0320 08:38:01.732913 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-df94z"] Mar 20 08:38:01 crc kubenswrapper[4903]: I0320 08:38:01.733060 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-df94z" Mar 20 08:38:01 crc kubenswrapper[4903]: I0320 08:38:01.735553 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-stgj6" Mar 20 08:38:01 crc kubenswrapper[4903]: I0320 08:38:01.751097 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-mxxc9"] Mar 20 08:38:01 crc kubenswrapper[4903]: I0320 08:38:01.751846 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f558f5558-mxxc9" Mar 20 08:38:01 crc kubenswrapper[4903]: I0320 08:38:01.755418 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Mar 20 08:38:01 crc kubenswrapper[4903]: I0320 08:38:01.769543 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-nmw5s"] Mar 20 08:38:01 crc kubenswrapper[4903]: I0320 08:38:01.772458 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-nmw5s" Mar 20 08:38:01 crc kubenswrapper[4903]: I0320 08:38:01.789877 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-mxxc9"] Mar 20 08:38:01 crc kubenswrapper[4903]: I0320 08:38:01.854233 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-f8dnk"] Mar 20 08:38:01 crc kubenswrapper[4903]: I0320 08:38:01.855841 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-f8dnk" Mar 20 08:38:01 crc kubenswrapper[4903]: I0320 08:38:01.856535 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pjm6z\" (UniqueName: \"kubernetes.io/projected/20f0bdd3-2ffc-46fa-bd7c-ed3644379f08-kube-api-access-pjm6z\") pod \"nmstate-webhook-5f558f5558-mxxc9\" (UID: \"20f0bdd3-2ffc-46fa-bd7c-ed3644379f08\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-mxxc9" Mar 20 08:38:01 crc kubenswrapper[4903]: I0320 08:38:01.856598 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/20f0bdd3-2ffc-46fa-bd7c-ed3644379f08-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-mxxc9\" (UID: \"20f0bdd3-2ffc-46fa-bd7c-ed3644379f08\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-mxxc9" Mar 20 08:38:01 crc kubenswrapper[4903]: I0320 08:38:01.856636 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-stdrp\" (UniqueName: \"kubernetes.io/projected/82dbfa21-7bc0-4311-bcb5-3f746b288130-kube-api-access-stdrp\") pod \"nmstate-metrics-9b8c8685d-df94z\" (UID: \"82dbfa21-7bc0-4311-bcb5-3f746b288130\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-df94z" Mar 20 08:38:01 crc kubenswrapper[4903]: I0320 08:38:01.859287 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Mar 20 08:38:01 crc kubenswrapper[4903]: I0320 08:38:01.859351 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Mar 20 08:38:01 crc kubenswrapper[4903]: I0320 08:38:01.859438 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-zxbls" Mar 20 08:38:01 crc 
kubenswrapper[4903]: I0320 08:38:01.874607 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-f8dnk"] Mar 20 08:38:01 crc kubenswrapper[4903]: I0320 08:38:01.958281 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pjm6z\" (UniqueName: \"kubernetes.io/projected/20f0bdd3-2ffc-46fa-bd7c-ed3644379f08-kube-api-access-pjm6z\") pod \"nmstate-webhook-5f558f5558-mxxc9\" (UID: \"20f0bdd3-2ffc-46fa-bd7c-ed3644379f08\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-mxxc9" Mar 20 08:38:01 crc kubenswrapper[4903]: I0320 08:38:01.958373 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q5d5m\" (UniqueName: \"kubernetes.io/projected/75a183a7-68ce-40f4-b663-8d8845fedb36-kube-api-access-q5d5m\") pod \"nmstate-console-plugin-86f58fcf4-f8dnk\" (UID: \"75a183a7-68ce-40f4-b663-8d8845fedb36\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-f8dnk" Mar 20 08:38:01 crc kubenswrapper[4903]: I0320 08:38:01.958436 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/20f0bdd3-2ffc-46fa-bd7c-ed3644379f08-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-mxxc9\" (UID: \"20f0bdd3-2ffc-46fa-bd7c-ed3644379f08\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-mxxc9" Mar 20 08:38:01 crc kubenswrapper[4903]: I0320 08:38:01.958466 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-stdrp\" (UniqueName: \"kubernetes.io/projected/82dbfa21-7bc0-4311-bcb5-3f746b288130-kube-api-access-stdrp\") pod \"nmstate-metrics-9b8c8685d-df94z\" (UID: \"82dbfa21-7bc0-4311-bcb5-3f746b288130\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-df94z" Mar 20 08:38:01 crc kubenswrapper[4903]: I0320 08:38:01.958494 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/4eb953be-dfbf-4a30-bed9-90abab5fb73c-dbus-socket\") pod \"nmstate-handler-nmw5s\" (UID: \"4eb953be-dfbf-4a30-bed9-90abab5fb73c\") " pod="openshift-nmstate/nmstate-handler-nmw5s" Mar 20 08:38:01 crc kubenswrapper[4903]: I0320 08:38:01.958524 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/4eb953be-dfbf-4a30-bed9-90abab5fb73c-ovs-socket\") pod \"nmstate-handler-nmw5s\" (UID: \"4eb953be-dfbf-4a30-bed9-90abab5fb73c\") " pod="openshift-nmstate/nmstate-handler-nmw5s" Mar 20 08:38:01 crc kubenswrapper[4903]: I0320 08:38:01.958547 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kltbc\" (UniqueName: \"kubernetes.io/projected/4eb953be-dfbf-4a30-bed9-90abab5fb73c-kube-api-access-kltbc\") pod \"nmstate-handler-nmw5s\" (UID: \"4eb953be-dfbf-4a30-bed9-90abab5fb73c\") " pod="openshift-nmstate/nmstate-handler-nmw5s" Mar 20 08:38:01 crc kubenswrapper[4903]: I0320 08:38:01.958575 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/75a183a7-68ce-40f4-b663-8d8845fedb36-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-f8dnk\" (UID: \"75a183a7-68ce-40f4-b663-8d8845fedb36\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-f8dnk" Mar 20 08:38:01 crc kubenswrapper[4903]: I0320 08:38:01.958602 4903 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/75a183a7-68ce-40f4-b663-8d8845fedb36-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-f8dnk\" (UID: \"75a183a7-68ce-40f4-b663-8d8845fedb36\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-f8dnk" Mar 20 08:38:01 crc kubenswrapper[4903]: I0320 08:38:01.958628 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/4eb953be-dfbf-4a30-bed9-90abab5fb73c-nmstate-lock\") pod \"nmstate-handler-nmw5s\" (UID: \"4eb953be-dfbf-4a30-bed9-90abab5fb73c\") " pod="openshift-nmstate/nmstate-handler-nmw5s" Mar 20 08:38:01 crc kubenswrapper[4903]: E0320 08:38:01.958926 4903 secret.go:188] Couldn't get secret openshift-nmstate/openshift-nmstate-webhook: secret "openshift-nmstate-webhook" not found Mar 20 08:38:01 crc kubenswrapper[4903]: E0320 08:38:01.958992 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/20f0bdd3-2ffc-46fa-bd7c-ed3644379f08-tls-key-pair podName:20f0bdd3-2ffc-46fa-bd7c-ed3644379f08 nodeName:}" failed. No retries permitted until 2026-03-20 08:38:02.458970689 +0000 UTC m=+907.675871004 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-key-pair" (UniqueName: "kubernetes.io/secret/20f0bdd3-2ffc-46fa-bd7c-ed3644379f08-tls-key-pair") pod "nmstate-webhook-5f558f5558-mxxc9" (UID: "20f0bdd3-2ffc-46fa-bd7c-ed3644379f08") : secret "openshift-nmstate-webhook" not found Mar 20 08:38:01 crc kubenswrapper[4903]: I0320 08:38:01.985354 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pjm6z\" (UniqueName: \"kubernetes.io/projected/20f0bdd3-2ffc-46fa-bd7c-ed3644379f08-kube-api-access-pjm6z\") pod \"nmstate-webhook-5f558f5558-mxxc9\" (UID: \"20f0bdd3-2ffc-46fa-bd7c-ed3644379f08\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-mxxc9" Mar 20 08:38:01 crc kubenswrapper[4903]: I0320 08:38:01.986889 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-stdrp\" (UniqueName: \"kubernetes.io/projected/82dbfa21-7bc0-4311-bcb5-3f746b288130-kube-api-access-stdrp\") pod \"nmstate-metrics-9b8c8685d-df94z\" (UID: \"82dbfa21-7bc0-4311-bcb5-3f746b288130\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-df94z" Mar 20 08:38:02 crc kubenswrapper[4903]: I0320 08:38:02.002793 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566598-fjtx9" event={"ID":"04aba9e4-217f-489a-9c45-dde90c447aa6","Type":"ContainerStarted","Data":"3dc2d0cf6bffe11b6698132443c96584f1a291d336cd377ba64bf1b1c193b375"} Mar 20 08:38:02 crc kubenswrapper[4903]: I0320 08:38:02.030522 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29566598-fjtx9" podStartSLOduration=1.161107141 podStartE2EDuration="2.030501634s" podCreationTimestamp="2026-03-20 08:38:00 +0000 UTC" firstStartedPulling="2026-03-20 08:38:00.765454105 +0000 UTC m=+905.982354420" lastFinishedPulling="2026-03-20 08:38:01.634848598 +0000 UTC m=+906.851748913" observedRunningTime="2026-03-20 08:38:02.029248672 +0000 UTC m=+907.246148987" watchObservedRunningTime="2026-03-20 08:38:02.030501634 +0000 UTC m=+907.247401949" Mar 20 08:38:02 crc kubenswrapper[4903]: I0320 08:38:02.056970 4903 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-console/console-8594cd6d8-w6ps6"] Mar 20 08:38:02 crc kubenswrapper[4903]: I0320 08:38:02.057965 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-8594cd6d8-w6ps6" Mar 20 08:38:02 crc kubenswrapper[4903]: I0320 08:38:02.060800 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7m9n8\" (UniqueName: \"kubernetes.io/projected/0637def5-45fb-48f4-af77-36609b2f8d2f-kube-api-access-7m9n8\") pod \"console-8594cd6d8-w6ps6\" (UID: \"0637def5-45fb-48f4-af77-36609b2f8d2f\") " pod="openshift-console/console-8594cd6d8-w6ps6" Mar 20 08:38:02 crc kubenswrapper[4903]: I0320 08:38:02.060853 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0637def5-45fb-48f4-af77-36609b2f8d2f-service-ca\") pod \"console-8594cd6d8-w6ps6\" (UID: \"0637def5-45fb-48f4-af77-36609b2f8d2f\") " pod="openshift-console/console-8594cd6d8-w6ps6" Mar 20 08:38:02 crc kubenswrapper[4903]: I0320 08:38:02.060880 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0637def5-45fb-48f4-af77-36609b2f8d2f-console-config\") pod \"console-8594cd6d8-w6ps6\" (UID: \"0637def5-45fb-48f4-af77-36609b2f8d2f\") " pod="openshift-console/console-8594cd6d8-w6ps6" Mar 20 08:38:02 crc kubenswrapper[4903]: I0320 08:38:02.060910 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0637def5-45fb-48f4-af77-36609b2f8d2f-oauth-serving-cert\") pod \"console-8594cd6d8-w6ps6\" (UID: \"0637def5-45fb-48f4-af77-36609b2f8d2f\") " pod="openshift-console/console-8594cd6d8-w6ps6" Mar 20 08:38:02 crc kubenswrapper[4903]: I0320 08:38:02.060969 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q5d5m\" (UniqueName: \"kubernetes.io/projected/75a183a7-68ce-40f4-b663-8d8845fedb36-kube-api-access-q5d5m\") pod \"nmstate-console-plugin-86f58fcf4-f8dnk\" (UID: \"75a183a7-68ce-40f4-b663-8d8845fedb36\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-f8dnk" Mar 20 08:38:02 crc kubenswrapper[4903]: I0320 08:38:02.061007 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0637def5-45fb-48f4-af77-36609b2f8d2f-console-oauth-config\") pod \"console-8594cd6d8-w6ps6\" (UID: \"0637def5-45fb-48f4-af77-36609b2f8d2f\") " pod="openshift-console/console-8594cd6d8-w6ps6" Mar 20 08:38:02 crc kubenswrapper[4903]: I0320 08:38:02.061076 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/4eb953be-dfbf-4a30-bed9-90abab5fb73c-dbus-socket\") pod \"nmstate-handler-nmw5s\" (UID: \"4eb953be-dfbf-4a30-bed9-90abab5fb73c\") " pod="openshift-nmstate/nmstate-handler-nmw5s" Mar 20 08:38:02 crc kubenswrapper[4903]: I0320 08:38:02.061113 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/4eb953be-dfbf-4a30-bed9-90abab5fb73c-ovs-socket\") pod \"nmstate-handler-nmw5s\" (UID: \"4eb953be-dfbf-4a30-bed9-90abab5fb73c\") " pod="openshift-nmstate/nmstate-handler-nmw5s" Mar 20 08:38:02 crc kubenswrapper[4903]: I0320 08:38:02.061138 4903 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kltbc\" (UniqueName: \"kubernetes.io/projected/4eb953be-dfbf-4a30-bed9-90abab5fb73c-kube-api-access-kltbc\") pod \"nmstate-handler-nmw5s\" (UID: \"4eb953be-dfbf-4a30-bed9-90abab5fb73c\") " pod="openshift-nmstate/nmstate-handler-nmw5s" Mar 20 08:38:02 crc kubenswrapper[4903]: I0320 08:38:02.061173 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0637def5-45fb-48f4-af77-36609b2f8d2f-console-serving-cert\") pod \"console-8594cd6d8-w6ps6\" (UID: \"0637def5-45fb-48f4-af77-36609b2f8d2f\") " pod="openshift-console/console-8594cd6d8-w6ps6" Mar 20 08:38:02 crc kubenswrapper[4903]: I0320 08:38:02.061203 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/75a183a7-68ce-40f4-b663-8d8845fedb36-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-f8dnk\" (UID: \"75a183a7-68ce-40f4-b663-8d8845fedb36\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-f8dnk" Mar 20 08:38:02 crc kubenswrapper[4903]: I0320 08:38:02.061232 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/75a183a7-68ce-40f4-b663-8d8845fedb36-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-f8dnk\" (UID: \"75a183a7-68ce-40f4-b663-8d8845fedb36\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-f8dnk" Mar 20 08:38:02 crc kubenswrapper[4903]: I0320 08:38:02.061263 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/4eb953be-dfbf-4a30-bed9-90abab5fb73c-nmstate-lock\") pod \"nmstate-handler-nmw5s\" (UID: \"4eb953be-dfbf-4a30-bed9-90abab5fb73c\") " pod="openshift-nmstate/nmstate-handler-nmw5s" Mar 20 08:38:02 crc kubenswrapper[4903]: I0320 08:38:02.061286 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0637def5-45fb-48f4-af77-36609b2f8d2f-trusted-ca-bundle\") pod \"console-8594cd6d8-w6ps6\" (UID: \"0637def5-45fb-48f4-af77-36609b2f8d2f\") " pod="openshift-console/console-8594cd6d8-w6ps6" Mar 20 08:38:02 crc kubenswrapper[4903]: I0320 08:38:02.062881 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/4eb953be-dfbf-4a30-bed9-90abab5fb73c-dbus-socket\") pod \"nmstate-handler-nmw5s\" (UID: \"4eb953be-dfbf-4a30-bed9-90abab5fb73c\") " pod="openshift-nmstate/nmstate-handler-nmw5s" Mar 20 08:38:02 crc kubenswrapper[4903]: I0320 08:38:02.062929 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/4eb953be-dfbf-4a30-bed9-90abab5fb73c-ovs-socket\") pod \"nmstate-handler-nmw5s\" (UID: \"4eb953be-dfbf-4a30-bed9-90abab5fb73c\") " pod="openshift-nmstate/nmstate-handler-nmw5s" Mar 20 08:38:02 crc kubenswrapper[4903]: I0320 08:38:02.063979 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/75a183a7-68ce-40f4-b663-8d8845fedb36-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-f8dnk\" (UID: \"75a183a7-68ce-40f4-b663-8d8845fedb36\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-f8dnk" Mar 20 08:38:02 crc kubenswrapper[4903]: I0320 
08:38:02.072372 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/75a183a7-68ce-40f4-b663-8d8845fedb36-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-f8dnk\" (UID: \"75a183a7-68ce-40f4-b663-8d8845fedb36\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-f8dnk" Mar 20 08:38:02 crc kubenswrapper[4903]: I0320 08:38:02.072795 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/4eb953be-dfbf-4a30-bed9-90abab5fb73c-nmstate-lock\") pod \"nmstate-handler-nmw5s\" (UID: \"4eb953be-dfbf-4a30-bed9-90abab5fb73c\") " pod="openshift-nmstate/nmstate-handler-nmw5s" Mar 20 08:38:02 crc kubenswrapper[4903]: I0320 08:38:02.073883 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-df94z" Mar 20 08:38:02 crc kubenswrapper[4903]: I0320 08:38:02.089928 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q5d5m\" (UniqueName: \"kubernetes.io/projected/75a183a7-68ce-40f4-b663-8d8845fedb36-kube-api-access-q5d5m\") pod \"nmstate-console-plugin-86f58fcf4-f8dnk\" (UID: \"75a183a7-68ce-40f4-b663-8d8845fedb36\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-f8dnk" Mar 20 08:38:02 crc kubenswrapper[4903]: I0320 08:38:02.092232 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-8594cd6d8-w6ps6"] Mar 20 08:38:02 crc kubenswrapper[4903]: I0320 08:38:02.099295 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kltbc\" (UniqueName: \"kubernetes.io/projected/4eb953be-dfbf-4a30-bed9-90abab5fb73c-kube-api-access-kltbc\") pod \"nmstate-handler-nmw5s\" (UID: \"4eb953be-dfbf-4a30-bed9-90abab5fb73c\") " pod="openshift-nmstate/nmstate-handler-nmw5s" Mar 20 08:38:02 crc kubenswrapper[4903]: I0320 08:38:02.162907 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0637def5-45fb-48f4-af77-36609b2f8d2f-service-ca\") pod \"console-8594cd6d8-w6ps6\" (UID: \"0637def5-45fb-48f4-af77-36609b2f8d2f\") " pod="openshift-console/console-8594cd6d8-w6ps6" Mar 20 08:38:02 crc kubenswrapper[4903]: I0320 08:38:02.162946 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0637def5-45fb-48f4-af77-36609b2f8d2f-console-config\") pod \"console-8594cd6d8-w6ps6\" (UID: \"0637def5-45fb-48f4-af77-36609b2f8d2f\") " pod="openshift-console/console-8594cd6d8-w6ps6" Mar 20 08:38:02 crc kubenswrapper[4903]: I0320 08:38:02.162981 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0637def5-45fb-48f4-af77-36609b2f8d2f-oauth-serving-cert\") pod \"console-8594cd6d8-w6ps6\" (UID: \"0637def5-45fb-48f4-af77-36609b2f8d2f\") " pod="openshift-console/console-8594cd6d8-w6ps6" Mar 20 08:38:02 crc kubenswrapper[4903]: I0320 08:38:02.163027 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0637def5-45fb-48f4-af77-36609b2f8d2f-console-oauth-config\") pod \"console-8594cd6d8-w6ps6\" (UID: \"0637def5-45fb-48f4-af77-36609b2f8d2f\") " pod="openshift-console/console-8594cd6d8-w6ps6" Mar 20 08:38:02 crc kubenswrapper[4903]: I0320 08:38:02.163097 4903 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0637def5-45fb-48f4-af77-36609b2f8d2f-console-serving-cert\") pod \"console-8594cd6d8-w6ps6\" (UID: \"0637def5-45fb-48f4-af77-36609b2f8d2f\") " pod="openshift-console/console-8594cd6d8-w6ps6" Mar 20 08:38:02 crc kubenswrapper[4903]: I0320 08:38:02.163129 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0637def5-45fb-48f4-af77-36609b2f8d2f-trusted-ca-bundle\") pod \"console-8594cd6d8-w6ps6\" (UID: \"0637def5-45fb-48f4-af77-36609b2f8d2f\") " pod="openshift-console/console-8594cd6d8-w6ps6" Mar 20 08:38:02 crc kubenswrapper[4903]: I0320 08:38:02.163157 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7m9n8\" (UniqueName: \"kubernetes.io/projected/0637def5-45fb-48f4-af77-36609b2f8d2f-kube-api-access-7m9n8\") pod \"console-8594cd6d8-w6ps6\" (UID: \"0637def5-45fb-48f4-af77-36609b2f8d2f\") " pod="openshift-console/console-8594cd6d8-w6ps6" Mar 20 08:38:02 crc kubenswrapper[4903]: I0320 08:38:02.164719 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0637def5-45fb-48f4-af77-36609b2f8d2f-service-ca\") pod \"console-8594cd6d8-w6ps6\" (UID: \"0637def5-45fb-48f4-af77-36609b2f8d2f\") " pod="openshift-console/console-8594cd6d8-w6ps6" Mar 20 08:38:02 crc kubenswrapper[4903]: I0320 08:38:02.165302 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0637def5-45fb-48f4-af77-36609b2f8d2f-console-config\") pod \"console-8594cd6d8-w6ps6\" (UID: \"0637def5-45fb-48f4-af77-36609b2f8d2f\") " pod="openshift-console/console-8594cd6d8-w6ps6" Mar 20 08:38:02 crc kubenswrapper[4903]: I0320 08:38:02.165859 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0637def5-45fb-48f4-af77-36609b2f8d2f-oauth-serving-cert\") pod \"console-8594cd6d8-w6ps6\" (UID: \"0637def5-45fb-48f4-af77-36609b2f8d2f\") " pod="openshift-console/console-8594cd6d8-w6ps6" Mar 20 08:38:02 crc kubenswrapper[4903]: I0320 08:38:02.166878 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0637def5-45fb-48f4-af77-36609b2f8d2f-trusted-ca-bundle\") pod \"console-8594cd6d8-w6ps6\" (UID: \"0637def5-45fb-48f4-af77-36609b2f8d2f\") " pod="openshift-console/console-8594cd6d8-w6ps6" Mar 20 08:38:02 crc kubenswrapper[4903]: I0320 08:38:02.171987 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0637def5-45fb-48f4-af77-36609b2f8d2f-console-serving-cert\") pod \"console-8594cd6d8-w6ps6\" (UID: \"0637def5-45fb-48f4-af77-36609b2f8d2f\") " pod="openshift-console/console-8594cd6d8-w6ps6" Mar 20 08:38:02 crc kubenswrapper[4903]: I0320 08:38:02.173960 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0637def5-45fb-48f4-af77-36609b2f8d2f-console-oauth-config\") pod \"console-8594cd6d8-w6ps6\" (UID: \"0637def5-45fb-48f4-af77-36609b2f8d2f\") " pod="openshift-console/console-8594cd6d8-w6ps6" Mar 20 08:38:02 crc kubenswrapper[4903]: I0320 08:38:02.180938 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-7m9n8\" (UniqueName: \"kubernetes.io/projected/0637def5-45fb-48f4-af77-36609b2f8d2f-kube-api-access-7m9n8\") pod \"console-8594cd6d8-w6ps6\" (UID: \"0637def5-45fb-48f4-af77-36609b2f8d2f\") " pod="openshift-console/console-8594cd6d8-w6ps6" Mar 20 08:38:02 crc kubenswrapper[4903]: I0320 08:38:02.185377 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-f8dnk" Mar 20 08:38:02 crc kubenswrapper[4903]: I0320 08:38:02.344954 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-df94z"] Mar 20 08:38:02 crc kubenswrapper[4903]: I0320 08:38:02.385911 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-nmw5s" Mar 20 08:38:02 crc kubenswrapper[4903]: I0320 08:38:02.420719 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-8594cd6d8-w6ps6" Mar 20 08:38:02 crc kubenswrapper[4903]: I0320 08:38:02.422998 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-f8dnk"] Mar 20 08:38:02 crc kubenswrapper[4903]: W0320 08:38:02.427114 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod75a183a7_68ce_40f4_b663_8d8845fedb36.slice/crio-67841b19e0a260dfa5fc69a4af363c036b338f295a55e2feb5a6eaebdccf7562 WatchSource:0}: Error finding container 67841b19e0a260dfa5fc69a4af363c036b338f295a55e2feb5a6eaebdccf7562: Status 404 returned error can't find the container with id 67841b19e0a260dfa5fc69a4af363c036b338f295a55e2feb5a6eaebdccf7562 Mar 20 08:38:02 crc kubenswrapper[4903]: I0320 08:38:02.473444 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/20f0bdd3-2ffc-46fa-bd7c-ed3644379f08-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-mxxc9\" (UID: \"20f0bdd3-2ffc-46fa-bd7c-ed3644379f08\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-mxxc9" Mar 20 08:38:02 crc kubenswrapper[4903]: I0320 08:38:02.480307 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/20f0bdd3-2ffc-46fa-bd7c-ed3644379f08-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-mxxc9\" (UID: \"20f0bdd3-2ffc-46fa-bd7c-ed3644379f08\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-mxxc9" Mar 20 08:38:02 crc kubenswrapper[4903]: I0320 08:38:02.672730 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f558f5558-mxxc9" Mar 20 08:38:03 crc kubenswrapper[4903]: I0320 08:38:03.076472 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-nmw5s" event={"ID":"4eb953be-dfbf-4a30-bed9-90abab5fb73c","Type":"ContainerStarted","Data":"33c764943a2314d68995951c7de5c30e6cb1ce404b25d955eed4c9c2ac67ccaf"} Mar 20 08:38:03 crc kubenswrapper[4903]: I0320 08:38:03.131265 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566598-fjtx9" event={"ID":"04aba9e4-217f-489a-9c45-dde90c447aa6","Type":"ContainerDied","Data":"3dc2d0cf6bffe11b6698132443c96584f1a291d336cd377ba64bf1b1c193b375"} Mar 20 08:38:03 crc kubenswrapper[4903]: I0320 08:38:03.131353 4903 generic.go:334] "Generic (PLEG): container finished" podID="04aba9e4-217f-489a-9c45-dde90c447aa6" containerID="3dc2d0cf6bffe11b6698132443c96584f1a291d336cd377ba64bf1b1c193b375" exitCode=0 Mar 20 08:38:03 crc kubenswrapper[4903]: I0320 08:38:03.134246 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-df94z" event={"ID":"82dbfa21-7bc0-4311-bcb5-3f746b288130","Type":"ContainerStarted","Data":"396bec175da833a7acaa18be9f47b53fe005465b3a4badea7e2560cdbdc20770"} Mar 20 08:38:03 crc kubenswrapper[4903]: I0320 08:38:03.134871 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-8594cd6d8-w6ps6"] Mar 20 08:38:03 crc kubenswrapper[4903]: I0320 08:38:03.138895 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-f8dnk" event={"ID":"75a183a7-68ce-40f4-b663-8d8845fedb36","Type":"ContainerStarted","Data":"67841b19e0a260dfa5fc69a4af363c036b338f295a55e2feb5a6eaebdccf7562"} Mar 20 08:38:03 crc kubenswrapper[4903]: I0320 08:38:03.221241 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-mxxc9"] Mar 20 08:38:03 crc kubenswrapper[4903]: W0320 08:38:03.230916 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod20f0bdd3_2ffc_46fa_bd7c_ed3644379f08.slice/crio-bb285d954de27d4708e2eca91916127a732a51f661f5bea69a33de37d88ffec4 WatchSource:0}: Error finding container bb285d954de27d4708e2eca91916127a732a51f661f5bea69a33de37d88ffec4: Status 404 returned error can't find the container with id bb285d954de27d4708e2eca91916127a732a51f661f5bea69a33de37d88ffec4 Mar 20 08:38:04 crc kubenswrapper[4903]: I0320 08:38:04.335769 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f558f5558-mxxc9" event={"ID":"20f0bdd3-2ffc-46fa-bd7c-ed3644379f08","Type":"ContainerStarted","Data":"bb285d954de27d4708e2eca91916127a732a51f661f5bea69a33de37d88ffec4"} Mar 20 08:38:04 crc kubenswrapper[4903]: I0320 08:38:04.341807 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-8594cd6d8-w6ps6" event={"ID":"0637def5-45fb-48f4-af77-36609b2f8d2f","Type":"ContainerStarted","Data":"68dae69e9e56037d248f0131ae2930f86b2908d6e041ac0016c1d7c4b8c202e9"} Mar 20 08:38:04 crc kubenswrapper[4903]: I0320 08:38:04.341921 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-8594cd6d8-w6ps6" event={"ID":"0637def5-45fb-48f4-af77-36609b2f8d2f","Type":"ContainerStarted","Data":"ab24fea4ddf23cd666cf7f116cfd70c46a4388caac03155d9b1f24e4aef7f40b"} Mar 20 08:38:04 crc kubenswrapper[4903]: I0320 08:38:04.370286 4903 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-8594cd6d8-w6ps6" podStartSLOduration=2.370244728 podStartE2EDuration="2.370244728s" podCreationTimestamp="2026-03-20 08:38:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:38:04.362665886 +0000 UTC m=+909.579566221" watchObservedRunningTime="2026-03-20 08:38:04.370244728 +0000 UTC m=+909.587145043" Mar 20 08:38:04 crc kubenswrapper[4903]: I0320 08:38:04.580413 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566598-fjtx9" Mar 20 08:38:04 crc kubenswrapper[4903]: I0320 08:38:04.606847 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q8ck9\" (UniqueName: \"kubernetes.io/projected/04aba9e4-217f-489a-9c45-dde90c447aa6-kube-api-access-q8ck9\") pod \"04aba9e4-217f-489a-9c45-dde90c447aa6\" (UID: \"04aba9e4-217f-489a-9c45-dde90c447aa6\") " Mar 20 08:38:04 crc kubenswrapper[4903]: I0320 08:38:04.614172 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04aba9e4-217f-489a-9c45-dde90c447aa6-kube-api-access-q8ck9" (OuterVolumeSpecName: "kube-api-access-q8ck9") pod "04aba9e4-217f-489a-9c45-dde90c447aa6" (UID: "04aba9e4-217f-489a-9c45-dde90c447aa6"). InnerVolumeSpecName "kube-api-access-q8ck9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:38:04 crc kubenswrapper[4903]: I0320 08:38:04.709158 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q8ck9\" (UniqueName: \"kubernetes.io/projected/04aba9e4-217f-489a-9c45-dde90c447aa6-kube-api-access-q8ck9\") on node \"crc\" DevicePath \"\"" Mar 20 08:38:05 crc kubenswrapper[4903]: I0320 08:38:05.360245 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566598-fjtx9" event={"ID":"04aba9e4-217f-489a-9c45-dde90c447aa6","Type":"ContainerDied","Data":"7f36470fac23a22d85a53a554b6d564864d7c00b0decb2b5138fb9a624007f9e"} Mar 20 08:38:05 crc kubenswrapper[4903]: I0320 08:38:05.360310 4903 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7f36470fac23a22d85a53a554b6d564864d7c00b0decb2b5138fb9a624007f9e" Mar 20 08:38:05 crc kubenswrapper[4903]: I0320 08:38:05.360338 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566592-jvzjg"] Mar 20 08:38:05 crc kubenswrapper[4903]: I0320 08:38:05.360407 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566598-fjtx9" Mar 20 08:38:05 crc kubenswrapper[4903]: I0320 08:38:05.363989 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566592-jvzjg"] Mar 20 08:38:05 crc kubenswrapper[4903]: I0320 08:38:05.500927 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a4fb82db-1d81-491c-8fb7-8a6072b3335a" path="/var/lib/kubelet/pods/a4fb82db-1d81-491c-8fb7-8a6072b3335a/volumes" Mar 20 08:38:07 crc kubenswrapper[4903]: I0320 08:38:07.392110 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-nmw5s" event={"ID":"4eb953be-dfbf-4a30-bed9-90abab5fb73c","Type":"ContainerStarted","Data":"b15b4cd992995c3d628189cc0812058cdf23b17450871389da8d8c73dcbec4ad"} Mar 20 08:38:07 crc kubenswrapper[4903]: I0320 08:38:07.392511 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-nmw5s" Mar 20 08:38:07 crc kubenswrapper[4903]: I0320 08:38:07.394744 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f558f5558-mxxc9" event={"ID":"20f0bdd3-2ffc-46fa-bd7c-ed3644379f08","Type":"ContainerStarted","Data":"a607103a4549283d1915f87d991146eeff17698fde2baa692d0e6613112e98df"} Mar 20 08:38:07 crc kubenswrapper[4903]: I0320 08:38:07.395215 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-5f558f5558-mxxc9" Mar 20 08:38:07 crc kubenswrapper[4903]: I0320 08:38:07.396830 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-df94z" event={"ID":"82dbfa21-7bc0-4311-bcb5-3f746b288130","Type":"ContainerStarted","Data":"01f12a899afd1e5857125bec068d598de55e2387f7c5dc4e4d77ecf11ce23764"} Mar 20 08:38:07 crc kubenswrapper[4903]: I0320 08:38:07.398724 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-f8dnk" event={"ID":"75a183a7-68ce-40f4-b663-8d8845fedb36","Type":"ContainerStarted","Data":"7ec402112a6470cce72b3cb67e75f04d37effe9d933d710d5002a35553b1681c"} Mar 20 08:38:07 crc kubenswrapper[4903]: I0320 08:38:07.410910 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-nmw5s" podStartSLOduration=1.871944401 podStartE2EDuration="6.410893822s" podCreationTimestamp="2026-03-20 08:38:01 +0000 UTC" firstStartedPulling="2026-03-20 08:38:02.412761272 +0000 UTC m=+907.629661587" lastFinishedPulling="2026-03-20 08:38:06.951710673 +0000 UTC m=+912.168611008" observedRunningTime="2026-03-20 08:38:07.406941612 +0000 UTC m=+912.623841937" watchObservedRunningTime="2026-03-20 08:38:07.410893822 +0000 UTC m=+912.627794147" Mar 20 08:38:07 crc kubenswrapper[4903]: I0320 08:38:07.432732 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-f8dnk" podStartSLOduration=1.912517555 podStartE2EDuration="6.432497487s" podCreationTimestamp="2026-03-20 08:38:01 +0000 UTC" firstStartedPulling="2026-03-20 08:38:02.433896765 +0000 UTC m=+907.650797080" lastFinishedPulling="2026-03-20 08:38:06.953876687 +0000 UTC m=+912.170777012" observedRunningTime="2026-03-20 08:38:07.420946566 +0000 UTC m=+912.637846921" watchObservedRunningTime="2026-03-20 08:38:07.432497487 +0000 UTC m=+912.649397842" Mar 20 08:38:07 crc kubenswrapper[4903]: I0320 08:38:07.460547 4903 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openshift-nmstate/nmstate-webhook-5f558f5558-mxxc9" podStartSLOduration=2.729556718 podStartE2EDuration="6.460523934s" podCreationTimestamp="2026-03-20 08:38:01 +0000 UTC" firstStartedPulling="2026-03-20 08:38:03.234232026 +0000 UTC m=+908.451132341" lastFinishedPulling="2026-03-20 08:38:06.965199192 +0000 UTC m=+912.182099557" observedRunningTime="2026-03-20 08:38:07.457715093 +0000 UTC m=+912.674615468" watchObservedRunningTime="2026-03-20 08:38:07.460523934 +0000 UTC m=+912.677424249" Mar 20 08:38:10 crc kubenswrapper[4903]: I0320 08:38:10.426155 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-df94z" event={"ID":"82dbfa21-7bc0-4311-bcb5-3f746b288130","Type":"ContainerStarted","Data":"1d4f97699f1e941de5a278f32c4509387ca7e2ffec387f1b86ee387041768a85"} Mar 20 08:38:10 crc kubenswrapper[4903]: I0320 08:38:10.448157 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-df94z" podStartSLOduration=2.275729713 podStartE2EDuration="9.448131s" podCreationTimestamp="2026-03-20 08:38:01 +0000 UTC" firstStartedPulling="2026-03-20 08:38:02.372886295 +0000 UTC m=+907.589786610" lastFinishedPulling="2026-03-20 08:38:09.545287582 +0000 UTC m=+914.762187897" observedRunningTime="2026-03-20 08:38:10.447946105 +0000 UTC m=+915.664846440" watchObservedRunningTime="2026-03-20 08:38:10.448131 +0000 UTC m=+915.665031335" Mar 20 08:38:12 crc kubenswrapper[4903]: I0320 08:38:12.422219 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-8594cd6d8-w6ps6" Mar 20 08:38:12 crc kubenswrapper[4903]: I0320 08:38:12.422659 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-8594cd6d8-w6ps6" Mar 20 08:38:12 crc kubenswrapper[4903]: I0320 08:38:12.425686 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-nmw5s" Mar 20 08:38:12 crc kubenswrapper[4903]: I0320 08:38:12.430546 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-8594cd6d8-w6ps6" Mar 20 08:38:12 crc kubenswrapper[4903]: I0320 08:38:12.451995 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-8594cd6d8-w6ps6" Mar 20 08:38:12 crc kubenswrapper[4903]: I0320 08:38:12.552606 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-h4m4s"] Mar 20 08:38:20 crc kubenswrapper[4903]: I0320 08:38:20.833753 4903 patch_prober.go:28] interesting pod/machine-config-daemon-2ndsj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 08:38:20 crc kubenswrapper[4903]: I0320 08:38:20.834453 4903 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2ndsj" podUID="0e67af70-4211-4077-8f2b-0a00b8069e5a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 08:38:22 crc kubenswrapper[4903]: I0320 08:38:22.679316 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-5f558f5558-mxxc9" Mar 20 08:38:37 crc kubenswrapper[4903]: I0320 
08:38:37.623484 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-h4m4s" podUID="f5e9a2fd-a5ab-4db3-9363-e4e94745c6ea" containerName="console" containerID="cri-o://2d1f32e81e75520ed2f0204f266642cc821238ba59691faa6f7278ed8f9124dd" gracePeriod=15 Mar 20 08:38:38 crc kubenswrapper[4903]: I0320 08:38:38.044472 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-h4m4s_f5e9a2fd-a5ab-4db3-9363-e4e94745c6ea/console/0.log" Mar 20 08:38:38 crc kubenswrapper[4903]: I0320 08:38:38.044572 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-h4m4s" Mar 20 08:38:38 crc kubenswrapper[4903]: I0320 08:38:38.110442 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f5e9a2fd-a5ab-4db3-9363-e4e94745c6ea-console-oauth-config\") pod \"f5e9a2fd-a5ab-4db3-9363-e4e94745c6ea\" (UID: \"f5e9a2fd-a5ab-4db3-9363-e4e94745c6ea\") " Mar 20 08:38:38 crc kubenswrapper[4903]: I0320 08:38:38.110525 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f5e9a2fd-a5ab-4db3-9363-e4e94745c6ea-oauth-serving-cert\") pod \"f5e9a2fd-a5ab-4db3-9363-e4e94745c6ea\" (UID: \"f5e9a2fd-a5ab-4db3-9363-e4e94745c6ea\") " Mar 20 08:38:38 crc kubenswrapper[4903]: I0320 08:38:38.110543 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f5e9a2fd-a5ab-4db3-9363-e4e94745c6ea-trusted-ca-bundle\") pod \"f5e9a2fd-a5ab-4db3-9363-e4e94745c6ea\" (UID: \"f5e9a2fd-a5ab-4db3-9363-e4e94745c6ea\") " Mar 20 08:38:38 crc kubenswrapper[4903]: I0320 08:38:38.110581 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qfhhr\" (UniqueName: \"kubernetes.io/projected/f5e9a2fd-a5ab-4db3-9363-e4e94745c6ea-kube-api-access-qfhhr\") pod \"f5e9a2fd-a5ab-4db3-9363-e4e94745c6ea\" (UID: \"f5e9a2fd-a5ab-4db3-9363-e4e94745c6ea\") " Mar 20 08:38:38 crc kubenswrapper[4903]: I0320 08:38:38.110605 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f5e9a2fd-a5ab-4db3-9363-e4e94745c6ea-console-serving-cert\") pod \"f5e9a2fd-a5ab-4db3-9363-e4e94745c6ea\" (UID: \"f5e9a2fd-a5ab-4db3-9363-e4e94745c6ea\") " Mar 20 08:38:38 crc kubenswrapper[4903]: I0320 08:38:38.110632 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f5e9a2fd-a5ab-4db3-9363-e4e94745c6ea-console-config\") pod \"f5e9a2fd-a5ab-4db3-9363-e4e94745c6ea\" (UID: \"f5e9a2fd-a5ab-4db3-9363-e4e94745c6ea\") " Mar 20 08:38:38 crc kubenswrapper[4903]: I0320 08:38:38.110684 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f5e9a2fd-a5ab-4db3-9363-e4e94745c6ea-service-ca\") pod \"f5e9a2fd-a5ab-4db3-9363-e4e94745c6ea\" (UID: \"f5e9a2fd-a5ab-4db3-9363-e4e94745c6ea\") " Mar 20 08:38:38 crc kubenswrapper[4903]: I0320 08:38:38.112249 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f5e9a2fd-a5ab-4db3-9363-e4e94745c6ea-service-ca" (OuterVolumeSpecName: "service-ca") pod "f5e9a2fd-a5ab-4db3-9363-e4e94745c6ea" (UID: 
"f5e9a2fd-a5ab-4db3-9363-e4e94745c6ea"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:38:38 crc kubenswrapper[4903]: I0320 08:38:38.112303 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f5e9a2fd-a5ab-4db3-9363-e4e94745c6ea-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "f5e9a2fd-a5ab-4db3-9363-e4e94745c6ea" (UID: "f5e9a2fd-a5ab-4db3-9363-e4e94745c6ea"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:38:38 crc kubenswrapper[4903]: I0320 08:38:38.112403 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f5e9a2fd-a5ab-4db3-9363-e4e94745c6ea-console-config" (OuterVolumeSpecName: "console-config") pod "f5e9a2fd-a5ab-4db3-9363-e4e94745c6ea" (UID: "f5e9a2fd-a5ab-4db3-9363-e4e94745c6ea"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:38:38 crc kubenswrapper[4903]: I0320 08:38:38.112454 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f5e9a2fd-a5ab-4db3-9363-e4e94745c6ea-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "f5e9a2fd-a5ab-4db3-9363-e4e94745c6ea" (UID: "f5e9a2fd-a5ab-4db3-9363-e4e94745c6ea"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:38:38 crc kubenswrapper[4903]: I0320 08:38:38.120313 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f5e9a2fd-a5ab-4db3-9363-e4e94745c6ea-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "f5e9a2fd-a5ab-4db3-9363-e4e94745c6ea" (UID: "f5e9a2fd-a5ab-4db3-9363-e4e94745c6ea"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:38:38 crc kubenswrapper[4903]: I0320 08:38:38.120408 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f5e9a2fd-a5ab-4db3-9363-e4e94745c6ea-kube-api-access-qfhhr" (OuterVolumeSpecName: "kube-api-access-qfhhr") pod "f5e9a2fd-a5ab-4db3-9363-e4e94745c6ea" (UID: "f5e9a2fd-a5ab-4db3-9363-e4e94745c6ea"). InnerVolumeSpecName "kube-api-access-qfhhr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:38:38 crc kubenswrapper[4903]: I0320 08:38:38.121317 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f5e9a2fd-a5ab-4db3-9363-e4e94745c6ea-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "f5e9a2fd-a5ab-4db3-9363-e4e94745c6ea" (UID: "f5e9a2fd-a5ab-4db3-9363-e4e94745c6ea"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:38:38 crc kubenswrapper[4903]: I0320 08:38:38.212410 4903 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f5e9a2fd-a5ab-4db3-9363-e4e94745c6ea-service-ca\") on node \"crc\" DevicePath \"\"" Mar 20 08:38:38 crc kubenswrapper[4903]: I0320 08:38:38.212931 4903 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f5e9a2fd-a5ab-4db3-9363-e4e94745c6ea-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 20 08:38:38 crc kubenswrapper[4903]: I0320 08:38:38.212959 4903 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f5e9a2fd-a5ab-4db3-9363-e4e94745c6ea-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 08:38:38 crc kubenswrapper[4903]: I0320 08:38:38.212976 4903 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f5e9a2fd-a5ab-4db3-9363-e4e94745c6ea-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 08:38:38 crc kubenswrapper[4903]: I0320 08:38:38.212991 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qfhhr\" (UniqueName: \"kubernetes.io/projected/f5e9a2fd-a5ab-4db3-9363-e4e94745c6ea-kube-api-access-qfhhr\") on node \"crc\" DevicePath \"\"" Mar 20 08:38:38 crc kubenswrapper[4903]: I0320 08:38:38.213003 4903 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f5e9a2fd-a5ab-4db3-9363-e4e94745c6ea-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 08:38:38 crc kubenswrapper[4903]: I0320 08:38:38.213013 4903 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f5e9a2fd-a5ab-4db3-9363-e4e94745c6ea-console-config\") on node \"crc\" DevicePath \"\"" Mar 20 08:38:38 crc kubenswrapper[4903]: I0320 08:38:38.653434 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-h4m4s_f5e9a2fd-a5ab-4db3-9363-e4e94745c6ea/console/0.log" Mar 20 08:38:38 crc kubenswrapper[4903]: I0320 08:38:38.653520 4903 generic.go:334] "Generic (PLEG): container finished" podID="f5e9a2fd-a5ab-4db3-9363-e4e94745c6ea" containerID="2d1f32e81e75520ed2f0204f266642cc821238ba59691faa6f7278ed8f9124dd" exitCode=2 Mar 20 08:38:38 crc kubenswrapper[4903]: I0320 08:38:38.653568 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-h4m4s" event={"ID":"f5e9a2fd-a5ab-4db3-9363-e4e94745c6ea","Type":"ContainerDied","Data":"2d1f32e81e75520ed2f0204f266642cc821238ba59691faa6f7278ed8f9124dd"} Mar 20 08:38:38 crc kubenswrapper[4903]: I0320 08:38:38.653614 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-h4m4s" event={"ID":"f5e9a2fd-a5ab-4db3-9363-e4e94745c6ea","Type":"ContainerDied","Data":"5198347c957344dd43acf7dfce9028028af1666b0835e4447e857ba5daad5ac6"} Mar 20 08:38:38 crc kubenswrapper[4903]: I0320 08:38:38.653643 4903 scope.go:117] "RemoveContainer" containerID="2d1f32e81e75520ed2f0204f266642cc821238ba59691faa6f7278ed8f9124dd" Mar 20 08:38:38 crc kubenswrapper[4903]: I0320 08:38:38.653829 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-h4m4s" Mar 20 08:38:38 crc kubenswrapper[4903]: I0320 08:38:38.699935 4903 scope.go:117] "RemoveContainer" containerID="2d1f32e81e75520ed2f0204f266642cc821238ba59691faa6f7278ed8f9124dd" Mar 20 08:38:38 crc kubenswrapper[4903]: E0320 08:38:38.705897 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2d1f32e81e75520ed2f0204f266642cc821238ba59691faa6f7278ed8f9124dd\": container with ID starting with 2d1f32e81e75520ed2f0204f266642cc821238ba59691faa6f7278ed8f9124dd not found: ID does not exist" containerID="2d1f32e81e75520ed2f0204f266642cc821238ba59691faa6f7278ed8f9124dd" Mar 20 08:38:38 crc kubenswrapper[4903]: I0320 08:38:38.705962 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d1f32e81e75520ed2f0204f266642cc821238ba59691faa6f7278ed8f9124dd"} err="failed to get container status \"2d1f32e81e75520ed2f0204f266642cc821238ba59691faa6f7278ed8f9124dd\": rpc error: code = NotFound desc = could not find container \"2d1f32e81e75520ed2f0204f266642cc821238ba59691faa6f7278ed8f9124dd\": container with ID starting with 2d1f32e81e75520ed2f0204f266642cc821238ba59691faa6f7278ed8f9124dd not found: ID does not exist" Mar 20 08:38:38 crc kubenswrapper[4903]: I0320 08:38:38.713540 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-h4m4s"] Mar 20 08:38:38 crc kubenswrapper[4903]: I0320 08:38:38.718818 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-h4m4s"] Mar 20 08:38:39 crc kubenswrapper[4903]: I0320 08:38:39.266094 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1shdbr"] Mar 20 08:38:39 crc kubenswrapper[4903]: E0320 08:38:39.266609 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5e9a2fd-a5ab-4db3-9363-e4e94745c6ea" containerName="console" Mar 20 08:38:39 crc kubenswrapper[4903]: I0320 08:38:39.266703 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5e9a2fd-a5ab-4db3-9363-e4e94745c6ea" containerName="console" Mar 20 08:38:39 crc kubenswrapper[4903]: E0320 08:38:39.266795 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04aba9e4-217f-489a-9c45-dde90c447aa6" containerName="oc" Mar 20 08:38:39 crc kubenswrapper[4903]: I0320 08:38:39.266864 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="04aba9e4-217f-489a-9c45-dde90c447aa6" containerName="oc" Mar 20 08:38:39 crc kubenswrapper[4903]: I0320 08:38:39.267128 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="f5e9a2fd-a5ab-4db3-9363-e4e94745c6ea" containerName="console" Mar 20 08:38:39 crc kubenswrapper[4903]: I0320 08:38:39.267229 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="04aba9e4-217f-489a-9c45-dde90c447aa6" containerName="oc" Mar 20 08:38:39 crc kubenswrapper[4903]: I0320 08:38:39.268265 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1shdbr" Mar 20 08:38:39 crc kubenswrapper[4903]: I0320 08:38:39.270808 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 20 08:38:39 crc kubenswrapper[4903]: I0320 08:38:39.283348 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1shdbr"] Mar 20 08:38:39 crc kubenswrapper[4903]: I0320 08:38:39.330060 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7xwqq\" (UniqueName: \"kubernetes.io/projected/d285db6b-c23c-4ad7-90e8-833756b96ec5-kube-api-access-7xwqq\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1shdbr\" (UID: \"d285db6b-c23c-4ad7-90e8-833756b96ec5\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1shdbr" Mar 20 08:38:39 crc kubenswrapper[4903]: I0320 08:38:39.330147 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d285db6b-c23c-4ad7-90e8-833756b96ec5-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1shdbr\" (UID: \"d285db6b-c23c-4ad7-90e8-833756b96ec5\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1shdbr" Mar 20 08:38:39 crc kubenswrapper[4903]: I0320 08:38:39.330196 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d285db6b-c23c-4ad7-90e8-833756b96ec5-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1shdbr\" (UID: \"d285db6b-c23c-4ad7-90e8-833756b96ec5\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1shdbr" Mar 20 08:38:39 crc kubenswrapper[4903]: I0320 08:38:39.431600 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d285db6b-c23c-4ad7-90e8-833756b96ec5-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1shdbr\" (UID: \"d285db6b-c23c-4ad7-90e8-833756b96ec5\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1shdbr" Mar 20 08:38:39 crc kubenswrapper[4903]: I0320 08:38:39.431972 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d285db6b-c23c-4ad7-90e8-833756b96ec5-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1shdbr\" (UID: \"d285db6b-c23c-4ad7-90e8-833756b96ec5\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1shdbr" Mar 20 08:38:39 crc kubenswrapper[4903]: I0320 08:38:39.432075 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7xwqq\" (UniqueName: \"kubernetes.io/projected/d285db6b-c23c-4ad7-90e8-833756b96ec5-kube-api-access-7xwqq\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1shdbr\" (UID: \"d285db6b-c23c-4ad7-90e8-833756b96ec5\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1shdbr" Mar 20 08:38:39 crc kubenswrapper[4903]: I0320 08:38:39.432294 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/d285db6b-c23c-4ad7-90e8-833756b96ec5-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1shdbr\" (UID: \"d285db6b-c23c-4ad7-90e8-833756b96ec5\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1shdbr" Mar 20 08:38:39 crc kubenswrapper[4903]: I0320 08:38:39.432757 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d285db6b-c23c-4ad7-90e8-833756b96ec5-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1shdbr\" (UID: \"d285db6b-c23c-4ad7-90e8-833756b96ec5\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1shdbr" Mar 20 08:38:39 crc kubenswrapper[4903]: I0320 08:38:39.462562 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7xwqq\" (UniqueName: \"kubernetes.io/projected/d285db6b-c23c-4ad7-90e8-833756b96ec5-kube-api-access-7xwqq\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1shdbr\" (UID: \"d285db6b-c23c-4ad7-90e8-833756b96ec5\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1shdbr" Mar 20 08:38:39 crc kubenswrapper[4903]: I0320 08:38:39.504276 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f5e9a2fd-a5ab-4db3-9363-e4e94745c6ea" path="/var/lib/kubelet/pods/f5e9a2fd-a5ab-4db3-9363-e4e94745c6ea/volumes" Mar 20 08:38:39 crc kubenswrapper[4903]: I0320 08:38:39.585436 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1shdbr" Mar 20 08:38:39 crc kubenswrapper[4903]: I0320 08:38:39.830554 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1shdbr"] Mar 20 08:38:40 crc kubenswrapper[4903]: I0320 08:38:40.672245 4903 generic.go:334] "Generic (PLEG): container finished" podID="d285db6b-c23c-4ad7-90e8-833756b96ec5" containerID="2274d21cb0c33d0d8a090f46feb867fd3ace2d3e28cff3045e42c3ba9c228b54" exitCode=0 Mar 20 08:38:40 crc kubenswrapper[4903]: I0320 08:38:40.672390 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1shdbr" event={"ID":"d285db6b-c23c-4ad7-90e8-833756b96ec5","Type":"ContainerDied","Data":"2274d21cb0c33d0d8a090f46feb867fd3ace2d3e28cff3045e42c3ba9c228b54"} Mar 20 08:38:40 crc kubenswrapper[4903]: I0320 08:38:40.673382 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1shdbr" event={"ID":"d285db6b-c23c-4ad7-90e8-833756b96ec5","Type":"ContainerStarted","Data":"58f3c85144d0cb5a4374bdff4c7d93c8f174d5cf34049a19cf5feccf47bc1650"} Mar 20 08:38:42 crc kubenswrapper[4903]: I0320 08:38:42.690706 4903 generic.go:334] "Generic (PLEG): container finished" podID="d285db6b-c23c-4ad7-90e8-833756b96ec5" containerID="45647723d96a93137838af984790c69d768b7057db8646a49f3d458941ca047a" exitCode=0 Mar 20 08:38:42 crc kubenswrapper[4903]: I0320 08:38:42.690785 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1shdbr" event={"ID":"d285db6b-c23c-4ad7-90e8-833756b96ec5","Type":"ContainerDied","Data":"45647723d96a93137838af984790c69d768b7057db8646a49f3d458941ca047a"} Mar 20 08:38:43 crc kubenswrapper[4903]: I0320 
08:38:43.703478 4903 generic.go:334] "Generic (PLEG): container finished" podID="d285db6b-c23c-4ad7-90e8-833756b96ec5" containerID="95fbd2323671d04673df011ae1c623bcd876b38ba9c10c414d3a5f72c08ec403" exitCode=0 Mar 20 08:38:43 crc kubenswrapper[4903]: I0320 08:38:43.703621 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1shdbr" event={"ID":"d285db6b-c23c-4ad7-90e8-833756b96ec5","Type":"ContainerDied","Data":"95fbd2323671d04673df011ae1c623bcd876b38ba9c10c414d3a5f72c08ec403"} Mar 20 08:38:44 crc kubenswrapper[4903]: I0320 08:38:44.996335 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1shdbr" Mar 20 08:38:45 crc kubenswrapper[4903]: I0320 08:38:45.020466 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d285db6b-c23c-4ad7-90e8-833756b96ec5-util\") pod \"d285db6b-c23c-4ad7-90e8-833756b96ec5\" (UID: \"d285db6b-c23c-4ad7-90e8-833756b96ec5\") " Mar 20 08:38:45 crc kubenswrapper[4903]: I0320 08:38:45.020529 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d285db6b-c23c-4ad7-90e8-833756b96ec5-bundle\") pod \"d285db6b-c23c-4ad7-90e8-833756b96ec5\" (UID: \"d285db6b-c23c-4ad7-90e8-833756b96ec5\") " Mar 20 08:38:45 crc kubenswrapper[4903]: I0320 08:38:45.020606 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7xwqq\" (UniqueName: \"kubernetes.io/projected/d285db6b-c23c-4ad7-90e8-833756b96ec5-kube-api-access-7xwqq\") pod \"d285db6b-c23c-4ad7-90e8-833756b96ec5\" (UID: \"d285db6b-c23c-4ad7-90e8-833756b96ec5\") " Mar 20 08:38:45 crc kubenswrapper[4903]: I0320 08:38:45.025322 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d285db6b-c23c-4ad7-90e8-833756b96ec5-bundle" (OuterVolumeSpecName: "bundle") pod "d285db6b-c23c-4ad7-90e8-833756b96ec5" (UID: "d285db6b-c23c-4ad7-90e8-833756b96ec5"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 08:38:45 crc kubenswrapper[4903]: I0320 08:38:45.029813 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d285db6b-c23c-4ad7-90e8-833756b96ec5-kube-api-access-7xwqq" (OuterVolumeSpecName: "kube-api-access-7xwqq") pod "d285db6b-c23c-4ad7-90e8-833756b96ec5" (UID: "d285db6b-c23c-4ad7-90e8-833756b96ec5"). InnerVolumeSpecName "kube-api-access-7xwqq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:38:45 crc kubenswrapper[4903]: I0320 08:38:45.045835 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d285db6b-c23c-4ad7-90e8-833756b96ec5-util" (OuterVolumeSpecName: "util") pod "d285db6b-c23c-4ad7-90e8-833756b96ec5" (UID: "d285db6b-c23c-4ad7-90e8-833756b96ec5"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 08:38:45 crc kubenswrapper[4903]: I0320 08:38:45.122199 4903 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d285db6b-c23c-4ad7-90e8-833756b96ec5-util\") on node \"crc\" DevicePath \"\"" Mar 20 08:38:45 crc kubenswrapper[4903]: I0320 08:38:45.122236 4903 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d285db6b-c23c-4ad7-90e8-833756b96ec5-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 08:38:45 crc kubenswrapper[4903]: I0320 08:38:45.122248 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7xwqq\" (UniqueName: \"kubernetes.io/projected/d285db6b-c23c-4ad7-90e8-833756b96ec5-kube-api-access-7xwqq\") on node \"crc\" DevicePath \"\"" Mar 20 08:38:45 crc kubenswrapper[4903]: I0320 08:38:45.721428 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1shdbr" event={"ID":"d285db6b-c23c-4ad7-90e8-833756b96ec5","Type":"ContainerDied","Data":"58f3c85144d0cb5a4374bdff4c7d93c8f174d5cf34049a19cf5feccf47bc1650"} Mar 20 08:38:45 crc kubenswrapper[4903]: I0320 08:38:45.721497 4903 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="58f3c85144d0cb5a4374bdff4c7d93c8f174d5cf34049a19cf5feccf47bc1650" Mar 20 08:38:45 crc kubenswrapper[4903]: I0320 08:38:45.721561 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1shdbr" Mar 20 08:38:50 crc kubenswrapper[4903]: I0320 08:38:50.833786 4903 patch_prober.go:28] interesting pod/machine-config-daemon-2ndsj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 08:38:50 crc kubenswrapper[4903]: I0320 08:38:50.834756 4903 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2ndsj" podUID="0e67af70-4211-4077-8f2b-0a00b8069e5a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 08:38:50 crc kubenswrapper[4903]: I0320 08:38:50.834837 4903 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2ndsj" Mar 20 08:38:50 crc kubenswrapper[4903]: I0320 08:38:50.835717 4903 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5d4eaa665a94ad4629d1882b24e933477ef11d2d020096b0cd7d0be400cb4301"} pod="openshift-machine-config-operator/machine-config-daemon-2ndsj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 08:38:50 crc kubenswrapper[4903]: I0320 08:38:50.835789 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2ndsj" podUID="0e67af70-4211-4077-8f2b-0a00b8069e5a" containerName="machine-config-daemon" containerID="cri-o://5d4eaa665a94ad4629d1882b24e933477ef11d2d020096b0cd7d0be400cb4301" gracePeriod=600 Mar 20 08:38:51 crc kubenswrapper[4903]: I0320 08:38:51.761552 4903 generic.go:334] "Generic (PLEG): container finished" 
podID="0e67af70-4211-4077-8f2b-0a00b8069e5a" containerID="5d4eaa665a94ad4629d1882b24e933477ef11d2d020096b0cd7d0be400cb4301" exitCode=0 Mar 20 08:38:51 crc kubenswrapper[4903]: I0320 08:38:51.761642 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2ndsj" event={"ID":"0e67af70-4211-4077-8f2b-0a00b8069e5a","Type":"ContainerDied","Data":"5d4eaa665a94ad4629d1882b24e933477ef11d2d020096b0cd7d0be400cb4301"} Mar 20 08:38:51 crc kubenswrapper[4903]: I0320 08:38:51.762060 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2ndsj" event={"ID":"0e67af70-4211-4077-8f2b-0a00b8069e5a","Type":"ContainerStarted","Data":"0c24277c1ea9806e81aa4981e7afee5bd67d933c16e4a264dbdf97f39e69ac1c"} Mar 20 08:38:51 crc kubenswrapper[4903]: I0320 08:38:51.762115 4903 scope.go:117] "RemoveContainer" containerID="d4e808adab9ca608f4234405ae0edc403a12d33d968b79be08acd008cc246023" Mar 20 08:38:54 crc kubenswrapper[4903]: I0320 08:38:54.709660 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-64f5dc6cf7-q7h6d"] Mar 20 08:38:54 crc kubenswrapper[4903]: E0320 08:38:54.710765 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d285db6b-c23c-4ad7-90e8-833756b96ec5" containerName="util" Mar 20 08:38:54 crc kubenswrapper[4903]: I0320 08:38:54.710782 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="d285db6b-c23c-4ad7-90e8-833756b96ec5" containerName="util" Mar 20 08:38:54 crc kubenswrapper[4903]: E0320 08:38:54.710809 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d285db6b-c23c-4ad7-90e8-833756b96ec5" containerName="pull" Mar 20 08:38:54 crc kubenswrapper[4903]: I0320 08:38:54.710816 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="d285db6b-c23c-4ad7-90e8-833756b96ec5" containerName="pull" Mar 20 08:38:54 crc kubenswrapper[4903]: E0320 08:38:54.710824 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d285db6b-c23c-4ad7-90e8-833756b96ec5" containerName="extract" Mar 20 08:38:54 crc kubenswrapper[4903]: I0320 08:38:54.710831 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="d285db6b-c23c-4ad7-90e8-833756b96ec5" containerName="extract" Mar 20 08:38:54 crc kubenswrapper[4903]: I0320 08:38:54.710936 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="d285db6b-c23c-4ad7-90e8-833756b96ec5" containerName="extract" Mar 20 08:38:54 crc kubenswrapper[4903]: I0320 08:38:54.711440 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-64f5dc6cf7-q7h6d" Mar 20 08:38:54 crc kubenswrapper[4903]: I0320 08:38:54.714820 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Mar 20 08:38:54 crc kubenswrapper[4903]: I0320 08:38:54.715985 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Mar 20 08:38:54 crc kubenswrapper[4903]: I0320 08:38:54.716120 4903 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Mar 20 08:38:54 crc kubenswrapper[4903]: I0320 08:38:54.716120 4903 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Mar 20 08:38:54 crc kubenswrapper[4903]: I0320 08:38:54.718466 4903 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-hxq8k" Mar 20 08:38:54 crc kubenswrapper[4903]: I0320 08:38:54.736481 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-64f5dc6cf7-q7h6d"] Mar 20 08:38:54 crc kubenswrapper[4903]: I0320 08:38:54.780443 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kbfmm\" (UniqueName: \"kubernetes.io/projected/a8ee6e81-8c56-4c20-a9cf-ca2b4eb7650d-kube-api-access-kbfmm\") pod \"metallb-operator-controller-manager-64f5dc6cf7-q7h6d\" (UID: \"a8ee6e81-8c56-4c20-a9cf-ca2b4eb7650d\") " pod="metallb-system/metallb-operator-controller-manager-64f5dc6cf7-q7h6d" Mar 20 08:38:54 crc kubenswrapper[4903]: I0320 08:38:54.780720 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a8ee6e81-8c56-4c20-a9cf-ca2b4eb7650d-webhook-cert\") pod \"metallb-operator-controller-manager-64f5dc6cf7-q7h6d\" (UID: \"a8ee6e81-8c56-4c20-a9cf-ca2b4eb7650d\") " pod="metallb-system/metallb-operator-controller-manager-64f5dc6cf7-q7h6d" Mar 20 08:38:54 crc kubenswrapper[4903]: I0320 08:38:54.780806 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a8ee6e81-8c56-4c20-a9cf-ca2b4eb7650d-apiservice-cert\") pod \"metallb-operator-controller-manager-64f5dc6cf7-q7h6d\" (UID: \"a8ee6e81-8c56-4c20-a9cf-ca2b4eb7650d\") " pod="metallb-system/metallb-operator-controller-manager-64f5dc6cf7-q7h6d" Mar 20 08:38:54 crc kubenswrapper[4903]: I0320 08:38:54.882146 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a8ee6e81-8c56-4c20-a9cf-ca2b4eb7650d-webhook-cert\") pod \"metallb-operator-controller-manager-64f5dc6cf7-q7h6d\" (UID: \"a8ee6e81-8c56-4c20-a9cf-ca2b4eb7650d\") " pod="metallb-system/metallb-operator-controller-manager-64f5dc6cf7-q7h6d" Mar 20 08:38:54 crc kubenswrapper[4903]: I0320 08:38:54.882215 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a8ee6e81-8c56-4c20-a9cf-ca2b4eb7650d-apiservice-cert\") pod \"metallb-operator-controller-manager-64f5dc6cf7-q7h6d\" (UID: \"a8ee6e81-8c56-4c20-a9cf-ca2b4eb7650d\") " pod="metallb-system/metallb-operator-controller-manager-64f5dc6cf7-q7h6d" Mar 20 08:38:54 crc kubenswrapper[4903]: I0320 08:38:54.882298 4903 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kbfmm\" (UniqueName: \"kubernetes.io/projected/a8ee6e81-8c56-4c20-a9cf-ca2b4eb7650d-kube-api-access-kbfmm\") pod \"metallb-operator-controller-manager-64f5dc6cf7-q7h6d\" (UID: \"a8ee6e81-8c56-4c20-a9cf-ca2b4eb7650d\") " pod="metallb-system/metallb-operator-controller-manager-64f5dc6cf7-q7h6d" Mar 20 08:38:54 crc kubenswrapper[4903]: I0320 08:38:54.891904 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a8ee6e81-8c56-4c20-a9cf-ca2b4eb7650d-apiservice-cert\") pod \"metallb-operator-controller-manager-64f5dc6cf7-q7h6d\" (UID: \"a8ee6e81-8c56-4c20-a9cf-ca2b4eb7650d\") " pod="metallb-system/metallb-operator-controller-manager-64f5dc6cf7-q7h6d" Mar 20 08:38:54 crc kubenswrapper[4903]: I0320 08:38:54.892157 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a8ee6e81-8c56-4c20-a9cf-ca2b4eb7650d-webhook-cert\") pod \"metallb-operator-controller-manager-64f5dc6cf7-q7h6d\" (UID: \"a8ee6e81-8c56-4c20-a9cf-ca2b4eb7650d\") " pod="metallb-system/metallb-operator-controller-manager-64f5dc6cf7-q7h6d" Mar 20 08:38:54 crc kubenswrapper[4903]: I0320 08:38:54.904691 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kbfmm\" (UniqueName: \"kubernetes.io/projected/a8ee6e81-8c56-4c20-a9cf-ca2b4eb7650d-kube-api-access-kbfmm\") pod \"metallb-operator-controller-manager-64f5dc6cf7-q7h6d\" (UID: \"a8ee6e81-8c56-4c20-a9cf-ca2b4eb7650d\") " pod="metallb-system/metallb-operator-controller-manager-64f5dc6cf7-q7h6d" Mar 20 08:38:55 crc kubenswrapper[4903]: I0320 08:38:55.028251 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-64f5dc6cf7-q7h6d" Mar 20 08:38:55 crc kubenswrapper[4903]: I0320 08:38:55.189207 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-5f5d9d49d6-jzxvx"] Mar 20 08:38:55 crc kubenswrapper[4903]: I0320 08:38:55.194495 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-5f5d9d49d6-jzxvx" Mar 20 08:38:55 crc kubenswrapper[4903]: I0320 08:38:55.200677 4903 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Mar 20 08:38:55 crc kubenswrapper[4903]: I0320 08:38:55.201157 4903 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Mar 20 08:38:55 crc kubenswrapper[4903]: I0320 08:38:55.201326 4903 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-kxxd8" Mar 20 08:38:55 crc kubenswrapper[4903]: I0320 08:38:55.204014 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-5f5d9d49d6-jzxvx"] Mar 20 08:38:55 crc kubenswrapper[4903]: I0320 08:38:55.291059 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a7fb8f81-902b-4e48-b0d3-a7404373c5af-webhook-cert\") pod \"metallb-operator-webhook-server-5f5d9d49d6-jzxvx\" (UID: \"a7fb8f81-902b-4e48-b0d3-a7404373c5af\") " pod="metallb-system/metallb-operator-webhook-server-5f5d9d49d6-jzxvx" Mar 20 08:38:55 crc kubenswrapper[4903]: I0320 08:38:55.291514 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r9gt9\" (UniqueName: \"kubernetes.io/projected/a7fb8f81-902b-4e48-b0d3-a7404373c5af-kube-api-access-r9gt9\") pod \"metallb-operator-webhook-server-5f5d9d49d6-jzxvx\" (UID: \"a7fb8f81-902b-4e48-b0d3-a7404373c5af\") " pod="metallb-system/metallb-operator-webhook-server-5f5d9d49d6-jzxvx" Mar 20 08:38:55 crc kubenswrapper[4903]: I0320 08:38:55.291544 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a7fb8f81-902b-4e48-b0d3-a7404373c5af-apiservice-cert\") pod \"metallb-operator-webhook-server-5f5d9d49d6-jzxvx\" (UID: \"a7fb8f81-902b-4e48-b0d3-a7404373c5af\") " pod="metallb-system/metallb-operator-webhook-server-5f5d9d49d6-jzxvx" Mar 20 08:38:55 crc kubenswrapper[4903]: I0320 08:38:55.344453 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-64f5dc6cf7-q7h6d"] Mar 20 08:38:55 crc kubenswrapper[4903]: I0320 08:38:55.392585 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a7fb8f81-902b-4e48-b0d3-a7404373c5af-webhook-cert\") pod \"metallb-operator-webhook-server-5f5d9d49d6-jzxvx\" (UID: \"a7fb8f81-902b-4e48-b0d3-a7404373c5af\") " pod="metallb-system/metallb-operator-webhook-server-5f5d9d49d6-jzxvx" Mar 20 08:38:55 crc kubenswrapper[4903]: I0320 08:38:55.392682 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r9gt9\" (UniqueName: \"kubernetes.io/projected/a7fb8f81-902b-4e48-b0d3-a7404373c5af-kube-api-access-r9gt9\") pod \"metallb-operator-webhook-server-5f5d9d49d6-jzxvx\" (UID: \"a7fb8f81-902b-4e48-b0d3-a7404373c5af\") " pod="metallb-system/metallb-operator-webhook-server-5f5d9d49d6-jzxvx" Mar 20 08:38:55 crc kubenswrapper[4903]: I0320 08:38:55.392713 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a7fb8f81-902b-4e48-b0d3-a7404373c5af-apiservice-cert\") pod 
\"metallb-operator-webhook-server-5f5d9d49d6-jzxvx\" (UID: \"a7fb8f81-902b-4e48-b0d3-a7404373c5af\") " pod="metallb-system/metallb-operator-webhook-server-5f5d9d49d6-jzxvx" Mar 20 08:38:55 crc kubenswrapper[4903]: I0320 08:38:55.394733 4903 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Mar 20 08:38:55 crc kubenswrapper[4903]: I0320 08:38:55.412659 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a7fb8f81-902b-4e48-b0d3-a7404373c5af-apiservice-cert\") pod \"metallb-operator-webhook-server-5f5d9d49d6-jzxvx\" (UID: \"a7fb8f81-902b-4e48-b0d3-a7404373c5af\") " pod="metallb-system/metallb-operator-webhook-server-5f5d9d49d6-jzxvx" Mar 20 08:38:55 crc kubenswrapper[4903]: I0320 08:38:55.412777 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a7fb8f81-902b-4e48-b0d3-a7404373c5af-webhook-cert\") pod \"metallb-operator-webhook-server-5f5d9d49d6-jzxvx\" (UID: \"a7fb8f81-902b-4e48-b0d3-a7404373c5af\") " pod="metallb-system/metallb-operator-webhook-server-5f5d9d49d6-jzxvx" Mar 20 08:38:55 crc kubenswrapper[4903]: I0320 08:38:55.412852 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r9gt9\" (UniqueName: \"kubernetes.io/projected/a7fb8f81-902b-4e48-b0d3-a7404373c5af-kube-api-access-r9gt9\") pod \"metallb-operator-webhook-server-5f5d9d49d6-jzxvx\" (UID: \"a7fb8f81-902b-4e48-b0d3-a7404373c5af\") " pod="metallb-system/metallb-operator-webhook-server-5f5d9d49d6-jzxvx" Mar 20 08:38:55 crc kubenswrapper[4903]: I0320 08:38:55.568404 4903 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-kxxd8" Mar 20 08:38:55 crc kubenswrapper[4903]: I0320 08:38:55.568566 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-5f5d9d49d6-jzxvx" Mar 20 08:38:55 crc kubenswrapper[4903]: I0320 08:38:55.795526 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-64f5dc6cf7-q7h6d" event={"ID":"a8ee6e81-8c56-4c20-a9cf-ca2b4eb7650d","Type":"ContainerStarted","Data":"6ab5c59cefdce524f1f68537e28220c2b242433b3b6f108684e24061c850339e"} Mar 20 08:38:56 crc kubenswrapper[4903]: I0320 08:38:56.057835 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-5f5d9d49d6-jzxvx"] Mar 20 08:38:56 crc kubenswrapper[4903]: W0320 08:38:56.070866 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda7fb8f81_902b_4e48_b0d3_a7404373c5af.slice/crio-590f8cff23c61e2e1970370337fdf855b39b5a06905a9accf5c921b93bf0e107 WatchSource:0}: Error finding container 590f8cff23c61e2e1970370337fdf855b39b5a06905a9accf5c921b93bf0e107: Status 404 returned error can't find the container with id 590f8cff23c61e2e1970370337fdf855b39b5a06905a9accf5c921b93bf0e107 Mar 20 08:38:56 crc kubenswrapper[4903]: I0320 08:38:56.712340 4903 scope.go:117] "RemoveContainer" containerID="096b2f20a01d60e5013375f69c38f59a73a6f708def95ccb0592a5e2ce5f7afd" Mar 20 08:38:56 crc kubenswrapper[4903]: I0320 08:38:56.807849 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-5f5d9d49d6-jzxvx" event={"ID":"a7fb8f81-902b-4e48-b0d3-a7404373c5af","Type":"ContainerStarted","Data":"590f8cff23c61e2e1970370337fdf855b39b5a06905a9accf5c921b93bf0e107"} Mar 20 08:39:00 crc kubenswrapper[4903]: I0320 08:39:00.848998 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-5f5d9d49d6-jzxvx" event={"ID":"a7fb8f81-902b-4e48-b0d3-a7404373c5af","Type":"ContainerStarted","Data":"f8f9c0cce495cb4fc8a51ecdc0d4b8ce20af9f395870e68e88af32b9eb070c43"} Mar 20 08:39:00 crc kubenswrapper[4903]: I0320 08:39:00.849948 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-5f5d9d49d6-jzxvx" Mar 20 08:39:00 crc kubenswrapper[4903]: I0320 08:39:00.854438 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-64f5dc6cf7-q7h6d" event={"ID":"a8ee6e81-8c56-4c20-a9cf-ca2b4eb7650d","Type":"ContainerStarted","Data":"352c413a15d17af76e2c7b8da8f4056bad25039431b14b93ff4d0f5c4ed0748e"} Mar 20 08:39:00 crc kubenswrapper[4903]: I0320 08:39:00.854682 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-64f5dc6cf7-q7h6d" Mar 20 08:39:00 crc kubenswrapper[4903]: I0320 08:39:00.878056 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-5f5d9d49d6-jzxvx" podStartSLOduration=1.294534898 podStartE2EDuration="5.878019163s" podCreationTimestamp="2026-03-20 08:38:55 +0000 UTC" firstStartedPulling="2026-03-20 08:38:56.075163991 +0000 UTC m=+961.292064306" lastFinishedPulling="2026-03-20 08:39:00.658648226 +0000 UTC m=+965.875548571" observedRunningTime="2026-03-20 08:39:00.873303813 +0000 UTC m=+966.090204128" watchObservedRunningTime="2026-03-20 08:39:00.878019163 +0000 UTC m=+966.094919468" Mar 20 08:39:00 crc kubenswrapper[4903]: I0320 08:39:00.901201 4903 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="metallb-system/metallb-operator-controller-manager-64f5dc6cf7-q7h6d" podStartSLOduration=1.61275122 podStartE2EDuration="6.901177447s" podCreationTimestamp="2026-03-20 08:38:54 +0000 UTC" firstStartedPulling="2026-03-20 08:38:55.362557845 +0000 UTC m=+960.579458160" lastFinishedPulling="2026-03-20 08:39:00.650984072 +0000 UTC m=+965.867884387" observedRunningTime="2026-03-20 08:39:00.896205641 +0000 UTC m=+966.113105966" watchObservedRunningTime="2026-03-20 08:39:00.901177447 +0000 UTC m=+966.118077772" Mar 20 08:39:15 crc kubenswrapper[4903]: I0320 08:39:15.575908 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-5f5d9d49d6-jzxvx" Mar 20 08:39:35 crc kubenswrapper[4903]: I0320 08:39:35.033512 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-64f5dc6cf7-q7h6d" Mar 20 08:39:35 crc kubenswrapper[4903]: I0320 08:39:35.934224 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-lds62"] Mar 20 08:39:35 crc kubenswrapper[4903]: I0320 08:39:35.936878 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-lds62" Mar 20 08:39:35 crc kubenswrapper[4903]: I0320 08:39:35.939506 4903 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Mar 20 08:39:35 crc kubenswrapper[4903]: I0320 08:39:35.939829 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Mar 20 08:39:35 crc kubenswrapper[4903]: I0320 08:39:35.940414 4903 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-9hscf" Mar 20 08:39:35 crc kubenswrapper[4903]: I0320 08:39:35.967645 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-k9569"] Mar 20 08:39:35 crc kubenswrapper[4903]: I0320 08:39:35.968561 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-k9569" Mar 20 08:39:35 crc kubenswrapper[4903]: I0320 08:39:35.971153 4903 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Mar 20 08:39:35 crc kubenswrapper[4903]: I0320 08:39:35.998211 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-k9569"] Mar 20 08:39:36 crc kubenswrapper[4903]: I0320 08:39:36.030996 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-jws2v"] Mar 20 08:39:36 crc kubenswrapper[4903]: I0320 08:39:36.031906 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-jws2v" Mar 20 08:39:36 crc kubenswrapper[4903]: I0320 08:39:36.033556 4903 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Mar 20 08:39:36 crc kubenswrapper[4903]: I0320 08:39:36.033629 4903 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-ktpkv" Mar 20 08:39:36 crc kubenswrapper[4903]: I0320 08:39:36.035312 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Mar 20 08:39:36 crc kubenswrapper[4903]: I0320 08:39:36.035598 4903 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Mar 20 08:39:36 crc kubenswrapper[4903]: I0320 08:39:36.051563 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-msq2g\" (UniqueName: \"kubernetes.io/projected/af40b4fc-a05d-4df1-a47e-0d316a679275-kube-api-access-msq2g\") pod \"frr-k8s-lds62\" (UID: \"af40b4fc-a05d-4df1-a47e-0d316a679275\") " pod="metallb-system/frr-k8s-lds62" Mar 20 08:39:36 crc kubenswrapper[4903]: I0320 08:39:36.051627 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/af40b4fc-a05d-4df1-a47e-0d316a679275-frr-sockets\") pod \"frr-k8s-lds62\" (UID: \"af40b4fc-a05d-4df1-a47e-0d316a679275\") " pod="metallb-system/frr-k8s-lds62" Mar 20 08:39:36 crc kubenswrapper[4903]: I0320 08:39:36.051677 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/af40b4fc-a05d-4df1-a47e-0d316a679275-metrics-certs\") pod \"frr-k8s-lds62\" (UID: \"af40b4fc-a05d-4df1-a47e-0d316a679275\") " pod="metallb-system/frr-k8s-lds62" Mar 20 08:39:36 crc kubenswrapper[4903]: I0320 08:39:36.051704 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/af40b4fc-a05d-4df1-a47e-0d316a679275-frr-startup\") pod \"frr-k8s-lds62\" (UID: \"af40b4fc-a05d-4df1-a47e-0d316a679275\") " pod="metallb-system/frr-k8s-lds62" Mar 20 08:39:36 crc kubenswrapper[4903]: I0320 08:39:36.051722 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/af40b4fc-a05d-4df1-a47e-0d316a679275-reloader\") pod \"frr-k8s-lds62\" (UID: \"af40b4fc-a05d-4df1-a47e-0d316a679275\") " pod="metallb-system/frr-k8s-lds62" Mar 20 08:39:36 crc kubenswrapper[4903]: I0320 08:39:36.051748 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/af40b4fc-a05d-4df1-a47e-0d316a679275-metrics\") pod \"frr-k8s-lds62\" (UID: \"af40b4fc-a05d-4df1-a47e-0d316a679275\") " pod="metallb-system/frr-k8s-lds62" Mar 20 08:39:36 crc kubenswrapper[4903]: I0320 08:39:36.051767 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/af40b4fc-a05d-4df1-a47e-0d316a679275-frr-conf\") pod \"frr-k8s-lds62\" (UID: \"af40b4fc-a05d-4df1-a47e-0d316a679275\") " pod="metallb-system/frr-k8s-lds62" Mar 20 08:39:36 crc kubenswrapper[4903]: I0320 08:39:36.051910 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-99j26\" (UniqueName: \"kubernetes.io/projected/f09db154-41d6-4b8f-a224-727c22b90f78-kube-api-access-99j26\") pod \"frr-k8s-webhook-server-bcc4b6f68-k9569\" (UID: \"f09db154-41d6-4b8f-a224-727c22b90f78\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-k9569" Mar 20 08:39:36 crc kubenswrapper[4903]: I0320 08:39:36.051961 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f09db154-41d6-4b8f-a224-727c22b90f78-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-k9569\" (UID: \"f09db154-41d6-4b8f-a224-727c22b90f78\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-k9569" Mar 20 08:39:36 crc kubenswrapper[4903]: I0320 08:39:36.061696 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-7bb4cc7c98-kqcq5"] Mar 20 08:39:36 crc kubenswrapper[4903]: I0320 08:39:36.062912 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-7bb4cc7c98-kqcq5" Mar 20 08:39:36 crc kubenswrapper[4903]: I0320 08:39:36.065452 4903 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Mar 20 08:39:36 crc kubenswrapper[4903]: I0320 08:39:36.077647 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-7bb4cc7c98-kqcq5"] Mar 20 08:39:36 crc kubenswrapper[4903]: I0320 08:39:36.153418 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/af40b4fc-a05d-4df1-a47e-0d316a679275-metrics\") pod \"frr-k8s-lds62\" (UID: \"af40b4fc-a05d-4df1-a47e-0d316a679275\") " pod="metallb-system/frr-k8s-lds62" Mar 20 08:39:36 crc kubenswrapper[4903]: I0320 08:39:36.153468 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jmmlm\" (UniqueName: \"kubernetes.io/projected/0b93abcf-779f-44af-b566-b66a7052ceef-kube-api-access-jmmlm\") pod \"controller-7bb4cc7c98-kqcq5\" (UID: \"0b93abcf-779f-44af-b566-b66a7052ceef\") " pod="metallb-system/controller-7bb4cc7c98-kqcq5" Mar 20 08:39:36 crc kubenswrapper[4903]: I0320 08:39:36.153496 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0b93abcf-779f-44af-b566-b66a7052ceef-metrics-certs\") pod \"controller-7bb4cc7c98-kqcq5\" (UID: \"0b93abcf-779f-44af-b566-b66a7052ceef\") " pod="metallb-system/controller-7bb4cc7c98-kqcq5" Mar 20 08:39:36 crc kubenswrapper[4903]: I0320 08:39:36.153519 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/af40b4fc-a05d-4df1-a47e-0d316a679275-frr-conf\") pod \"frr-k8s-lds62\" (UID: \"af40b4fc-a05d-4df1-a47e-0d316a679275\") " pod="metallb-system/frr-k8s-lds62" Mar 20 08:39:36 crc kubenswrapper[4903]: I0320 08:39:36.153555 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-99j26\" (UniqueName: \"kubernetes.io/projected/f09db154-41d6-4b8f-a224-727c22b90f78-kube-api-access-99j26\") pod \"frr-k8s-webhook-server-bcc4b6f68-k9569\" (UID: \"f09db154-41d6-4b8f-a224-727c22b90f78\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-k9569" Mar 20 08:39:36 crc kubenswrapper[4903]: I0320 08:39:36.153575 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/f09db154-41d6-4b8f-a224-727c22b90f78-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-k9569\" (UID: \"f09db154-41d6-4b8f-a224-727c22b90f78\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-k9569" Mar 20 08:39:36 crc kubenswrapper[4903]: I0320 08:39:36.153599 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d7ssp\" (UniqueName: \"kubernetes.io/projected/5ea82220-7527-4132-bd06-3db8c79850d3-kube-api-access-d7ssp\") pod \"speaker-jws2v\" (UID: \"5ea82220-7527-4132-bd06-3db8c79850d3\") " pod="metallb-system/speaker-jws2v" Mar 20 08:39:36 crc kubenswrapper[4903]: I0320 08:39:36.153617 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5ea82220-7527-4132-bd06-3db8c79850d3-metrics-certs\") pod \"speaker-jws2v\" (UID: \"5ea82220-7527-4132-bd06-3db8c79850d3\") " pod="metallb-system/speaker-jws2v" Mar 20 08:39:36 crc kubenswrapper[4903]: I0320 08:39:36.153640 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/5ea82220-7527-4132-bd06-3db8c79850d3-metallb-excludel2\") pod \"speaker-jws2v\" (UID: \"5ea82220-7527-4132-bd06-3db8c79850d3\") " pod="metallb-system/speaker-jws2v" Mar 20 08:39:36 crc kubenswrapper[4903]: I0320 08:39:36.153661 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-msq2g\" (UniqueName: \"kubernetes.io/projected/af40b4fc-a05d-4df1-a47e-0d316a679275-kube-api-access-msq2g\") pod \"frr-k8s-lds62\" (UID: \"af40b4fc-a05d-4df1-a47e-0d316a679275\") " pod="metallb-system/frr-k8s-lds62" Mar 20 08:39:36 crc kubenswrapper[4903]: I0320 08:39:36.153681 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/5ea82220-7527-4132-bd06-3db8c79850d3-memberlist\") pod \"speaker-jws2v\" (UID: \"5ea82220-7527-4132-bd06-3db8c79850d3\") " pod="metallb-system/speaker-jws2v" Mar 20 08:39:36 crc kubenswrapper[4903]: I0320 08:39:36.153703 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/af40b4fc-a05d-4df1-a47e-0d316a679275-frr-sockets\") pod \"frr-k8s-lds62\" (UID: \"af40b4fc-a05d-4df1-a47e-0d316a679275\") " pod="metallb-system/frr-k8s-lds62" Mar 20 08:39:36 crc kubenswrapper[4903]: I0320 08:39:36.153734 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/af40b4fc-a05d-4df1-a47e-0d316a679275-metrics-certs\") pod \"frr-k8s-lds62\" (UID: \"af40b4fc-a05d-4df1-a47e-0d316a679275\") " pod="metallb-system/frr-k8s-lds62" Mar 20 08:39:36 crc kubenswrapper[4903]: I0320 08:39:36.153754 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/af40b4fc-a05d-4df1-a47e-0d316a679275-frr-startup\") pod \"frr-k8s-lds62\" (UID: \"af40b4fc-a05d-4df1-a47e-0d316a679275\") " pod="metallb-system/frr-k8s-lds62" Mar 20 08:39:36 crc kubenswrapper[4903]: I0320 08:39:36.153788 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/af40b4fc-a05d-4df1-a47e-0d316a679275-reloader\") pod \"frr-k8s-lds62\" (UID: \"af40b4fc-a05d-4df1-a47e-0d316a679275\") " 
pod="metallb-system/frr-k8s-lds62" Mar 20 08:39:36 crc kubenswrapper[4903]: I0320 08:39:36.153808 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0b93abcf-779f-44af-b566-b66a7052ceef-cert\") pod \"controller-7bb4cc7c98-kqcq5\" (UID: \"0b93abcf-779f-44af-b566-b66a7052ceef\") " pod="metallb-system/controller-7bb4cc7c98-kqcq5" Mar 20 08:39:36 crc kubenswrapper[4903]: I0320 08:39:36.154066 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/af40b4fc-a05d-4df1-a47e-0d316a679275-metrics\") pod \"frr-k8s-lds62\" (UID: \"af40b4fc-a05d-4df1-a47e-0d316a679275\") " pod="metallb-system/frr-k8s-lds62" Mar 20 08:39:36 crc kubenswrapper[4903]: I0320 08:39:36.154174 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/af40b4fc-a05d-4df1-a47e-0d316a679275-frr-conf\") pod \"frr-k8s-lds62\" (UID: \"af40b4fc-a05d-4df1-a47e-0d316a679275\") " pod="metallb-system/frr-k8s-lds62" Mar 20 08:39:36 crc kubenswrapper[4903]: I0320 08:39:36.154366 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/af40b4fc-a05d-4df1-a47e-0d316a679275-frr-sockets\") pod \"frr-k8s-lds62\" (UID: \"af40b4fc-a05d-4df1-a47e-0d316a679275\") " pod="metallb-system/frr-k8s-lds62" Mar 20 08:39:36 crc kubenswrapper[4903]: I0320 08:39:36.155502 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/af40b4fc-a05d-4df1-a47e-0d316a679275-reloader\") pod \"frr-k8s-lds62\" (UID: \"af40b4fc-a05d-4df1-a47e-0d316a679275\") " pod="metallb-system/frr-k8s-lds62" Mar 20 08:39:36 crc kubenswrapper[4903]: I0320 08:39:36.156297 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/af40b4fc-a05d-4df1-a47e-0d316a679275-frr-startup\") pod \"frr-k8s-lds62\" (UID: \"af40b4fc-a05d-4df1-a47e-0d316a679275\") " pod="metallb-system/frr-k8s-lds62" Mar 20 08:39:36 crc kubenswrapper[4903]: I0320 08:39:36.161773 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/af40b4fc-a05d-4df1-a47e-0d316a679275-metrics-certs\") pod \"frr-k8s-lds62\" (UID: \"af40b4fc-a05d-4df1-a47e-0d316a679275\") " pod="metallb-system/frr-k8s-lds62" Mar 20 08:39:36 crc kubenswrapper[4903]: I0320 08:39:36.163481 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f09db154-41d6-4b8f-a224-727c22b90f78-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-k9569\" (UID: \"f09db154-41d6-4b8f-a224-727c22b90f78\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-k9569" Mar 20 08:39:36 crc kubenswrapper[4903]: I0320 08:39:36.180265 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-99j26\" (UniqueName: \"kubernetes.io/projected/f09db154-41d6-4b8f-a224-727c22b90f78-kube-api-access-99j26\") pod \"frr-k8s-webhook-server-bcc4b6f68-k9569\" (UID: \"f09db154-41d6-4b8f-a224-727c22b90f78\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-k9569" Mar 20 08:39:36 crc kubenswrapper[4903]: I0320 08:39:36.188489 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-msq2g\" (UniqueName: 
\"kubernetes.io/projected/af40b4fc-a05d-4df1-a47e-0d316a679275-kube-api-access-msq2g\") pod \"frr-k8s-lds62\" (UID: \"af40b4fc-a05d-4df1-a47e-0d316a679275\") " pod="metallb-system/frr-k8s-lds62" Mar 20 08:39:36 crc kubenswrapper[4903]: I0320 08:39:36.254824 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0b93abcf-779f-44af-b566-b66a7052ceef-cert\") pod \"controller-7bb4cc7c98-kqcq5\" (UID: \"0b93abcf-779f-44af-b566-b66a7052ceef\") " pod="metallb-system/controller-7bb4cc7c98-kqcq5" Mar 20 08:39:36 crc kubenswrapper[4903]: I0320 08:39:36.254919 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jmmlm\" (UniqueName: \"kubernetes.io/projected/0b93abcf-779f-44af-b566-b66a7052ceef-kube-api-access-jmmlm\") pod \"controller-7bb4cc7c98-kqcq5\" (UID: \"0b93abcf-779f-44af-b566-b66a7052ceef\") " pod="metallb-system/controller-7bb4cc7c98-kqcq5" Mar 20 08:39:36 crc kubenswrapper[4903]: I0320 08:39:36.254940 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0b93abcf-779f-44af-b566-b66a7052ceef-metrics-certs\") pod \"controller-7bb4cc7c98-kqcq5\" (UID: \"0b93abcf-779f-44af-b566-b66a7052ceef\") " pod="metallb-system/controller-7bb4cc7c98-kqcq5" Mar 20 08:39:36 crc kubenswrapper[4903]: I0320 08:39:36.254987 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d7ssp\" (UniqueName: \"kubernetes.io/projected/5ea82220-7527-4132-bd06-3db8c79850d3-kube-api-access-d7ssp\") pod \"speaker-jws2v\" (UID: \"5ea82220-7527-4132-bd06-3db8c79850d3\") " pod="metallb-system/speaker-jws2v" Mar 20 08:39:36 crc kubenswrapper[4903]: I0320 08:39:36.255006 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5ea82220-7527-4132-bd06-3db8c79850d3-metrics-certs\") pod \"speaker-jws2v\" (UID: \"5ea82220-7527-4132-bd06-3db8c79850d3\") " pod="metallb-system/speaker-jws2v" Mar 20 08:39:36 crc kubenswrapper[4903]: I0320 08:39:36.255052 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/5ea82220-7527-4132-bd06-3db8c79850d3-metallb-excludel2\") pod \"speaker-jws2v\" (UID: \"5ea82220-7527-4132-bd06-3db8c79850d3\") " pod="metallb-system/speaker-jws2v" Mar 20 08:39:36 crc kubenswrapper[4903]: I0320 08:39:36.255072 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/5ea82220-7527-4132-bd06-3db8c79850d3-memberlist\") pod \"speaker-jws2v\" (UID: \"5ea82220-7527-4132-bd06-3db8c79850d3\") " pod="metallb-system/speaker-jws2v" Mar 20 08:39:36 crc kubenswrapper[4903]: E0320 08:39:36.255211 4903 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Mar 20 08:39:36 crc kubenswrapper[4903]: E0320 08:39:36.255266 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5ea82220-7527-4132-bd06-3db8c79850d3-memberlist podName:5ea82220-7527-4132-bd06-3db8c79850d3 nodeName:}" failed. No retries permitted until 2026-03-20 08:39:36.755246141 +0000 UTC m=+1001.972146456 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/5ea82220-7527-4132-bd06-3db8c79850d3-memberlist") pod "speaker-jws2v" (UID: "5ea82220-7527-4132-bd06-3db8c79850d3") : secret "metallb-memberlist" not found Mar 20 08:39:36 crc kubenswrapper[4903]: E0320 08:39:36.255412 4903 secret.go:188] Couldn't get secret metallb-system/speaker-certs-secret: secret "speaker-certs-secret" not found Mar 20 08:39:36 crc kubenswrapper[4903]: E0320 08:39:36.255530 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5ea82220-7527-4132-bd06-3db8c79850d3-metrics-certs podName:5ea82220-7527-4132-bd06-3db8c79850d3 nodeName:}" failed. No retries permitted until 2026-03-20 08:39:36.755498858 +0000 UTC m=+1001.972399203 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5ea82220-7527-4132-bd06-3db8c79850d3-metrics-certs") pod "speaker-jws2v" (UID: "5ea82220-7527-4132-bd06-3db8c79850d3") : secret "speaker-certs-secret" not found Mar 20 08:39:36 crc kubenswrapper[4903]: I0320 08:39:36.256371 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/5ea82220-7527-4132-bd06-3db8c79850d3-metallb-excludel2\") pod \"speaker-jws2v\" (UID: \"5ea82220-7527-4132-bd06-3db8c79850d3\") " pod="metallb-system/speaker-jws2v" Mar 20 08:39:36 crc kubenswrapper[4903]: I0320 08:39:36.257283 4903 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Mar 20 08:39:36 crc kubenswrapper[4903]: I0320 08:39:36.257722 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-lds62" Mar 20 08:39:36 crc kubenswrapper[4903]: I0320 08:39:36.259740 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0b93abcf-779f-44af-b566-b66a7052ceef-metrics-certs\") pod \"controller-7bb4cc7c98-kqcq5\" (UID: \"0b93abcf-779f-44af-b566-b66a7052ceef\") " pod="metallb-system/controller-7bb4cc7c98-kqcq5" Mar 20 08:39:36 crc kubenswrapper[4903]: I0320 08:39:36.270305 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d7ssp\" (UniqueName: \"kubernetes.io/projected/5ea82220-7527-4132-bd06-3db8c79850d3-kube-api-access-d7ssp\") pod \"speaker-jws2v\" (UID: \"5ea82220-7527-4132-bd06-3db8c79850d3\") " pod="metallb-system/speaker-jws2v" Mar 20 08:39:36 crc kubenswrapper[4903]: I0320 08:39:36.272510 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0b93abcf-779f-44af-b566-b66a7052ceef-cert\") pod \"controller-7bb4cc7c98-kqcq5\" (UID: \"0b93abcf-779f-44af-b566-b66a7052ceef\") " pod="metallb-system/controller-7bb4cc7c98-kqcq5" Mar 20 08:39:36 crc kubenswrapper[4903]: I0320 08:39:36.278143 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jmmlm\" (UniqueName: \"kubernetes.io/projected/0b93abcf-779f-44af-b566-b66a7052ceef-kube-api-access-jmmlm\") pod \"controller-7bb4cc7c98-kqcq5\" (UID: \"0b93abcf-779f-44af-b566-b66a7052ceef\") " pod="metallb-system/controller-7bb4cc7c98-kqcq5" Mar 20 08:39:36 crc kubenswrapper[4903]: I0320 08:39:36.283436 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-k9569" Mar 20 08:39:36 crc kubenswrapper[4903]: I0320 08:39:36.375220 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-7bb4cc7c98-kqcq5" Mar 20 08:39:36 crc kubenswrapper[4903]: I0320 08:39:36.627077 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-k9569"] Mar 20 08:39:36 crc kubenswrapper[4903]: I0320 08:39:36.769271 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5ea82220-7527-4132-bd06-3db8c79850d3-metrics-certs\") pod \"speaker-jws2v\" (UID: \"5ea82220-7527-4132-bd06-3db8c79850d3\") " pod="metallb-system/speaker-jws2v" Mar 20 08:39:36 crc kubenswrapper[4903]: I0320 08:39:36.769481 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/5ea82220-7527-4132-bd06-3db8c79850d3-memberlist\") pod \"speaker-jws2v\" (UID: \"5ea82220-7527-4132-bd06-3db8c79850d3\") " pod="metallb-system/speaker-jws2v" Mar 20 08:39:36 crc kubenswrapper[4903]: E0320 08:39:36.769618 4903 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Mar 20 08:39:36 crc kubenswrapper[4903]: E0320 08:39:36.769690 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5ea82220-7527-4132-bd06-3db8c79850d3-memberlist podName:5ea82220-7527-4132-bd06-3db8c79850d3 nodeName:}" failed. No retries permitted until 2026-03-20 08:39:37.769671582 +0000 UTC m=+1002.986571897 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/5ea82220-7527-4132-bd06-3db8c79850d3-memberlist") pod "speaker-jws2v" (UID: "5ea82220-7527-4132-bd06-3db8c79850d3") : secret "metallb-memberlist" not found Mar 20 08:39:36 crc kubenswrapper[4903]: I0320 08:39:36.782267 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5ea82220-7527-4132-bd06-3db8c79850d3-metrics-certs\") pod \"speaker-jws2v\" (UID: \"5ea82220-7527-4132-bd06-3db8c79850d3\") " pod="metallb-system/speaker-jws2v" Mar 20 08:39:36 crc kubenswrapper[4903]: I0320 08:39:36.919650 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-7bb4cc7c98-kqcq5"] Mar 20 08:39:36 crc kubenswrapper[4903]: W0320 08:39:36.921956 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0b93abcf_779f_44af_b566_b66a7052ceef.slice/crio-81d5516cf5749e83fae486a15012e2a05da90a1bca2a227a2f54811458db24a6 WatchSource:0}: Error finding container 81d5516cf5749e83fae486a15012e2a05da90a1bca2a227a2f54811458db24a6: Status 404 returned error can't find the container with id 81d5516cf5749e83fae486a15012e2a05da90a1bca2a227a2f54811458db24a6 Mar 20 08:39:37 crc kubenswrapper[4903]: I0320 08:39:37.155968 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-kqcq5" event={"ID":"0b93abcf-779f-44af-b566-b66a7052ceef","Type":"ContainerStarted","Data":"ac42d0f8383e923e9b408e0f319e79fad4d3ac3ad97c0f1795584a50de8ee1f4"} Mar 20 08:39:37 crc kubenswrapper[4903]: I0320 08:39:37.158000 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-kqcq5" 
event={"ID":"0b93abcf-779f-44af-b566-b66a7052ceef","Type":"ContainerStarted","Data":"81d5516cf5749e83fae486a15012e2a05da90a1bca2a227a2f54811458db24a6"} Mar 20 08:39:37 crc kubenswrapper[4903]: I0320 08:39:37.158973 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-k9569" event={"ID":"f09db154-41d6-4b8f-a224-727c22b90f78","Type":"ContainerStarted","Data":"7d57c7c895ce7f8070fc390dd6c8d7f662bbf004fc74e13c7f9a26f6c2067c51"} Mar 20 08:39:37 crc kubenswrapper[4903]: I0320 08:39:37.160534 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-lds62" event={"ID":"af40b4fc-a05d-4df1-a47e-0d316a679275","Type":"ContainerStarted","Data":"fb0feba4a684c7abc020126eff25a43331ad39e928a9f732da186c4096f9bf8f"} Mar 20 08:39:37 crc kubenswrapper[4903]: I0320 08:39:37.784454 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/5ea82220-7527-4132-bd06-3db8c79850d3-memberlist\") pod \"speaker-jws2v\" (UID: \"5ea82220-7527-4132-bd06-3db8c79850d3\") " pod="metallb-system/speaker-jws2v" Mar 20 08:39:37 crc kubenswrapper[4903]: I0320 08:39:37.799306 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/5ea82220-7527-4132-bd06-3db8c79850d3-memberlist\") pod \"speaker-jws2v\" (UID: \"5ea82220-7527-4132-bd06-3db8c79850d3\") " pod="metallb-system/speaker-jws2v" Mar 20 08:39:37 crc kubenswrapper[4903]: I0320 08:39:37.845772 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-jws2v" Mar 20 08:39:37 crc kubenswrapper[4903]: W0320 08:39:37.871871 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5ea82220_7527_4132_bd06_3db8c79850d3.slice/crio-9cfe5601e663e04cad46802bfda1a928d81bf319489e67ab9ef2cba4113c3818 WatchSource:0}: Error finding container 9cfe5601e663e04cad46802bfda1a928d81bf319489e67ab9ef2cba4113c3818: Status 404 returned error can't find the container with id 9cfe5601e663e04cad46802bfda1a928d81bf319489e67ab9ef2cba4113c3818 Mar 20 08:39:38 crc kubenswrapper[4903]: I0320 08:39:38.171519 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-jws2v" event={"ID":"5ea82220-7527-4132-bd06-3db8c79850d3","Type":"ContainerStarted","Data":"804a391a93853add52bef53e27bbaa0cad918b22908fcd4e54f544d4c4c800f9"} Mar 20 08:39:38 crc kubenswrapper[4903]: I0320 08:39:38.171968 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-jws2v" event={"ID":"5ea82220-7527-4132-bd06-3db8c79850d3","Type":"ContainerStarted","Data":"9cfe5601e663e04cad46802bfda1a928d81bf319489e67ab9ef2cba4113c3818"} Mar 20 08:39:38 crc kubenswrapper[4903]: I0320 08:39:38.173641 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-kqcq5" event={"ID":"0b93abcf-779f-44af-b566-b66a7052ceef","Type":"ContainerStarted","Data":"554e448a5bda1235a8928446fde31e64cbe79fad3fb51e793251d5a6e1ac41c3"} Mar 20 08:39:38 crc kubenswrapper[4903]: I0320 08:39:38.173997 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-7bb4cc7c98-kqcq5" Mar 20 08:39:39 crc kubenswrapper[4903]: I0320 08:39:39.189764 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-jws2v" 
event={"ID":"5ea82220-7527-4132-bd06-3db8c79850d3","Type":"ContainerStarted","Data":"c401ea2dee819946397e823f931dcaf648063fcff462b61e2bb1812071e5752e"} Mar 20 08:39:39 crc kubenswrapper[4903]: I0320 08:39:39.189941 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-jws2v" Mar 20 08:39:39 crc kubenswrapper[4903]: I0320 08:39:39.220822 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-jws2v" podStartSLOduration=3.220795969 podStartE2EDuration="3.220795969s" podCreationTimestamp="2026-03-20 08:39:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:39:39.216518406 +0000 UTC m=+1004.433418731" watchObservedRunningTime="2026-03-20 08:39:39.220795969 +0000 UTC m=+1004.437696284" Mar 20 08:39:39 crc kubenswrapper[4903]: I0320 08:39:39.222377 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-7bb4cc7c98-kqcq5" podStartSLOduration=3.222371028 podStartE2EDuration="3.222371028s" podCreationTimestamp="2026-03-20 08:39:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:39:38.192945482 +0000 UTC m=+1003.409845797" watchObservedRunningTime="2026-03-20 08:39:39.222371028 +0000 UTC m=+1004.439271353" Mar 20 08:39:44 crc kubenswrapper[4903]: I0320 08:39:44.235239 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-k9569" event={"ID":"f09db154-41d6-4b8f-a224-727c22b90f78","Type":"ContainerStarted","Data":"c73b805e2d7f1c66c496430036fafed80d55ab2511273edea01843d7771a2cdf"} Mar 20 08:39:44 crc kubenswrapper[4903]: I0320 08:39:44.238011 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-k9569" Mar 20 08:39:44 crc kubenswrapper[4903]: I0320 08:39:44.241605 4903 generic.go:334] "Generic (PLEG): container finished" podID="af40b4fc-a05d-4df1-a47e-0d316a679275" containerID="12bf743c95b3299b31fb63e04b19d4f45aec903d3687b6753f5d36b06f1e06ba" exitCode=0 Mar 20 08:39:44 crc kubenswrapper[4903]: I0320 08:39:44.241693 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-lds62" event={"ID":"af40b4fc-a05d-4df1-a47e-0d316a679275","Type":"ContainerDied","Data":"12bf743c95b3299b31fb63e04b19d4f45aec903d3687b6753f5d36b06f1e06ba"} Mar 20 08:39:44 crc kubenswrapper[4903]: I0320 08:39:44.268848 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-k9569" podStartSLOduration=1.896871158 podStartE2EDuration="9.268820301s" podCreationTimestamp="2026-03-20 08:39:35 +0000 UTC" firstStartedPulling="2026-03-20 08:39:36.633356855 +0000 UTC m=+1001.850257170" lastFinishedPulling="2026-03-20 08:39:44.005305988 +0000 UTC m=+1009.222206313" observedRunningTime="2026-03-20 08:39:44.26552498 +0000 UTC m=+1009.482425335" watchObservedRunningTime="2026-03-20 08:39:44.268820301 +0000 UTC m=+1009.485720656" Mar 20 08:39:45 crc kubenswrapper[4903]: I0320 08:39:45.254696 4903 generic.go:334] "Generic (PLEG): container finished" podID="af40b4fc-a05d-4df1-a47e-0d316a679275" containerID="3221e723690bc99966273225841a63e803f57605046804b4052980ac5c3334c4" exitCode=0 Mar 20 08:39:45 crc kubenswrapper[4903]: I0320 08:39:45.254833 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="metallb-system/frr-k8s-lds62" event={"ID":"af40b4fc-a05d-4df1-a47e-0d316a679275","Type":"ContainerDied","Data":"3221e723690bc99966273225841a63e803f57605046804b4052980ac5c3334c4"} Mar 20 08:39:46 crc kubenswrapper[4903]: I0320 08:39:46.266240 4903 generic.go:334] "Generic (PLEG): container finished" podID="af40b4fc-a05d-4df1-a47e-0d316a679275" containerID="c35c437130e40cd996e529c30e72e1beeb21c8138b0af427d966323c6240397f" exitCode=0 Mar 20 08:39:46 crc kubenswrapper[4903]: I0320 08:39:46.266356 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-lds62" event={"ID":"af40b4fc-a05d-4df1-a47e-0d316a679275","Type":"ContainerDied","Data":"c35c437130e40cd996e529c30e72e1beeb21c8138b0af427d966323c6240397f"} Mar 20 08:39:47 crc kubenswrapper[4903]: I0320 08:39:47.280947 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-lds62" event={"ID":"af40b4fc-a05d-4df1-a47e-0d316a679275","Type":"ContainerStarted","Data":"380f8cef5f6f68428d15f021d3dc27a7009c410992c4eaf86f4c54eeb3bcfc42"} Mar 20 08:39:47 crc kubenswrapper[4903]: I0320 08:39:47.281466 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-lds62" event={"ID":"af40b4fc-a05d-4df1-a47e-0d316a679275","Type":"ContainerStarted","Data":"fe86ed927b652d36fc749cfc75040c9ebaf219ccfb4f89461594e62e63a60140"} Mar 20 08:39:47 crc kubenswrapper[4903]: I0320 08:39:47.281498 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-lds62" event={"ID":"af40b4fc-a05d-4df1-a47e-0d316a679275","Type":"ContainerStarted","Data":"2ddb28fbaaac29a661961aec886d902f0cab3d039a96dc09b9f593d2ec087080"} Mar 20 08:39:47 crc kubenswrapper[4903]: I0320 08:39:47.281524 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-lds62" event={"ID":"af40b4fc-a05d-4df1-a47e-0d316a679275","Type":"ContainerStarted","Data":"48037bebcd47e2a2cffce709f3ecd5ed35d7037eb8fc70e052601dee190f6cf5"} Mar 20 08:39:47 crc kubenswrapper[4903]: I0320 08:39:47.281549 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-lds62" event={"ID":"af40b4fc-a05d-4df1-a47e-0d316a679275","Type":"ContainerStarted","Data":"a0e7e4558baaa475932dcf6f0601224a6f0ee6bc3e1218aa2b373d9da784a54e"} Mar 20 08:39:48 crc kubenswrapper[4903]: I0320 08:39:48.296684 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-lds62" event={"ID":"af40b4fc-a05d-4df1-a47e-0d316a679275","Type":"ContainerStarted","Data":"2a700cd0569092174ae714bc1ec210d94d06f18c45cc5f2c65b0dbc4759f4f19"} Mar 20 08:39:48 crc kubenswrapper[4903]: I0320 08:39:48.296918 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-lds62" Mar 20 08:39:48 crc kubenswrapper[4903]: I0320 08:39:48.340569 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-lds62" podStartSLOduration=5.862721869 podStartE2EDuration="13.340545881s" podCreationTimestamp="2026-03-20 08:39:35 +0000 UTC" firstStartedPulling="2026-03-20 08:39:36.503448794 +0000 UTC m=+1001.720349109" lastFinishedPulling="2026-03-20 08:39:43.981272806 +0000 UTC m=+1009.198173121" observedRunningTime="2026-03-20 08:39:48.336146904 +0000 UTC m=+1013.553047299" watchObservedRunningTime="2026-03-20 08:39:48.340545881 +0000 UTC m=+1013.557446206" Mar 20 08:39:51 crc kubenswrapper[4903]: I0320 08:39:51.258083 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-lds62" Mar 20 08:39:51 crc 
kubenswrapper[4903]: I0320 08:39:51.313273 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-lds62" Mar 20 08:39:56 crc kubenswrapper[4903]: I0320 08:39:56.263084 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-lds62" Mar 20 08:39:56 crc kubenswrapper[4903]: I0320 08:39:56.291274 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-k9569" Mar 20 08:39:56 crc kubenswrapper[4903]: I0320 08:39:56.382224 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-7bb4cc7c98-kqcq5" Mar 20 08:39:57 crc kubenswrapper[4903]: I0320 08:39:57.856308 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-jws2v" Mar 20 08:39:59 crc kubenswrapper[4903]: I0320 08:39:59.618509 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5rk9fc"] Mar 20 08:39:59 crc kubenswrapper[4903]: I0320 08:39:59.619849 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5rk9fc" Mar 20 08:39:59 crc kubenswrapper[4903]: I0320 08:39:59.622164 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 20 08:39:59 crc kubenswrapper[4903]: I0320 08:39:59.632584 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5rk9fc"] Mar 20 08:39:59 crc kubenswrapper[4903]: I0320 08:39:59.664391 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/00f81c35-6107-4e09-982a-ad82eef8735b-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5rk9fc\" (UID: \"00f81c35-6107-4e09-982a-ad82eef8735b\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5rk9fc" Mar 20 08:39:59 crc kubenswrapper[4903]: I0320 08:39:59.664834 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/00f81c35-6107-4e09-982a-ad82eef8735b-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5rk9fc\" (UID: \"00f81c35-6107-4e09-982a-ad82eef8735b\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5rk9fc" Mar 20 08:39:59 crc kubenswrapper[4903]: I0320 08:39:59.665003 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5xp4q\" (UniqueName: \"kubernetes.io/projected/00f81c35-6107-4e09-982a-ad82eef8735b-kube-api-access-5xp4q\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5rk9fc\" (UID: \"00f81c35-6107-4e09-982a-ad82eef8735b\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5rk9fc" Mar 20 08:39:59 crc kubenswrapper[4903]: I0320 08:39:59.766961 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/00f81c35-6107-4e09-982a-ad82eef8735b-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5rk9fc\" (UID: \"00f81c35-6107-4e09-982a-ad82eef8735b\") " 
pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5rk9fc" Mar 20 08:39:59 crc kubenswrapper[4903]: I0320 08:39:59.767595 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5xp4q\" (UniqueName: \"kubernetes.io/projected/00f81c35-6107-4e09-982a-ad82eef8735b-kube-api-access-5xp4q\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5rk9fc\" (UID: \"00f81c35-6107-4e09-982a-ad82eef8735b\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5rk9fc" Mar 20 08:39:59 crc kubenswrapper[4903]: I0320 08:39:59.767973 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/00f81c35-6107-4e09-982a-ad82eef8735b-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5rk9fc\" (UID: \"00f81c35-6107-4e09-982a-ad82eef8735b\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5rk9fc" Mar 20 08:39:59 crc kubenswrapper[4903]: I0320 08:39:59.767991 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/00f81c35-6107-4e09-982a-ad82eef8735b-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5rk9fc\" (UID: \"00f81c35-6107-4e09-982a-ad82eef8735b\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5rk9fc" Mar 20 08:39:59 crc kubenswrapper[4903]: I0320 08:39:59.769083 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/00f81c35-6107-4e09-982a-ad82eef8735b-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5rk9fc\" (UID: \"00f81c35-6107-4e09-982a-ad82eef8735b\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5rk9fc" Mar 20 08:39:59 crc kubenswrapper[4903]: I0320 08:39:59.791914 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5xp4q\" (UniqueName: \"kubernetes.io/projected/00f81c35-6107-4e09-982a-ad82eef8735b-kube-api-access-5xp4q\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5rk9fc\" (UID: \"00f81c35-6107-4e09-982a-ad82eef8735b\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5rk9fc" Mar 20 08:39:59 crc kubenswrapper[4903]: I0320 08:39:59.940918 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5rk9fc" Mar 20 08:40:00 crc kubenswrapper[4903]: I0320 08:40:00.152806 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566600-hz2sg"] Mar 20 08:40:00 crc kubenswrapper[4903]: I0320 08:40:00.153643 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566600-hz2sg" Mar 20 08:40:00 crc kubenswrapper[4903]: I0320 08:40:00.156141 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-smc2q" Mar 20 08:40:00 crc kubenswrapper[4903]: I0320 08:40:00.158008 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 08:40:00 crc kubenswrapper[4903]: I0320 08:40:00.158168 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 08:40:00 crc kubenswrapper[4903]: I0320 08:40:00.177531 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lrhhd\" (UniqueName: \"kubernetes.io/projected/5bac3c78-5b74-45b3-99a8-72d6f8b75d6c-kube-api-access-lrhhd\") pod \"auto-csr-approver-29566600-hz2sg\" (UID: \"5bac3c78-5b74-45b3-99a8-72d6f8b75d6c\") " pod="openshift-infra/auto-csr-approver-29566600-hz2sg" Mar 20 08:40:00 crc kubenswrapper[4903]: I0320 08:40:00.180763 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566600-hz2sg"] Mar 20 08:40:00 crc kubenswrapper[4903]: I0320 08:40:00.279634 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lrhhd\" (UniqueName: \"kubernetes.io/projected/5bac3c78-5b74-45b3-99a8-72d6f8b75d6c-kube-api-access-lrhhd\") pod \"auto-csr-approver-29566600-hz2sg\" (UID: \"5bac3c78-5b74-45b3-99a8-72d6f8b75d6c\") " pod="openshift-infra/auto-csr-approver-29566600-hz2sg" Mar 20 08:40:00 crc kubenswrapper[4903]: I0320 08:40:00.308979 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lrhhd\" (UniqueName: \"kubernetes.io/projected/5bac3c78-5b74-45b3-99a8-72d6f8b75d6c-kube-api-access-lrhhd\") pod \"auto-csr-approver-29566600-hz2sg\" (UID: \"5bac3c78-5b74-45b3-99a8-72d6f8b75d6c\") " pod="openshift-infra/auto-csr-approver-29566600-hz2sg" Mar 20 08:40:00 crc kubenswrapper[4903]: I0320 08:40:00.313315 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5rk9fc"] Mar 20 08:40:00 crc kubenswrapper[4903]: I0320 08:40:00.399865 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5rk9fc" event={"ID":"00f81c35-6107-4e09-982a-ad82eef8735b","Type":"ContainerStarted","Data":"bd3a40241759651bff8671b5735ca35540e9419bb132ecfd1a93dec577a07080"} Mar 20 08:40:00 crc kubenswrapper[4903]: I0320 08:40:00.492771 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566600-hz2sg" Mar 20 08:40:00 crc kubenswrapper[4903]: I0320 08:40:00.734562 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566600-hz2sg"] Mar 20 08:40:00 crc kubenswrapper[4903]: W0320 08:40:00.749961 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5bac3c78_5b74_45b3_99a8_72d6f8b75d6c.slice/crio-5ed473ce0ead6339591a2b36c1535848d3e97f50d7213f79607f136665b499aa WatchSource:0}: Error finding container 5ed473ce0ead6339591a2b36c1535848d3e97f50d7213f79607f136665b499aa: Status 404 returned error can't find the container with id 5ed473ce0ead6339591a2b36c1535848d3e97f50d7213f79607f136665b499aa Mar 20 08:40:01 crc kubenswrapper[4903]: I0320 08:40:01.411397 4903 generic.go:334] "Generic (PLEG): container finished" podID="00f81c35-6107-4e09-982a-ad82eef8735b" containerID="613c505ccb3d770ed83e609d02891ffc0c5ec4900e784cd50c4faa8abf0f8211" exitCode=0 Mar 20 08:40:01 crc kubenswrapper[4903]: I0320 08:40:01.411484 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5rk9fc" event={"ID":"00f81c35-6107-4e09-982a-ad82eef8735b","Type":"ContainerDied","Data":"613c505ccb3d770ed83e609d02891ffc0c5ec4900e784cd50c4faa8abf0f8211"} Mar 20 08:40:01 crc kubenswrapper[4903]: I0320 08:40:01.414799 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566600-hz2sg" event={"ID":"5bac3c78-5b74-45b3-99a8-72d6f8b75d6c","Type":"ContainerStarted","Data":"5ed473ce0ead6339591a2b36c1535848d3e97f50d7213f79607f136665b499aa"} Mar 20 08:40:02 crc kubenswrapper[4903]: I0320 08:40:02.430826 4903 generic.go:334] "Generic (PLEG): container finished" podID="5bac3c78-5b74-45b3-99a8-72d6f8b75d6c" containerID="a3f295ce1810316f363f8d39f44fa7b3cdd0c100d6a0e205aaefff5415b5eea8" exitCode=0 Mar 20 08:40:02 crc kubenswrapper[4903]: I0320 08:40:02.430905 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566600-hz2sg" event={"ID":"5bac3c78-5b74-45b3-99a8-72d6f8b75d6c","Type":"ContainerDied","Data":"a3f295ce1810316f363f8d39f44fa7b3cdd0c100d6a0e205aaefff5415b5eea8"} Mar 20 08:40:04 crc kubenswrapper[4903]: I0320 08:40:04.805259 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566600-hz2sg" Mar 20 08:40:04 crc kubenswrapper[4903]: I0320 08:40:04.851783 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lrhhd\" (UniqueName: \"kubernetes.io/projected/5bac3c78-5b74-45b3-99a8-72d6f8b75d6c-kube-api-access-lrhhd\") pod \"5bac3c78-5b74-45b3-99a8-72d6f8b75d6c\" (UID: \"5bac3c78-5b74-45b3-99a8-72d6f8b75d6c\") " Mar 20 08:40:04 crc kubenswrapper[4903]: I0320 08:40:04.866236 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5bac3c78-5b74-45b3-99a8-72d6f8b75d6c-kube-api-access-lrhhd" (OuterVolumeSpecName: "kube-api-access-lrhhd") pod "5bac3c78-5b74-45b3-99a8-72d6f8b75d6c" (UID: "5bac3c78-5b74-45b3-99a8-72d6f8b75d6c"). InnerVolumeSpecName "kube-api-access-lrhhd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:40:04 crc kubenswrapper[4903]: I0320 08:40:04.954257 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lrhhd\" (UniqueName: \"kubernetes.io/projected/5bac3c78-5b74-45b3-99a8-72d6f8b75d6c-kube-api-access-lrhhd\") on node \"crc\" DevicePath \"\"" Mar 20 08:40:05 crc kubenswrapper[4903]: I0320 08:40:05.457253 4903 generic.go:334] "Generic (PLEG): container finished" podID="00f81c35-6107-4e09-982a-ad82eef8735b" containerID="079915920f84dd58d50521e8557fb80a33a1d441b728db99d57cf9e4605c099a" exitCode=0 Mar 20 08:40:05 crc kubenswrapper[4903]: I0320 08:40:05.457370 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5rk9fc" event={"ID":"00f81c35-6107-4e09-982a-ad82eef8735b","Type":"ContainerDied","Data":"079915920f84dd58d50521e8557fb80a33a1d441b728db99d57cf9e4605c099a"} Mar 20 08:40:05 crc kubenswrapper[4903]: I0320 08:40:05.459584 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566600-hz2sg" event={"ID":"5bac3c78-5b74-45b3-99a8-72d6f8b75d6c","Type":"ContainerDied","Data":"5ed473ce0ead6339591a2b36c1535848d3e97f50d7213f79607f136665b499aa"} Mar 20 08:40:05 crc kubenswrapper[4903]: I0320 08:40:05.459620 4903 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5ed473ce0ead6339591a2b36c1535848d3e97f50d7213f79607f136665b499aa" Mar 20 08:40:05 crc kubenswrapper[4903]: I0320 08:40:05.459627 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566600-hz2sg" Mar 20 08:40:05 crc kubenswrapper[4903]: I0320 08:40:05.883459 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566594-drppl"] Mar 20 08:40:05 crc kubenswrapper[4903]: I0320 08:40:05.891349 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566594-drppl"] Mar 20 08:40:06 crc kubenswrapper[4903]: I0320 08:40:06.474246 4903 generic.go:334] "Generic (PLEG): container finished" podID="00f81c35-6107-4e09-982a-ad82eef8735b" containerID="318aa019ffed048fe70bd54a4a46ea480c18f03f29dc50d7b1d4f39fec4b3c69" exitCode=0 Mar 20 08:40:06 crc kubenswrapper[4903]: I0320 08:40:06.474369 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5rk9fc" event={"ID":"00f81c35-6107-4e09-982a-ad82eef8735b","Type":"ContainerDied","Data":"318aa019ffed048fe70bd54a4a46ea480c18f03f29dc50d7b1d4f39fec4b3c69"} Mar 20 08:40:07 crc kubenswrapper[4903]: I0320 08:40:07.506942 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2424a689-158b-42a5-805c-47dbf5dd3203" path="/var/lib/kubelet/pods/2424a689-158b-42a5-805c-47dbf5dd3203/volumes" Mar 20 08:40:07 crc kubenswrapper[4903]: I0320 08:40:07.878437 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5rk9fc" Mar 20 08:40:07 crc kubenswrapper[4903]: I0320 08:40:07.904778 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5xp4q\" (UniqueName: \"kubernetes.io/projected/00f81c35-6107-4e09-982a-ad82eef8735b-kube-api-access-5xp4q\") pod \"00f81c35-6107-4e09-982a-ad82eef8735b\" (UID: \"00f81c35-6107-4e09-982a-ad82eef8735b\") " Mar 20 08:40:07 crc kubenswrapper[4903]: I0320 08:40:07.904889 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/00f81c35-6107-4e09-982a-ad82eef8735b-util\") pod \"00f81c35-6107-4e09-982a-ad82eef8735b\" (UID: \"00f81c35-6107-4e09-982a-ad82eef8735b\") " Mar 20 08:40:07 crc kubenswrapper[4903]: I0320 08:40:07.904934 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/00f81c35-6107-4e09-982a-ad82eef8735b-bundle\") pod \"00f81c35-6107-4e09-982a-ad82eef8735b\" (UID: \"00f81c35-6107-4e09-982a-ad82eef8735b\") " Mar 20 08:40:07 crc kubenswrapper[4903]: I0320 08:40:07.908481 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/00f81c35-6107-4e09-982a-ad82eef8735b-bundle" (OuterVolumeSpecName: "bundle") pod "00f81c35-6107-4e09-982a-ad82eef8735b" (UID: "00f81c35-6107-4e09-982a-ad82eef8735b"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 08:40:07 crc kubenswrapper[4903]: I0320 08:40:07.918759 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/00f81c35-6107-4e09-982a-ad82eef8735b-kube-api-access-5xp4q" (OuterVolumeSpecName: "kube-api-access-5xp4q") pod "00f81c35-6107-4e09-982a-ad82eef8735b" (UID: "00f81c35-6107-4e09-982a-ad82eef8735b"). InnerVolumeSpecName "kube-api-access-5xp4q". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:40:07 crc kubenswrapper[4903]: I0320 08:40:07.944379 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/00f81c35-6107-4e09-982a-ad82eef8735b-util" (OuterVolumeSpecName: "util") pod "00f81c35-6107-4e09-982a-ad82eef8735b" (UID: "00f81c35-6107-4e09-982a-ad82eef8735b"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 08:40:08 crc kubenswrapper[4903]: I0320 08:40:08.006805 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5xp4q\" (UniqueName: \"kubernetes.io/projected/00f81c35-6107-4e09-982a-ad82eef8735b-kube-api-access-5xp4q\") on node \"crc\" DevicePath \"\"" Mar 20 08:40:08 crc kubenswrapper[4903]: I0320 08:40:08.007168 4903 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/00f81c35-6107-4e09-982a-ad82eef8735b-util\") on node \"crc\" DevicePath \"\"" Mar 20 08:40:08 crc kubenswrapper[4903]: I0320 08:40:08.007268 4903 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/00f81c35-6107-4e09-982a-ad82eef8735b-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 08:40:08 crc kubenswrapper[4903]: I0320 08:40:08.496996 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5rk9fc" event={"ID":"00f81c35-6107-4e09-982a-ad82eef8735b","Type":"ContainerDied","Data":"bd3a40241759651bff8671b5735ca35540e9419bb132ecfd1a93dec577a07080"} Mar 20 08:40:08 crc kubenswrapper[4903]: I0320 08:40:08.497098 4903 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bd3a40241759651bff8671b5735ca35540e9419bb132ecfd1a93dec577a07080" Mar 20 08:40:08 crc kubenswrapper[4903]: I0320 08:40:08.497231 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5rk9fc" Mar 20 08:40:12 crc kubenswrapper[4903]: I0320 08:40:12.743692 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-2hz7b"] Mar 20 08:40:12 crc kubenswrapper[4903]: E0320 08:40:12.744314 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00f81c35-6107-4e09-982a-ad82eef8735b" containerName="pull" Mar 20 08:40:12 crc kubenswrapper[4903]: I0320 08:40:12.744330 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="00f81c35-6107-4e09-982a-ad82eef8735b" containerName="pull" Mar 20 08:40:12 crc kubenswrapper[4903]: E0320 08:40:12.744342 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00f81c35-6107-4e09-982a-ad82eef8735b" containerName="util" Mar 20 08:40:12 crc kubenswrapper[4903]: I0320 08:40:12.744350 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="00f81c35-6107-4e09-982a-ad82eef8735b" containerName="util" Mar 20 08:40:12 crc kubenswrapper[4903]: E0320 08:40:12.744364 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00f81c35-6107-4e09-982a-ad82eef8735b" containerName="extract" Mar 20 08:40:12 crc kubenswrapper[4903]: I0320 08:40:12.744372 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="00f81c35-6107-4e09-982a-ad82eef8735b" containerName="extract" Mar 20 08:40:12 crc kubenswrapper[4903]: E0320 08:40:12.744396 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5bac3c78-5b74-45b3-99a8-72d6f8b75d6c" containerName="oc" Mar 20 08:40:12 crc kubenswrapper[4903]: I0320 08:40:12.744404 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="5bac3c78-5b74-45b3-99a8-72d6f8b75d6c" containerName="oc" Mar 20 08:40:12 crc kubenswrapper[4903]: I0320 08:40:12.744529 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="5bac3c78-5b74-45b3-99a8-72d6f8b75d6c" containerName="oc" Mar 20 
08:40:12 crc kubenswrapper[4903]: I0320 08:40:12.744553 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="00f81c35-6107-4e09-982a-ad82eef8735b" containerName="extract" Mar 20 08:40:12 crc kubenswrapper[4903]: I0320 08:40:12.745257 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-2hz7b" Mar 20 08:40:12 crc kubenswrapper[4903]: I0320 08:40:12.749334 4903 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager-operator"/"cert-manager-operator-controller-manager-dockercfg-g9blq" Mar 20 08:40:12 crc kubenswrapper[4903]: I0320 08:40:12.749977 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"kube-root-ca.crt" Mar 20 08:40:12 crc kubenswrapper[4903]: I0320 08:40:12.773203 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"openshift-service-ca.crt" Mar 20 08:40:12 crc kubenswrapper[4903]: I0320 08:40:12.779672 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-2hz7b"] Mar 20 08:40:12 crc kubenswrapper[4903]: I0320 08:40:12.781986 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ztbq7\" (UniqueName: \"kubernetes.io/projected/54c0cc0f-d40b-41ba-9fb6-75c446252e86-kube-api-access-ztbq7\") pod \"cert-manager-operator-controller-manager-66c8bdd694-2hz7b\" (UID: \"54c0cc0f-d40b-41ba-9fb6-75c446252e86\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-2hz7b" Mar 20 08:40:12 crc kubenswrapper[4903]: I0320 08:40:12.782069 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/54c0cc0f-d40b-41ba-9fb6-75c446252e86-tmp\") pod \"cert-manager-operator-controller-manager-66c8bdd694-2hz7b\" (UID: \"54c0cc0f-d40b-41ba-9fb6-75c446252e86\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-2hz7b" Mar 20 08:40:12 crc kubenswrapper[4903]: I0320 08:40:12.883987 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ztbq7\" (UniqueName: \"kubernetes.io/projected/54c0cc0f-d40b-41ba-9fb6-75c446252e86-kube-api-access-ztbq7\") pod \"cert-manager-operator-controller-manager-66c8bdd694-2hz7b\" (UID: \"54c0cc0f-d40b-41ba-9fb6-75c446252e86\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-2hz7b" Mar 20 08:40:12 crc kubenswrapper[4903]: I0320 08:40:12.884064 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/54c0cc0f-d40b-41ba-9fb6-75c446252e86-tmp\") pod \"cert-manager-operator-controller-manager-66c8bdd694-2hz7b\" (UID: \"54c0cc0f-d40b-41ba-9fb6-75c446252e86\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-2hz7b" Mar 20 08:40:12 crc kubenswrapper[4903]: I0320 08:40:12.884543 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/54c0cc0f-d40b-41ba-9fb6-75c446252e86-tmp\") pod \"cert-manager-operator-controller-manager-66c8bdd694-2hz7b\" (UID: \"54c0cc0f-d40b-41ba-9fb6-75c446252e86\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-2hz7b" Mar 20 08:40:12 crc kubenswrapper[4903]: I0320 08:40:12.910128 4903 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ztbq7\" (UniqueName: \"kubernetes.io/projected/54c0cc0f-d40b-41ba-9fb6-75c446252e86-kube-api-access-ztbq7\") pod \"cert-manager-operator-controller-manager-66c8bdd694-2hz7b\" (UID: \"54c0cc0f-d40b-41ba-9fb6-75c446252e86\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-2hz7b" Mar 20 08:40:13 crc kubenswrapper[4903]: I0320 08:40:13.085788 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-2hz7b" Mar 20 08:40:13 crc kubenswrapper[4903]: I0320 08:40:13.383281 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-2hz7b"] Mar 20 08:40:13 crc kubenswrapper[4903]: W0320 08:40:13.391354 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod54c0cc0f_d40b_41ba_9fb6_75c446252e86.slice/crio-cf2041ac82eff39fd2bd25ea7672736433361b3f3b6a49b9e82e948a667f1200 WatchSource:0}: Error finding container cf2041ac82eff39fd2bd25ea7672736433361b3f3b6a49b9e82e948a667f1200: Status 404 returned error can't find the container with id cf2041ac82eff39fd2bd25ea7672736433361b3f3b6a49b9e82e948a667f1200 Mar 20 08:40:13 crc kubenswrapper[4903]: I0320 08:40:13.525383 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-2hz7b" event={"ID":"54c0cc0f-d40b-41ba-9fb6-75c446252e86","Type":"ContainerStarted","Data":"cf2041ac82eff39fd2bd25ea7672736433361b3f3b6a49b9e82e948a667f1200"} Mar 20 08:40:17 crc kubenswrapper[4903]: I0320 08:40:17.552747 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-2hz7b" event={"ID":"54c0cc0f-d40b-41ba-9fb6-75c446252e86","Type":"ContainerStarted","Data":"8404e4637d55432ef8a8934037e3492eb00be01a474fcfaec75a5ddd40317d14"} Mar 20 08:40:17 crc kubenswrapper[4903]: I0320 08:40:17.573819 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-2hz7b" podStartSLOduration=2.269622304 podStartE2EDuration="5.573800322s" podCreationTimestamp="2026-03-20 08:40:12 +0000 UTC" firstStartedPulling="2026-03-20 08:40:13.394429249 +0000 UTC m=+1038.611329564" lastFinishedPulling="2026-03-20 08:40:16.698607257 +0000 UTC m=+1041.915507582" observedRunningTime="2026-03-20 08:40:17.571498666 +0000 UTC m=+1042.788398981" watchObservedRunningTime="2026-03-20 08:40:17.573800322 +0000 UTC m=+1042.790700637" Mar 20 08:40:23 crc kubenswrapper[4903]: I0320 08:40:23.519731 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-2bhwr"] Mar 20 08:40:23 crc kubenswrapper[4903]: I0320 08:40:23.524639 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-6888856db4-2bhwr" Mar 20 08:40:23 crc kubenswrapper[4903]: I0320 08:40:23.527376 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Mar 20 08:40:23 crc kubenswrapper[4903]: I0320 08:40:23.527813 4903 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-l2xp4" Mar 20 08:40:23 crc kubenswrapper[4903]: I0320 08:40:23.528273 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Mar 20 08:40:23 crc kubenswrapper[4903]: I0320 08:40:23.539161 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9dtbs\" (UniqueName: \"kubernetes.io/projected/ffbf5e1f-ab1e-47fe-9171-a63130e38dec-kube-api-access-9dtbs\") pod \"cert-manager-webhook-6888856db4-2bhwr\" (UID: \"ffbf5e1f-ab1e-47fe-9171-a63130e38dec\") " pod="cert-manager/cert-manager-webhook-6888856db4-2bhwr" Mar 20 08:40:23 crc kubenswrapper[4903]: I0320 08:40:23.539540 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ffbf5e1f-ab1e-47fe-9171-a63130e38dec-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-2bhwr\" (UID: \"ffbf5e1f-ab1e-47fe-9171-a63130e38dec\") " pod="cert-manager/cert-manager-webhook-6888856db4-2bhwr" Mar 20 08:40:23 crc kubenswrapper[4903]: I0320 08:40:23.587215 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-2bhwr"] Mar 20 08:40:23 crc kubenswrapper[4903]: I0320 08:40:23.640599 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9dtbs\" (UniqueName: \"kubernetes.io/projected/ffbf5e1f-ab1e-47fe-9171-a63130e38dec-kube-api-access-9dtbs\") pod \"cert-manager-webhook-6888856db4-2bhwr\" (UID: \"ffbf5e1f-ab1e-47fe-9171-a63130e38dec\") " pod="cert-manager/cert-manager-webhook-6888856db4-2bhwr" Mar 20 08:40:23 crc kubenswrapper[4903]: I0320 08:40:23.640713 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ffbf5e1f-ab1e-47fe-9171-a63130e38dec-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-2bhwr\" (UID: \"ffbf5e1f-ab1e-47fe-9171-a63130e38dec\") " pod="cert-manager/cert-manager-webhook-6888856db4-2bhwr" Mar 20 08:40:23 crc kubenswrapper[4903]: I0320 08:40:23.676778 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ffbf5e1f-ab1e-47fe-9171-a63130e38dec-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-2bhwr\" (UID: \"ffbf5e1f-ab1e-47fe-9171-a63130e38dec\") " pod="cert-manager/cert-manager-webhook-6888856db4-2bhwr" Mar 20 08:40:23 crc kubenswrapper[4903]: I0320 08:40:23.677015 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9dtbs\" (UniqueName: \"kubernetes.io/projected/ffbf5e1f-ab1e-47fe-9171-a63130e38dec-kube-api-access-9dtbs\") pod \"cert-manager-webhook-6888856db4-2bhwr\" (UID: \"ffbf5e1f-ab1e-47fe-9171-a63130e38dec\") " pod="cert-manager/cert-manager-webhook-6888856db4-2bhwr" Mar 20 08:40:23 crc kubenswrapper[4903]: I0320 08:40:23.848861 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-6888856db4-2bhwr" Mar 20 08:40:23 crc kubenswrapper[4903]: I0320 08:40:23.992606 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-c22w9"] Mar 20 08:40:23 crc kubenswrapper[4903]: I0320 08:40:23.994281 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-5545bd876-c22w9" Mar 20 08:40:24 crc kubenswrapper[4903]: I0320 08:40:23.998772 4903 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-4288f" Mar 20 08:40:24 crc kubenswrapper[4903]: I0320 08:40:24.003886 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-c22w9"] Mar 20 08:40:24 crc kubenswrapper[4903]: I0320 08:40:24.055175 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cvqt2\" (UniqueName: \"kubernetes.io/projected/f9bcdff8-0f0a-4797-80e0-c3a7893223dd-kube-api-access-cvqt2\") pod \"cert-manager-cainjector-5545bd876-c22w9\" (UID: \"f9bcdff8-0f0a-4797-80e0-c3a7893223dd\") " pod="cert-manager/cert-manager-cainjector-5545bd876-c22w9" Mar 20 08:40:24 crc kubenswrapper[4903]: I0320 08:40:24.055251 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f9bcdff8-0f0a-4797-80e0-c3a7893223dd-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-c22w9\" (UID: \"f9bcdff8-0f0a-4797-80e0-c3a7893223dd\") " pod="cert-manager/cert-manager-cainjector-5545bd876-c22w9" Mar 20 08:40:24 crc kubenswrapper[4903]: I0320 08:40:24.115875 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-2bhwr"] Mar 20 08:40:24 crc kubenswrapper[4903]: I0320 08:40:24.155712 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cvqt2\" (UniqueName: \"kubernetes.io/projected/f9bcdff8-0f0a-4797-80e0-c3a7893223dd-kube-api-access-cvqt2\") pod \"cert-manager-cainjector-5545bd876-c22w9\" (UID: \"f9bcdff8-0f0a-4797-80e0-c3a7893223dd\") " pod="cert-manager/cert-manager-cainjector-5545bd876-c22w9" Mar 20 08:40:24 crc kubenswrapper[4903]: I0320 08:40:24.155766 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f9bcdff8-0f0a-4797-80e0-c3a7893223dd-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-c22w9\" (UID: \"f9bcdff8-0f0a-4797-80e0-c3a7893223dd\") " pod="cert-manager/cert-manager-cainjector-5545bd876-c22w9" Mar 20 08:40:24 crc kubenswrapper[4903]: I0320 08:40:24.178655 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f9bcdff8-0f0a-4797-80e0-c3a7893223dd-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-c22w9\" (UID: \"f9bcdff8-0f0a-4797-80e0-c3a7893223dd\") " pod="cert-manager/cert-manager-cainjector-5545bd876-c22w9" Mar 20 08:40:24 crc kubenswrapper[4903]: I0320 08:40:24.178762 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cvqt2\" (UniqueName: \"kubernetes.io/projected/f9bcdff8-0f0a-4797-80e0-c3a7893223dd-kube-api-access-cvqt2\") pod \"cert-manager-cainjector-5545bd876-c22w9\" (UID: \"f9bcdff8-0f0a-4797-80e0-c3a7893223dd\") " pod="cert-manager/cert-manager-cainjector-5545bd876-c22w9" Mar 20 
08:40:24 crc kubenswrapper[4903]: I0320 08:40:24.320124 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-5545bd876-c22w9" Mar 20 08:40:24 crc kubenswrapper[4903]: I0320 08:40:24.549436 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-c22w9"] Mar 20 08:40:24 crc kubenswrapper[4903]: W0320 08:40:24.551434 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf9bcdff8_0f0a_4797_80e0_c3a7893223dd.slice/crio-be301aa9ca7f4ec2a614dd8f49f236ad435ba03494f34c9aa0397d43d0bf722f WatchSource:0}: Error finding container be301aa9ca7f4ec2a614dd8f49f236ad435ba03494f34c9aa0397d43d0bf722f: Status 404 returned error can't find the container with id be301aa9ca7f4ec2a614dd8f49f236ad435ba03494f34c9aa0397d43d0bf722f Mar 20 08:40:24 crc kubenswrapper[4903]: I0320 08:40:24.594857 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-6888856db4-2bhwr" event={"ID":"ffbf5e1f-ab1e-47fe-9171-a63130e38dec","Type":"ContainerStarted","Data":"96895955ddc7f248b9c689795bb6f577334a23ca02f3e8ef809b3e8b8e7e3d05"} Mar 20 08:40:24 crc kubenswrapper[4903]: I0320 08:40:24.596066 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-5545bd876-c22w9" event={"ID":"f9bcdff8-0f0a-4797-80e0-c3a7893223dd","Type":"ContainerStarted","Data":"be301aa9ca7f4ec2a614dd8f49f236ad435ba03494f34c9aa0397d43d0bf722f"} Mar 20 08:40:29 crc kubenswrapper[4903]: I0320 08:40:29.634148 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-6888856db4-2bhwr" event={"ID":"ffbf5e1f-ab1e-47fe-9171-a63130e38dec","Type":"ContainerStarted","Data":"058377dc89cad37e360158288b7cface1ec9334770362854482ace6195b57fab"} Mar 20 08:40:29 crc kubenswrapper[4903]: I0320 08:40:29.634669 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-6888856db4-2bhwr" Mar 20 08:40:29 crc kubenswrapper[4903]: I0320 08:40:29.635761 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-5545bd876-c22w9" event={"ID":"f9bcdff8-0f0a-4797-80e0-c3a7893223dd","Type":"ContainerStarted","Data":"3d54c1821f690386e62c5e80707f274e2a5074afc3fcc2654838536b8a82d75a"} Mar 20 08:40:29 crc kubenswrapper[4903]: I0320 08:40:29.649068 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-6888856db4-2bhwr" podStartSLOduration=1.668102349 podStartE2EDuration="6.649023307s" podCreationTimestamp="2026-03-20 08:40:23 +0000 UTC" firstStartedPulling="2026-03-20 08:40:24.132851095 +0000 UTC m=+1049.349751410" lastFinishedPulling="2026-03-20 08:40:29.113772053 +0000 UTC m=+1054.330672368" observedRunningTime="2026-03-20 08:40:29.647535251 +0000 UTC m=+1054.864435566" watchObservedRunningTime="2026-03-20 08:40:29.649023307 +0000 UTC m=+1054.865923612" Mar 20 08:40:29 crc kubenswrapper[4903]: I0320 08:40:29.667740 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-5545bd876-c22w9" podStartSLOduration=2.091511419 podStartE2EDuration="6.667523434s" podCreationTimestamp="2026-03-20 08:40:23 +0000 UTC" firstStartedPulling="2026-03-20 08:40:24.553540259 +0000 UTC m=+1049.770440574" lastFinishedPulling="2026-03-20 08:40:29.129552264 +0000 UTC m=+1054.346452589" 
observedRunningTime="2026-03-20 08:40:29.66363404 +0000 UTC m=+1054.880534355" watchObservedRunningTime="2026-03-20 08:40:29.667523434 +0000 UTC m=+1054.884423739" Mar 20 08:40:30 crc kubenswrapper[4903]: I0320 08:40:30.056941 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-545d4d4674-pnf4r"] Mar 20 08:40:30 crc kubenswrapper[4903]: I0320 08:40:30.058300 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-545d4d4674-pnf4r" Mar 20 08:40:30 crc kubenswrapper[4903]: I0320 08:40:30.064217 4903 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-ts5jn" Mar 20 08:40:30 crc kubenswrapper[4903]: I0320 08:40:30.079420 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-545d4d4674-pnf4r"] Mar 20 08:40:30 crc kubenswrapper[4903]: I0320 08:40:30.154688 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4fd09484-0087-4f85-a1e3-a67036f4cbca-bound-sa-token\") pod \"cert-manager-545d4d4674-pnf4r\" (UID: \"4fd09484-0087-4f85-a1e3-a67036f4cbca\") " pod="cert-manager/cert-manager-545d4d4674-pnf4r" Mar 20 08:40:30 crc kubenswrapper[4903]: I0320 08:40:30.154912 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pjzlq\" (UniqueName: \"kubernetes.io/projected/4fd09484-0087-4f85-a1e3-a67036f4cbca-kube-api-access-pjzlq\") pod \"cert-manager-545d4d4674-pnf4r\" (UID: \"4fd09484-0087-4f85-a1e3-a67036f4cbca\") " pod="cert-manager/cert-manager-545d4d4674-pnf4r" Mar 20 08:40:30 crc kubenswrapper[4903]: I0320 08:40:30.256953 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pjzlq\" (UniqueName: \"kubernetes.io/projected/4fd09484-0087-4f85-a1e3-a67036f4cbca-kube-api-access-pjzlq\") pod \"cert-manager-545d4d4674-pnf4r\" (UID: \"4fd09484-0087-4f85-a1e3-a67036f4cbca\") " pod="cert-manager/cert-manager-545d4d4674-pnf4r" Mar 20 08:40:30 crc kubenswrapper[4903]: I0320 08:40:30.257106 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4fd09484-0087-4f85-a1e3-a67036f4cbca-bound-sa-token\") pod \"cert-manager-545d4d4674-pnf4r\" (UID: \"4fd09484-0087-4f85-a1e3-a67036f4cbca\") " pod="cert-manager/cert-manager-545d4d4674-pnf4r" Mar 20 08:40:30 crc kubenswrapper[4903]: I0320 08:40:30.286272 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pjzlq\" (UniqueName: \"kubernetes.io/projected/4fd09484-0087-4f85-a1e3-a67036f4cbca-kube-api-access-pjzlq\") pod \"cert-manager-545d4d4674-pnf4r\" (UID: \"4fd09484-0087-4f85-a1e3-a67036f4cbca\") " pod="cert-manager/cert-manager-545d4d4674-pnf4r" Mar 20 08:40:30 crc kubenswrapper[4903]: I0320 08:40:30.286961 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4fd09484-0087-4f85-a1e3-a67036f4cbca-bound-sa-token\") pod \"cert-manager-545d4d4674-pnf4r\" (UID: \"4fd09484-0087-4f85-a1e3-a67036f4cbca\") " pod="cert-manager/cert-manager-545d4d4674-pnf4r" Mar 20 08:40:30 crc kubenswrapper[4903]: I0320 08:40:30.377085 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-545d4d4674-pnf4r" Mar 20 08:40:30 crc kubenswrapper[4903]: I0320 08:40:30.826223 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-545d4d4674-pnf4r"] Mar 20 08:40:31 crc kubenswrapper[4903]: I0320 08:40:31.652896 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-545d4d4674-pnf4r" event={"ID":"4fd09484-0087-4f85-a1e3-a67036f4cbca","Type":"ContainerStarted","Data":"50297042d69b71bf15d44b265b0df7bdbaa8617423a0c72a45f8479aa243979f"} Mar 20 08:40:31 crc kubenswrapper[4903]: I0320 08:40:31.653478 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-545d4d4674-pnf4r" event={"ID":"4fd09484-0087-4f85-a1e3-a67036f4cbca","Type":"ContainerStarted","Data":"c7eae68437cb0f6c2bfc76ef6a860212d3dd11a75537f33526c16355da7de3c0"} Mar 20 08:40:31 crc kubenswrapper[4903]: I0320 08:40:31.678939 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-545d4d4674-pnf4r" podStartSLOduration=1.678902927 podStartE2EDuration="1.678902927s" podCreationTimestamp="2026-03-20 08:40:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:40:31.67614225 +0000 UTC m=+1056.893042565" watchObservedRunningTime="2026-03-20 08:40:31.678902927 +0000 UTC m=+1056.895803282" Mar 20 08:40:38 crc kubenswrapper[4903]: I0320 08:40:38.852660 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-6888856db4-2bhwr" Mar 20 08:40:42 crc kubenswrapper[4903]: I0320 08:40:42.435855 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-mnrlj"] Mar 20 08:40:42 crc kubenswrapper[4903]: I0320 08:40:42.437788 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-mnrlj" Mar 20 08:40:42 crc kubenswrapper[4903]: I0320 08:40:42.447301 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-9xxzh" Mar 20 08:40:42 crc kubenswrapper[4903]: I0320 08:40:42.447902 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Mar 20 08:40:42 crc kubenswrapper[4903]: I0320 08:40:42.448446 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-mnrlj"] Mar 20 08:40:42 crc kubenswrapper[4903]: I0320 08:40:42.450899 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Mar 20 08:40:42 crc kubenswrapper[4903]: I0320 08:40:42.601536 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hvh4w\" (UniqueName: \"kubernetes.io/projected/59a4b8dd-25ff-4fb1-88d1-e3ff83af526f-kube-api-access-hvh4w\") pod \"openstack-operator-index-mnrlj\" (UID: \"59a4b8dd-25ff-4fb1-88d1-e3ff83af526f\") " pod="openstack-operators/openstack-operator-index-mnrlj" Mar 20 08:40:42 crc kubenswrapper[4903]: I0320 08:40:42.703173 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hvh4w\" (UniqueName: \"kubernetes.io/projected/59a4b8dd-25ff-4fb1-88d1-e3ff83af526f-kube-api-access-hvh4w\") pod \"openstack-operator-index-mnrlj\" (UID: \"59a4b8dd-25ff-4fb1-88d1-e3ff83af526f\") " pod="openstack-operators/openstack-operator-index-mnrlj" Mar 20 08:40:42 crc kubenswrapper[4903]: I0320 08:40:42.734300 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hvh4w\" (UniqueName: \"kubernetes.io/projected/59a4b8dd-25ff-4fb1-88d1-e3ff83af526f-kube-api-access-hvh4w\") pod \"openstack-operator-index-mnrlj\" (UID: \"59a4b8dd-25ff-4fb1-88d1-e3ff83af526f\") " pod="openstack-operators/openstack-operator-index-mnrlj" Mar 20 08:40:42 crc kubenswrapper[4903]: I0320 08:40:42.780567 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-mnrlj" Mar 20 08:40:43 crc kubenswrapper[4903]: I0320 08:40:43.115619 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-mnrlj"] Mar 20 08:40:43 crc kubenswrapper[4903]: I0320 08:40:43.750615 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-mnrlj" event={"ID":"59a4b8dd-25ff-4fb1-88d1-e3ff83af526f","Type":"ContainerStarted","Data":"3fb6a22a81fc4d91ad74a993d13297587f13399979ab5761eea9273d00b7d10f"} Mar 20 08:40:45 crc kubenswrapper[4903]: I0320 08:40:45.788059 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-mnrlj"] Mar 20 08:40:46 crc kubenswrapper[4903]: I0320 08:40:46.398698 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-gwb86"] Mar 20 08:40:46 crc kubenswrapper[4903]: I0320 08:40:46.400238 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-gwb86" Mar 20 08:40:46 crc kubenswrapper[4903]: I0320 08:40:46.421834 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-gwb86"] Mar 20 08:40:46 crc kubenswrapper[4903]: I0320 08:40:46.568477 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2zjqg\" (UniqueName: \"kubernetes.io/projected/22a4229d-fa0a-4948-ab60-3f2c5d1c72df-kube-api-access-2zjqg\") pod \"openstack-operator-index-gwb86\" (UID: \"22a4229d-fa0a-4948-ab60-3f2c5d1c72df\") " pod="openstack-operators/openstack-operator-index-gwb86" Mar 20 08:40:46 crc kubenswrapper[4903]: I0320 08:40:46.670357 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2zjqg\" (UniqueName: \"kubernetes.io/projected/22a4229d-fa0a-4948-ab60-3f2c5d1c72df-kube-api-access-2zjqg\") pod \"openstack-operator-index-gwb86\" (UID: \"22a4229d-fa0a-4948-ab60-3f2c5d1c72df\") " pod="openstack-operators/openstack-operator-index-gwb86" Mar 20 08:40:46 crc kubenswrapper[4903]: I0320 08:40:46.706858 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2zjqg\" (UniqueName: \"kubernetes.io/projected/22a4229d-fa0a-4948-ab60-3f2c5d1c72df-kube-api-access-2zjqg\") pod \"openstack-operator-index-gwb86\" (UID: \"22a4229d-fa0a-4948-ab60-3f2c5d1c72df\") " pod="openstack-operators/openstack-operator-index-gwb86" Mar 20 08:40:46 crc kubenswrapper[4903]: I0320 08:40:46.722537 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-gwb86" Mar 20 08:40:46 crc kubenswrapper[4903]: I0320 08:40:46.793214 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-mnrlj" event={"ID":"59a4b8dd-25ff-4fb1-88d1-e3ff83af526f","Type":"ContainerStarted","Data":"38992aa210bb78e740cbdd86107e66c09d6e72712edefd125d35ec3914ce34f3"} Mar 20 08:40:46 crc kubenswrapper[4903]: I0320 08:40:46.793417 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-mnrlj" podUID="59a4b8dd-25ff-4fb1-88d1-e3ff83af526f" containerName="registry-server" containerID="cri-o://38992aa210bb78e740cbdd86107e66c09d6e72712edefd125d35ec3914ce34f3" gracePeriod=2 Mar 20 08:40:46 crc kubenswrapper[4903]: I0320 08:40:46.832464 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-mnrlj" podStartSLOduration=2.247078204 podStartE2EDuration="4.832440988s" podCreationTimestamp="2026-03-20 08:40:42 +0000 UTC" firstStartedPulling="2026-03-20 08:40:43.133358 +0000 UTC m=+1068.350258335" lastFinishedPulling="2026-03-20 08:40:45.718720804 +0000 UTC m=+1070.935621119" observedRunningTime="2026-03-20 08:40:46.825559452 +0000 UTC m=+1072.042459807" watchObservedRunningTime="2026-03-20 08:40:46.832440988 +0000 UTC m=+1072.049341303" Mar 20 08:40:47 crc kubenswrapper[4903]: I0320 08:40:47.183706 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-gwb86"] Mar 20 08:40:47 crc kubenswrapper[4903]: W0320 08:40:47.185866 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod22a4229d_fa0a_4948_ab60_3f2c5d1c72df.slice/crio-3705624ac913f3436ec9e73adfbe87c510e7dd73c6ea64be3141ed38ff62dd07 WatchSource:0}: 
Error finding container 3705624ac913f3436ec9e73adfbe87c510e7dd73c6ea64be3141ed38ff62dd07: Status 404 returned error can't find the container with id 3705624ac913f3436ec9e73adfbe87c510e7dd73c6ea64be3141ed38ff62dd07 Mar 20 08:40:47 crc kubenswrapper[4903]: I0320 08:40:47.187984 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-mnrlj" Mar 20 08:40:47 crc kubenswrapper[4903]: I0320 08:40:47.295897 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hvh4w\" (UniqueName: \"kubernetes.io/projected/59a4b8dd-25ff-4fb1-88d1-e3ff83af526f-kube-api-access-hvh4w\") pod \"59a4b8dd-25ff-4fb1-88d1-e3ff83af526f\" (UID: \"59a4b8dd-25ff-4fb1-88d1-e3ff83af526f\") " Mar 20 08:40:47 crc kubenswrapper[4903]: I0320 08:40:47.301772 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/59a4b8dd-25ff-4fb1-88d1-e3ff83af526f-kube-api-access-hvh4w" (OuterVolumeSpecName: "kube-api-access-hvh4w") pod "59a4b8dd-25ff-4fb1-88d1-e3ff83af526f" (UID: "59a4b8dd-25ff-4fb1-88d1-e3ff83af526f"). InnerVolumeSpecName "kube-api-access-hvh4w". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:40:47 crc kubenswrapper[4903]: I0320 08:40:47.397221 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hvh4w\" (UniqueName: \"kubernetes.io/projected/59a4b8dd-25ff-4fb1-88d1-e3ff83af526f-kube-api-access-hvh4w\") on node \"crc\" DevicePath \"\"" Mar 20 08:40:47 crc kubenswrapper[4903]: I0320 08:40:47.804212 4903 generic.go:334] "Generic (PLEG): container finished" podID="59a4b8dd-25ff-4fb1-88d1-e3ff83af526f" containerID="38992aa210bb78e740cbdd86107e66c09d6e72712edefd125d35ec3914ce34f3" exitCode=0 Mar 20 08:40:47 crc kubenswrapper[4903]: I0320 08:40:47.804367 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-mnrlj" Mar 20 08:40:47 crc kubenswrapper[4903]: I0320 08:40:47.804431 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-mnrlj" event={"ID":"59a4b8dd-25ff-4fb1-88d1-e3ff83af526f","Type":"ContainerDied","Data":"38992aa210bb78e740cbdd86107e66c09d6e72712edefd125d35ec3914ce34f3"} Mar 20 08:40:47 crc kubenswrapper[4903]: I0320 08:40:47.804546 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-mnrlj" event={"ID":"59a4b8dd-25ff-4fb1-88d1-e3ff83af526f","Type":"ContainerDied","Data":"3fb6a22a81fc4d91ad74a993d13297587f13399979ab5761eea9273d00b7d10f"} Mar 20 08:40:47 crc kubenswrapper[4903]: I0320 08:40:47.804589 4903 scope.go:117] "RemoveContainer" containerID="38992aa210bb78e740cbdd86107e66c09d6e72712edefd125d35ec3914ce34f3" Mar 20 08:40:47 crc kubenswrapper[4903]: I0320 08:40:47.806200 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-gwb86" event={"ID":"22a4229d-fa0a-4948-ab60-3f2c5d1c72df","Type":"ContainerStarted","Data":"67741ac23993fdfe7c37e2378ee8e65940efac7cb6f46626cb0ff53e71aecfa1"} Mar 20 08:40:47 crc kubenswrapper[4903]: I0320 08:40:47.806256 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-gwb86" event={"ID":"22a4229d-fa0a-4948-ab60-3f2c5d1c72df","Type":"ContainerStarted","Data":"3705624ac913f3436ec9e73adfbe87c510e7dd73c6ea64be3141ed38ff62dd07"} Mar 20 08:40:47 crc kubenswrapper[4903]: I0320 08:40:47.837537 4903 scope.go:117] "RemoveContainer" containerID="38992aa210bb78e740cbdd86107e66c09d6e72712edefd125d35ec3914ce34f3" Mar 20 08:40:47 crc kubenswrapper[4903]: E0320 08:40:47.838170 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"38992aa210bb78e740cbdd86107e66c09d6e72712edefd125d35ec3914ce34f3\": container with ID starting with 38992aa210bb78e740cbdd86107e66c09d6e72712edefd125d35ec3914ce34f3 not found: ID does not exist" containerID="38992aa210bb78e740cbdd86107e66c09d6e72712edefd125d35ec3914ce34f3" Mar 20 08:40:47 crc kubenswrapper[4903]: I0320 08:40:47.838200 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"38992aa210bb78e740cbdd86107e66c09d6e72712edefd125d35ec3914ce34f3"} err="failed to get container status \"38992aa210bb78e740cbdd86107e66c09d6e72712edefd125d35ec3914ce34f3\": rpc error: code = NotFound desc = could not find container \"38992aa210bb78e740cbdd86107e66c09d6e72712edefd125d35ec3914ce34f3\": container with ID starting with 38992aa210bb78e740cbdd86107e66c09d6e72712edefd125d35ec3914ce34f3 not found: ID does not exist" Mar 20 08:40:47 crc kubenswrapper[4903]: I0320 08:40:47.853724 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-gwb86" podStartSLOduration=1.798452089 podStartE2EDuration="1.853691916s" podCreationTimestamp="2026-03-20 08:40:46 +0000 UTC" firstStartedPulling="2026-03-20 08:40:47.197464605 +0000 UTC m=+1072.414364930" lastFinishedPulling="2026-03-20 08:40:47.252704432 +0000 UTC m=+1072.469604757" observedRunningTime="2026-03-20 08:40:47.8323686 +0000 UTC m=+1073.049268985" watchObservedRunningTime="2026-03-20 08:40:47.853691916 +0000 UTC m=+1073.070592271" Mar 20 08:40:47 crc kubenswrapper[4903]: I0320 08:40:47.864272 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack-operators/openstack-operator-index-mnrlj"] Mar 20 08:40:47 crc kubenswrapper[4903]: I0320 08:40:47.871899 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-mnrlj"] Mar 20 08:40:49 crc kubenswrapper[4903]: I0320 08:40:49.505388 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="59a4b8dd-25ff-4fb1-88d1-e3ff83af526f" path="/var/lib/kubelet/pods/59a4b8dd-25ff-4fb1-88d1-e3ff83af526f/volumes" Mar 20 08:40:56 crc kubenswrapper[4903]: I0320 08:40:56.722863 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-gwb86" Mar 20 08:40:56 crc kubenswrapper[4903]: I0320 08:40:56.723527 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-gwb86" Mar 20 08:40:56 crc kubenswrapper[4903]: I0320 08:40:56.771749 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-gwb86" Mar 20 08:40:56 crc kubenswrapper[4903]: I0320 08:40:56.859679 4903 scope.go:117] "RemoveContainer" containerID="11a9fb4c401a7bb0b4717c57cc5525846cd3b28f869642be3c74e0b6a86eaf93" Mar 20 08:40:56 crc kubenswrapper[4903]: I0320 08:40:56.937741 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-gwb86" Mar 20 08:41:03 crc kubenswrapper[4903]: I0320 08:41:03.400538 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/bba83ce9f42a3265d43191818486e54b74f63b211e9887ec74e7259e5c9tncl"] Mar 20 08:41:03 crc kubenswrapper[4903]: E0320 08:41:03.401698 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59a4b8dd-25ff-4fb1-88d1-e3ff83af526f" containerName="registry-server" Mar 20 08:41:03 crc kubenswrapper[4903]: I0320 08:41:03.401726 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="59a4b8dd-25ff-4fb1-88d1-e3ff83af526f" containerName="registry-server" Mar 20 08:41:03 crc kubenswrapper[4903]: I0320 08:41:03.401959 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="59a4b8dd-25ff-4fb1-88d1-e3ff83af526f" containerName="registry-server" Mar 20 08:41:03 crc kubenswrapper[4903]: I0320 08:41:03.403324 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/bba83ce9f42a3265d43191818486e54b74f63b211e9887ec74e7259e5c9tncl" Mar 20 08:41:03 crc kubenswrapper[4903]: I0320 08:41:03.413227 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-99bmd" Mar 20 08:41:03 crc kubenswrapper[4903]: I0320 08:41:03.417660 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/bba83ce9f42a3265d43191818486e54b74f63b211e9887ec74e7259e5c9tncl"] Mar 20 08:41:03 crc kubenswrapper[4903]: I0320 08:41:03.568922 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/34f42916-3cdf-413e-92df-7066282621a4-bundle\") pod \"bba83ce9f42a3265d43191818486e54b74f63b211e9887ec74e7259e5c9tncl\" (UID: \"34f42916-3cdf-413e-92df-7066282621a4\") " pod="openstack-operators/bba83ce9f42a3265d43191818486e54b74f63b211e9887ec74e7259e5c9tncl" Mar 20 08:41:03 crc kubenswrapper[4903]: I0320 08:41:03.569041 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c7b7p\" (UniqueName: \"kubernetes.io/projected/34f42916-3cdf-413e-92df-7066282621a4-kube-api-access-c7b7p\") pod \"bba83ce9f42a3265d43191818486e54b74f63b211e9887ec74e7259e5c9tncl\" (UID: \"34f42916-3cdf-413e-92df-7066282621a4\") " pod="openstack-operators/bba83ce9f42a3265d43191818486e54b74f63b211e9887ec74e7259e5c9tncl" Mar 20 08:41:03 crc kubenswrapper[4903]: I0320 08:41:03.569104 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/34f42916-3cdf-413e-92df-7066282621a4-util\") pod \"bba83ce9f42a3265d43191818486e54b74f63b211e9887ec74e7259e5c9tncl\" (UID: \"34f42916-3cdf-413e-92df-7066282621a4\") " pod="openstack-operators/bba83ce9f42a3265d43191818486e54b74f63b211e9887ec74e7259e5c9tncl" Mar 20 08:41:03 crc kubenswrapper[4903]: I0320 08:41:03.670409 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/34f42916-3cdf-413e-92df-7066282621a4-util\") pod \"bba83ce9f42a3265d43191818486e54b74f63b211e9887ec74e7259e5c9tncl\" (UID: \"34f42916-3cdf-413e-92df-7066282621a4\") " pod="openstack-operators/bba83ce9f42a3265d43191818486e54b74f63b211e9887ec74e7259e5c9tncl" Mar 20 08:41:03 crc kubenswrapper[4903]: I0320 08:41:03.670569 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/34f42916-3cdf-413e-92df-7066282621a4-bundle\") pod \"bba83ce9f42a3265d43191818486e54b74f63b211e9887ec74e7259e5c9tncl\" (UID: \"34f42916-3cdf-413e-92df-7066282621a4\") " pod="openstack-operators/bba83ce9f42a3265d43191818486e54b74f63b211e9887ec74e7259e5c9tncl" Mar 20 08:41:03 crc kubenswrapper[4903]: I0320 08:41:03.670636 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c7b7p\" (UniqueName: \"kubernetes.io/projected/34f42916-3cdf-413e-92df-7066282621a4-kube-api-access-c7b7p\") pod \"bba83ce9f42a3265d43191818486e54b74f63b211e9887ec74e7259e5c9tncl\" (UID: \"34f42916-3cdf-413e-92df-7066282621a4\") " pod="openstack-operators/bba83ce9f42a3265d43191818486e54b74f63b211e9887ec74e7259e5c9tncl" Mar 20 08:41:03 crc kubenswrapper[4903]: I0320 08:41:03.671446 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/34f42916-3cdf-413e-92df-7066282621a4-util\") pod \"bba83ce9f42a3265d43191818486e54b74f63b211e9887ec74e7259e5c9tncl\" (UID: \"34f42916-3cdf-413e-92df-7066282621a4\") " pod="openstack-operators/bba83ce9f42a3265d43191818486e54b74f63b211e9887ec74e7259e5c9tncl" Mar 20 08:41:03 crc kubenswrapper[4903]: I0320 08:41:03.671607 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/34f42916-3cdf-413e-92df-7066282621a4-bundle\") pod \"bba83ce9f42a3265d43191818486e54b74f63b211e9887ec74e7259e5c9tncl\" (UID: \"34f42916-3cdf-413e-92df-7066282621a4\") " pod="openstack-operators/bba83ce9f42a3265d43191818486e54b74f63b211e9887ec74e7259e5c9tncl" Mar 20 08:41:03 crc kubenswrapper[4903]: I0320 08:41:03.705891 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c7b7p\" (UniqueName: \"kubernetes.io/projected/34f42916-3cdf-413e-92df-7066282621a4-kube-api-access-c7b7p\") pod \"bba83ce9f42a3265d43191818486e54b74f63b211e9887ec74e7259e5c9tncl\" (UID: \"34f42916-3cdf-413e-92df-7066282621a4\") " pod="openstack-operators/bba83ce9f42a3265d43191818486e54b74f63b211e9887ec74e7259e5c9tncl" Mar 20 08:41:03 crc kubenswrapper[4903]: I0320 08:41:03.728701 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/bba83ce9f42a3265d43191818486e54b74f63b211e9887ec74e7259e5c9tncl" Mar 20 08:41:04 crc kubenswrapper[4903]: I0320 08:41:04.214317 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/bba83ce9f42a3265d43191818486e54b74f63b211e9887ec74e7259e5c9tncl"] Mar 20 08:41:04 crc kubenswrapper[4903]: I0320 08:41:04.979438 4903 generic.go:334] "Generic (PLEG): container finished" podID="34f42916-3cdf-413e-92df-7066282621a4" containerID="fbb3686ac6e1b48c5b30bfed03899f07da704b705e957074bde3e8ca09d00cbb" exitCode=0 Mar 20 08:41:04 crc kubenswrapper[4903]: I0320 08:41:04.979548 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/bba83ce9f42a3265d43191818486e54b74f63b211e9887ec74e7259e5c9tncl" event={"ID":"34f42916-3cdf-413e-92df-7066282621a4","Type":"ContainerDied","Data":"fbb3686ac6e1b48c5b30bfed03899f07da704b705e957074bde3e8ca09d00cbb"} Mar 20 08:41:04 crc kubenswrapper[4903]: I0320 08:41:04.980093 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/bba83ce9f42a3265d43191818486e54b74f63b211e9887ec74e7259e5c9tncl" event={"ID":"34f42916-3cdf-413e-92df-7066282621a4","Type":"ContainerStarted","Data":"bb121f194f058fcbd4ffb90f3180c27ab6ac0cd19b911f980e36c2de6390ffa0"} Mar 20 08:41:05 crc kubenswrapper[4903]: I0320 08:41:05.994921 4903 generic.go:334] "Generic (PLEG): container finished" podID="34f42916-3cdf-413e-92df-7066282621a4" containerID="88d15ee908acdfce8a232aadba14b7668ef1b3d4254b136286e4b6636d022d47" exitCode=0 Mar 20 08:41:05 crc kubenswrapper[4903]: I0320 08:41:05.996199 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/bba83ce9f42a3265d43191818486e54b74f63b211e9887ec74e7259e5c9tncl" event={"ID":"34f42916-3cdf-413e-92df-7066282621a4","Type":"ContainerDied","Data":"88d15ee908acdfce8a232aadba14b7668ef1b3d4254b136286e4b6636d022d47"} Mar 20 08:41:07 crc kubenswrapper[4903]: I0320 08:41:07.008392 4903 generic.go:334] "Generic (PLEG): container finished" podID="34f42916-3cdf-413e-92df-7066282621a4" containerID="17adb877edfd01a3d865e46ae6fe6eeabe2f965519309e7b7f67d99d09cb05f4" exitCode=0 Mar 20 08:41:07 crc kubenswrapper[4903]: I0320 08:41:07.008531 4903 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/bba83ce9f42a3265d43191818486e54b74f63b211e9887ec74e7259e5c9tncl" event={"ID":"34f42916-3cdf-413e-92df-7066282621a4","Type":"ContainerDied","Data":"17adb877edfd01a3d865e46ae6fe6eeabe2f965519309e7b7f67d99d09cb05f4"} Mar 20 08:41:08 crc kubenswrapper[4903]: I0320 08:41:08.338225 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/bba83ce9f42a3265d43191818486e54b74f63b211e9887ec74e7259e5c9tncl" Mar 20 08:41:08 crc kubenswrapper[4903]: I0320 08:41:08.370355 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c7b7p\" (UniqueName: \"kubernetes.io/projected/34f42916-3cdf-413e-92df-7066282621a4-kube-api-access-c7b7p\") pod \"34f42916-3cdf-413e-92df-7066282621a4\" (UID: \"34f42916-3cdf-413e-92df-7066282621a4\") " Mar 20 08:41:08 crc kubenswrapper[4903]: I0320 08:41:08.370416 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/34f42916-3cdf-413e-92df-7066282621a4-bundle\") pod \"34f42916-3cdf-413e-92df-7066282621a4\" (UID: \"34f42916-3cdf-413e-92df-7066282621a4\") " Mar 20 08:41:08 crc kubenswrapper[4903]: I0320 08:41:08.370532 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/34f42916-3cdf-413e-92df-7066282621a4-util\") pod \"34f42916-3cdf-413e-92df-7066282621a4\" (UID: \"34f42916-3cdf-413e-92df-7066282621a4\") " Mar 20 08:41:08 crc kubenswrapper[4903]: I0320 08:41:08.371638 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/34f42916-3cdf-413e-92df-7066282621a4-bundle" (OuterVolumeSpecName: "bundle") pod "34f42916-3cdf-413e-92df-7066282621a4" (UID: "34f42916-3cdf-413e-92df-7066282621a4"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 08:41:08 crc kubenswrapper[4903]: I0320 08:41:08.377192 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34f42916-3cdf-413e-92df-7066282621a4-kube-api-access-c7b7p" (OuterVolumeSpecName: "kube-api-access-c7b7p") pod "34f42916-3cdf-413e-92df-7066282621a4" (UID: "34f42916-3cdf-413e-92df-7066282621a4"). InnerVolumeSpecName "kube-api-access-c7b7p". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:41:08 crc kubenswrapper[4903]: I0320 08:41:08.391208 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/34f42916-3cdf-413e-92df-7066282621a4-util" (OuterVolumeSpecName: "util") pod "34f42916-3cdf-413e-92df-7066282621a4" (UID: "34f42916-3cdf-413e-92df-7066282621a4"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 08:41:08 crc kubenswrapper[4903]: I0320 08:41:08.472795 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c7b7p\" (UniqueName: \"kubernetes.io/projected/34f42916-3cdf-413e-92df-7066282621a4-kube-api-access-c7b7p\") on node \"crc\" DevicePath \"\"" Mar 20 08:41:08 crc kubenswrapper[4903]: I0320 08:41:08.472854 4903 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/34f42916-3cdf-413e-92df-7066282621a4-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 08:41:08 crc kubenswrapper[4903]: I0320 08:41:08.472872 4903 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/34f42916-3cdf-413e-92df-7066282621a4-util\") on node \"crc\" DevicePath \"\"" Mar 20 08:41:09 crc kubenswrapper[4903]: I0320 08:41:09.032576 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/bba83ce9f42a3265d43191818486e54b74f63b211e9887ec74e7259e5c9tncl" event={"ID":"34f42916-3cdf-413e-92df-7066282621a4","Type":"ContainerDied","Data":"bb121f194f058fcbd4ffb90f3180c27ab6ac0cd19b911f980e36c2de6390ffa0"} Mar 20 08:41:09 crc kubenswrapper[4903]: I0320 08:41:09.033156 4903 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bb121f194f058fcbd4ffb90f3180c27ab6ac0cd19b911f980e36c2de6390ffa0" Mar 20 08:41:09 crc kubenswrapper[4903]: I0320 08:41:09.032681 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/bba83ce9f42a3265d43191818486e54b74f63b211e9887ec74e7259e5c9tncl" Mar 20 08:41:15 crc kubenswrapper[4903]: I0320 08:41:15.499601 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-init-7f6c6d49c4-j9tg6"] Mar 20 08:41:15 crc kubenswrapper[4903]: E0320 08:41:15.500451 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34f42916-3cdf-413e-92df-7066282621a4" containerName="util" Mar 20 08:41:15 crc kubenswrapper[4903]: I0320 08:41:15.500466 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="34f42916-3cdf-413e-92df-7066282621a4" containerName="util" Mar 20 08:41:15 crc kubenswrapper[4903]: E0320 08:41:15.500475 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34f42916-3cdf-413e-92df-7066282621a4" containerName="extract" Mar 20 08:41:15 crc kubenswrapper[4903]: I0320 08:41:15.500481 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="34f42916-3cdf-413e-92df-7066282621a4" containerName="extract" Mar 20 08:41:15 crc kubenswrapper[4903]: E0320 08:41:15.500491 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34f42916-3cdf-413e-92df-7066282621a4" containerName="pull" Mar 20 08:41:15 crc kubenswrapper[4903]: I0320 08:41:15.500497 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="34f42916-3cdf-413e-92df-7066282621a4" containerName="pull" Mar 20 08:41:15 crc kubenswrapper[4903]: I0320 08:41:15.500618 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="34f42916-3cdf-413e-92df-7066282621a4" containerName="extract" Mar 20 08:41:15 crc kubenswrapper[4903]: I0320 08:41:15.501099 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-7f6c6d49c4-j9tg6" Mar 20 08:41:15 crc kubenswrapper[4903]: I0320 08:41:15.507801 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-init-dockercfg-4nhd7" Mar 20 08:41:15 crc kubenswrapper[4903]: I0320 08:41:15.535336 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-7f6c6d49c4-j9tg6"] Mar 20 08:41:15 crc kubenswrapper[4903]: I0320 08:41:15.581959 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bj8lg\" (UniqueName: \"kubernetes.io/projected/15cbed57-b491-4c0b-94d3-cfb6a3c7a624-kube-api-access-bj8lg\") pod \"openstack-operator-controller-init-7f6c6d49c4-j9tg6\" (UID: \"15cbed57-b491-4c0b-94d3-cfb6a3c7a624\") " pod="openstack-operators/openstack-operator-controller-init-7f6c6d49c4-j9tg6" Mar 20 08:41:15 crc kubenswrapper[4903]: I0320 08:41:15.683527 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bj8lg\" (UniqueName: \"kubernetes.io/projected/15cbed57-b491-4c0b-94d3-cfb6a3c7a624-kube-api-access-bj8lg\") pod \"openstack-operator-controller-init-7f6c6d49c4-j9tg6\" (UID: \"15cbed57-b491-4c0b-94d3-cfb6a3c7a624\") " pod="openstack-operators/openstack-operator-controller-init-7f6c6d49c4-j9tg6" Mar 20 08:41:15 crc kubenswrapper[4903]: I0320 08:41:15.706158 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bj8lg\" (UniqueName: \"kubernetes.io/projected/15cbed57-b491-4c0b-94d3-cfb6a3c7a624-kube-api-access-bj8lg\") pod \"openstack-operator-controller-init-7f6c6d49c4-j9tg6\" (UID: \"15cbed57-b491-4c0b-94d3-cfb6a3c7a624\") " pod="openstack-operators/openstack-operator-controller-init-7f6c6d49c4-j9tg6" Mar 20 08:41:15 crc kubenswrapper[4903]: I0320 08:41:15.823233 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-7f6c6d49c4-j9tg6" Mar 20 08:41:16 crc kubenswrapper[4903]: I0320 08:41:16.202395 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-7f6c6d49c4-j9tg6"] Mar 20 08:41:17 crc kubenswrapper[4903]: I0320 08:41:17.232492 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-7f6c6d49c4-j9tg6" event={"ID":"15cbed57-b491-4c0b-94d3-cfb6a3c7a624","Type":"ContainerStarted","Data":"cdc5c88ac8b8d5e5919868edf2a726cb406db01249deb79434dbccbbffa52f86"} Mar 20 08:41:20 crc kubenswrapper[4903]: I0320 08:41:20.833305 4903 patch_prober.go:28] interesting pod/machine-config-daemon-2ndsj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 08:41:20 crc kubenswrapper[4903]: I0320 08:41:20.833838 4903 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2ndsj" podUID="0e67af70-4211-4077-8f2b-0a00b8069e5a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 08:41:22 crc kubenswrapper[4903]: I0320 08:41:22.273240 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-7f6c6d49c4-j9tg6" event={"ID":"15cbed57-b491-4c0b-94d3-cfb6a3c7a624","Type":"ContainerStarted","Data":"92d46d0bbe10acfb90676ad40700d30f1cf555bf4cbbea374f13d03e34479652"} Mar 20 08:41:22 crc kubenswrapper[4903]: I0320 08:41:22.273785 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-init-7f6c6d49c4-j9tg6" Mar 20 08:41:22 crc kubenswrapper[4903]: I0320 08:41:22.332004 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-init-7f6c6d49c4-j9tg6" podStartSLOduration=1.99195186 podStartE2EDuration="7.331977399s" podCreationTimestamp="2026-03-20 08:41:15 +0000 UTC" firstStartedPulling="2026-03-20 08:41:16.204605747 +0000 UTC m=+1101.421506062" lastFinishedPulling="2026-03-20 08:41:21.544631246 +0000 UTC m=+1106.761531601" observedRunningTime="2026-03-20 08:41:22.328148063 +0000 UTC m=+1107.545048388" watchObservedRunningTime="2026-03-20 08:41:22.331977399 +0000 UTC m=+1107.548877714" Mar 20 08:41:35 crc kubenswrapper[4903]: I0320 08:41:35.840666 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-init-7f6c6d49c4-j9tg6" Mar 20 08:41:50 crc kubenswrapper[4903]: I0320 08:41:50.833778 4903 patch_prober.go:28] interesting pod/machine-config-daemon-2ndsj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 08:41:50 crc kubenswrapper[4903]: I0320 08:41:50.834645 4903 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2ndsj" podUID="0e67af70-4211-4077-8f2b-0a00b8069e5a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" Mar 20 08:42:00 crc kubenswrapper[4903]: I0320 08:42:00.158690 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566602-66tvt"] Mar 20 08:42:00 crc kubenswrapper[4903]: I0320 08:42:00.159849 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566602-66tvt" Mar 20 08:42:00 crc kubenswrapper[4903]: I0320 08:42:00.167889 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-smc2q" Mar 20 08:42:00 crc kubenswrapper[4903]: I0320 08:42:00.167889 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 08:42:00 crc kubenswrapper[4903]: I0320 08:42:00.183320 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 08:42:00 crc kubenswrapper[4903]: I0320 08:42:00.188564 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566602-66tvt"] Mar 20 08:42:00 crc kubenswrapper[4903]: I0320 08:42:00.204487 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n8svz\" (UniqueName: \"kubernetes.io/projected/39ab3243-abaa-4705-88ec-4b998774e880-kube-api-access-n8svz\") pod \"auto-csr-approver-29566602-66tvt\" (UID: \"39ab3243-abaa-4705-88ec-4b998774e880\") " pod="openshift-infra/auto-csr-approver-29566602-66tvt" Mar 20 08:42:00 crc kubenswrapper[4903]: I0320 08:42:00.305374 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n8svz\" (UniqueName: \"kubernetes.io/projected/39ab3243-abaa-4705-88ec-4b998774e880-kube-api-access-n8svz\") pod \"auto-csr-approver-29566602-66tvt\" (UID: \"39ab3243-abaa-4705-88ec-4b998774e880\") " pod="openshift-infra/auto-csr-approver-29566602-66tvt" Mar 20 08:42:00 crc kubenswrapper[4903]: I0320 08:42:00.336681 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n8svz\" (UniqueName: \"kubernetes.io/projected/39ab3243-abaa-4705-88ec-4b998774e880-kube-api-access-n8svz\") pod \"auto-csr-approver-29566602-66tvt\" (UID: \"39ab3243-abaa-4705-88ec-4b998774e880\") " pod="openshift-infra/auto-csr-approver-29566602-66tvt" Mar 20 08:42:00 crc kubenswrapper[4903]: I0320 08:42:00.477707 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566602-66tvt" Mar 20 08:42:00 crc kubenswrapper[4903]: I0320 08:42:00.790792 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566602-66tvt"] Mar 20 08:42:01 crc kubenswrapper[4903]: I0320 08:42:01.588567 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566602-66tvt" event={"ID":"39ab3243-abaa-4705-88ec-4b998774e880","Type":"ContainerStarted","Data":"042ba72f04718b251b2bf6adfd38afff625b71d6a076c61a92743931d36d7978"} Mar 20 08:42:02 crc kubenswrapper[4903]: I0320 08:42:02.597829 4903 generic.go:334] "Generic (PLEG): container finished" podID="39ab3243-abaa-4705-88ec-4b998774e880" containerID="d9556cda66aa8de78ade2a6e0fddee8f9a0c9432b0aad91daf1068df0400aaf0" exitCode=0 Mar 20 08:42:02 crc kubenswrapper[4903]: I0320 08:42:02.597918 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566602-66tvt" event={"ID":"39ab3243-abaa-4705-88ec-4b998774e880","Type":"ContainerDied","Data":"d9556cda66aa8de78ade2a6e0fddee8f9a0c9432b0aad91daf1068df0400aaf0"} Mar 20 08:42:03 crc kubenswrapper[4903]: I0320 08:42:03.858956 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566602-66tvt" Mar 20 08:42:03 crc kubenswrapper[4903]: I0320 08:42:03.863785 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n8svz\" (UniqueName: \"kubernetes.io/projected/39ab3243-abaa-4705-88ec-4b998774e880-kube-api-access-n8svz\") pod \"39ab3243-abaa-4705-88ec-4b998774e880\" (UID: \"39ab3243-abaa-4705-88ec-4b998774e880\") " Mar 20 08:42:03 crc kubenswrapper[4903]: I0320 08:42:03.869863 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/39ab3243-abaa-4705-88ec-4b998774e880-kube-api-access-n8svz" (OuterVolumeSpecName: "kube-api-access-n8svz") pod "39ab3243-abaa-4705-88ec-4b998774e880" (UID: "39ab3243-abaa-4705-88ec-4b998774e880"). InnerVolumeSpecName "kube-api-access-n8svz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:42:03 crc kubenswrapper[4903]: I0320 08:42:03.965176 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n8svz\" (UniqueName: \"kubernetes.io/projected/39ab3243-abaa-4705-88ec-4b998774e880-kube-api-access-n8svz\") on node \"crc\" DevicePath \"\"" Mar 20 08:42:04 crc kubenswrapper[4903]: I0320 08:42:04.614075 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566602-66tvt" event={"ID":"39ab3243-abaa-4705-88ec-4b998774e880","Type":"ContainerDied","Data":"042ba72f04718b251b2bf6adfd38afff625b71d6a076c61a92743931d36d7978"} Mar 20 08:42:04 crc kubenswrapper[4903]: I0320 08:42:04.614124 4903 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="042ba72f04718b251b2bf6adfd38afff625b71d6a076c61a92743931d36d7978" Mar 20 08:42:04 crc kubenswrapper[4903]: I0320 08:42:04.614136 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566602-66tvt" Mar 20 08:42:04 crc kubenswrapper[4903]: I0320 08:42:04.919378 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566596-wdwc5"] Mar 20 08:42:04 crc kubenswrapper[4903]: I0320 08:42:04.930178 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566596-wdwc5"] Mar 20 08:42:05 crc kubenswrapper[4903]: I0320 08:42:05.464094 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-59bc569d95-kzflj"] Mar 20 08:42:05 crc kubenswrapper[4903]: E0320 08:42:05.464573 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39ab3243-abaa-4705-88ec-4b998774e880" containerName="oc" Mar 20 08:42:05 crc kubenswrapper[4903]: I0320 08:42:05.464585 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="39ab3243-abaa-4705-88ec-4b998774e880" containerName="oc" Mar 20 08:42:05 crc kubenswrapper[4903]: I0320 08:42:05.464723 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="39ab3243-abaa-4705-88ec-4b998774e880" containerName="oc" Mar 20 08:42:05 crc kubenswrapper[4903]: I0320 08:42:05.465139 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-kzflj" Mar 20 08:42:05 crc kubenswrapper[4903]: I0320 08:42:05.467047 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-qnxzg" Mar 20 08:42:05 crc kubenswrapper[4903]: I0320 08:42:05.473593 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8d58dc466-9pxrs"] Mar 20 08:42:05 crc kubenswrapper[4903]: I0320 08:42:05.474665 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-9pxrs" Mar 20 08:42:05 crc kubenswrapper[4903]: I0320 08:42:05.477200 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-t6q7n" Mar 20 08:42:05 crc kubenswrapper[4903]: I0320 08:42:05.489609 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s7z9w\" (UniqueName: \"kubernetes.io/projected/246db1f4-c0bc-4152-9275-dec8e8ca6233-kube-api-access-s7z9w\") pod \"cinder-operator-controller-manager-8d58dc466-9pxrs\" (UID: \"246db1f4-c0bc-4152-9275-dec8e8ca6233\") " pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-9pxrs" Mar 20 08:42:05 crc kubenswrapper[4903]: I0320 08:42:05.499136 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a45c6043-2cab-4100-9ba8-1942c427704c" path="/var/lib/kubelet/pods/a45c6043-2cab-4100-9ba8-1942c427704c/volumes" Mar 20 08:42:05 crc kubenswrapper[4903]: I0320 08:42:05.499821 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-59bc569d95-kzflj"] Mar 20 08:42:05 crc kubenswrapper[4903]: I0320 08:42:05.516459 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-588d4d986b-rwzl9"] Mar 20 08:42:05 crc kubenswrapper[4903]: I0320 08:42:05.517495 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-rwzl9" Mar 20 08:42:05 crc kubenswrapper[4903]: I0320 08:42:05.520175 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-gq2h6" Mar 20 08:42:05 crc kubenswrapper[4903]: I0320 08:42:05.558084 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-588d4d986b-rwzl9"] Mar 20 08:42:05 crc kubenswrapper[4903]: I0320 08:42:05.592188 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mh67l\" (UniqueName: \"kubernetes.io/projected/ce3b9536-0686-4544-9ddb-c8e197b5d24a-kube-api-access-mh67l\") pod \"barbican-operator-controller-manager-59bc569d95-kzflj\" (UID: \"ce3b9536-0686-4544-9ddb-c8e197b5d24a\") " pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-kzflj" Mar 20 08:42:05 crc kubenswrapper[4903]: I0320 08:42:05.592345 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s7z9w\" (UniqueName: \"kubernetes.io/projected/246db1f4-c0bc-4152-9275-dec8e8ca6233-kube-api-access-s7z9w\") pod \"cinder-operator-controller-manager-8d58dc466-9pxrs\" (UID: \"246db1f4-c0bc-4152-9275-dec8e8ca6233\") " pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-9pxrs" Mar 20 08:42:05 crc kubenswrapper[4903]: I0320 08:42:05.609927 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-79df6bcc97-f4b4f"] Mar 20 08:42:05 crc kubenswrapper[4903]: I0320 08:42:05.610931 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-f4b4f" Mar 20 08:42:05 crc kubenswrapper[4903]: I0320 08:42:05.615229 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-vhj9m" Mar 20 08:42:05 crc kubenswrapper[4903]: I0320 08:42:05.628925 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s7z9w\" (UniqueName: \"kubernetes.io/projected/246db1f4-c0bc-4152-9275-dec8e8ca6233-kube-api-access-s7z9w\") pod \"cinder-operator-controller-manager-8d58dc466-9pxrs\" (UID: \"246db1f4-c0bc-4152-9275-dec8e8ca6233\") " pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-9pxrs" Mar 20 08:42:05 crc kubenswrapper[4903]: I0320 08:42:05.633434 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-67dd5f86f5-cnjpn"] Mar 20 08:42:05 crc kubenswrapper[4903]: I0320 08:42:05.634488 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-cnjpn" Mar 20 08:42:05 crc kubenswrapper[4903]: I0320 08:42:05.637135 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-659rc" Mar 20 08:42:05 crc kubenswrapper[4903]: I0320 08:42:05.647556 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8d58dc466-9pxrs"] Mar 20 08:42:05 crc kubenswrapper[4903]: I0320 08:42:05.672069 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-79df6bcc97-f4b4f"] Mar 20 08:42:05 crc kubenswrapper[4903]: I0320 08:42:05.678728 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-8464cc45fb-z9fqs"] Mar 20 08:42:05 crc kubenswrapper[4903]: I0320 08:42:05.679852 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-z9fqs" Mar 20 08:42:05 crc kubenswrapper[4903]: I0320 08:42:05.683862 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-44sk8" Mar 20 08:42:05 crc kubenswrapper[4903]: I0320 08:42:05.684077 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-67dd5f86f5-cnjpn"] Mar 20 08:42:05 crc kubenswrapper[4903]: I0320 08:42:05.694358 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mh67l\" (UniqueName: \"kubernetes.io/projected/ce3b9536-0686-4544-9ddb-c8e197b5d24a-kube-api-access-mh67l\") pod \"barbican-operator-controller-manager-59bc569d95-kzflj\" (UID: \"ce3b9536-0686-4544-9ddb-c8e197b5d24a\") " pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-kzflj" Mar 20 08:42:05 crc kubenswrapper[4903]: I0320 08:42:05.694405 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5l2sn\" (UniqueName: \"kubernetes.io/projected/e45265ca-9523-407d-b93a-16fc26817060-kube-api-access-5l2sn\") pod \"designate-operator-controller-manager-588d4d986b-rwzl9\" (UID: \"e45265ca-9523-407d-b93a-16fc26817060\") " pod="openstack-operators/designate-operator-controller-manager-588d4d986b-rwzl9" Mar 20 08:42:05 crc kubenswrapper[4903]: I0320 08:42:05.704836 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-8464cc45fb-z9fqs"] Mar 20 08:42:05 crc kubenswrapper[4903]: I0320 08:42:05.718174 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mh67l\" (UniqueName: \"kubernetes.io/projected/ce3b9536-0686-4544-9ddb-c8e197b5d24a-kube-api-access-mh67l\") pod \"barbican-operator-controller-manager-59bc569d95-kzflj\" (UID: \"ce3b9536-0686-4544-9ddb-c8e197b5d24a\") " pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-kzflj" Mar 20 08:42:05 crc kubenswrapper[4903]: I0320 08:42:05.720784 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-7b9c774f96-65sn9"] Mar 20 08:42:05 crc kubenswrapper[4903]: I0320 08:42:05.723020 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-65sn9" Mar 20 08:42:05 crc kubenswrapper[4903]: I0320 08:42:05.728876 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-t7m6k" Mar 20 08:42:05 crc kubenswrapper[4903]: I0320 08:42:05.729062 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Mar 20 08:42:05 crc kubenswrapper[4903]: I0320 08:42:05.729887 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-7b9c774f96-65sn9"] Mar 20 08:42:05 crc kubenswrapper[4903]: I0320 08:42:05.740858 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6f787dddc9-bbz45"] Mar 20 08:42:05 crc kubenswrapper[4903]: I0320 08:42:05.741910 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-bbz45" Mar 20 08:42:05 crc kubenswrapper[4903]: I0320 08:42:05.744878 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-dpxvd" Mar 20 08:42:05 crc kubenswrapper[4903]: I0320 08:42:05.758816 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6f787dddc9-bbz45"] Mar 20 08:42:05 crc kubenswrapper[4903]: I0320 08:42:05.779537 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-768b96df4c-pgssm"] Mar 20 08:42:05 crc kubenswrapper[4903]: I0320 08:42:05.780773 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-pgssm" Mar 20 08:42:05 crc kubenswrapper[4903]: I0320 08:42:05.783077 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-kzflj" Mar 20 08:42:05 crc kubenswrapper[4903]: I0320 08:42:05.789908 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-xpckc" Mar 20 08:42:05 crc kubenswrapper[4903]: I0320 08:42:05.790805 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-768b96df4c-pgssm"] Mar 20 08:42:05 crc kubenswrapper[4903]: I0320 08:42:05.796012 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-9pxrs" Mar 20 08:42:05 crc kubenswrapper[4903]: I0320 08:42:05.796499 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w8bw4\" (UniqueName: \"kubernetes.io/projected/41774efc-3c12-43fb-b4a3-023e5e4811f5-kube-api-access-w8bw4\") pod \"glance-operator-controller-manager-79df6bcc97-f4b4f\" (UID: \"41774efc-3c12-43fb-b4a3-023e5e4811f5\") " pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-f4b4f" Mar 20 08:42:05 crc kubenswrapper[4903]: I0320 08:42:05.796589 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s6dvb\" (UniqueName: \"kubernetes.io/projected/dbc3863b-0f31-47c1-af79-58e6387d5a18-kube-api-access-s6dvb\") pod \"heat-operator-controller-manager-67dd5f86f5-cnjpn\" (UID: \"dbc3863b-0f31-47c1-af79-58e6387d5a18\") " pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-cnjpn" Mar 20 08:42:05 crc kubenswrapper[4903]: I0320 08:42:05.800558 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-55f864c847-7hg7s"] Mar 20 08:42:05 crc kubenswrapper[4903]: I0320 08:42:05.801696 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-55f864c847-7hg7s" Mar 20 08:42:05 crc kubenswrapper[4903]: I0320 08:42:05.808632 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wsmc2\" (UniqueName: \"kubernetes.io/projected/c9fa68a8-9b69-40dc-a614-a7d85a9473f8-kube-api-access-wsmc2\") pod \"horizon-operator-controller-manager-8464cc45fb-z9fqs\" (UID: \"c9fa68a8-9b69-40dc-a614-a7d85a9473f8\") " pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-z9fqs" Mar 20 08:42:05 crc kubenswrapper[4903]: I0320 08:42:05.809178 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-tqz8z" Mar 20 08:42:05 crc kubenswrapper[4903]: I0320 08:42:05.809258 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5l2sn\" (UniqueName: \"kubernetes.io/projected/e45265ca-9523-407d-b93a-16fc26817060-kube-api-access-5l2sn\") pod \"designate-operator-controller-manager-588d4d986b-rwzl9\" (UID: \"e45265ca-9523-407d-b93a-16fc26817060\") " pod="openstack-operators/designate-operator-controller-manager-588d4d986b-rwzl9" Mar 20 08:42:05 crc kubenswrapper[4903]: I0320 08:42:05.827138 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67ccfc9778-lhp57"] Mar 20 08:42:05 crc kubenswrapper[4903]: I0320 08:42:05.828384 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-lhp57" Mar 20 08:42:05 crc kubenswrapper[4903]: I0320 08:42:05.832878 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-6s7bh" Mar 20 08:42:05 crc kubenswrapper[4903]: I0320 08:42:05.838744 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-55f864c847-7hg7s"] Mar 20 08:42:05 crc kubenswrapper[4903]: I0320 08:42:05.849419 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5l2sn\" (UniqueName: \"kubernetes.io/projected/e45265ca-9523-407d-b93a-16fc26817060-kube-api-access-5l2sn\") pod \"designate-operator-controller-manager-588d4d986b-rwzl9\" (UID: \"e45265ca-9523-407d-b93a-16fc26817060\") " pod="openstack-operators/designate-operator-controller-manager-588d4d986b-rwzl9" Mar 20 08:42:05 crc kubenswrapper[4903]: I0320 08:42:05.881159 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-767865f676-gccx7"] Mar 20 08:42:05 crc kubenswrapper[4903]: I0320 08:42:05.882122 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-767865f676-gccx7" Mar 20 08:42:05 crc kubenswrapper[4903]: I0320 08:42:05.900221 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-5d488d59fb-rgwvj"] Mar 20 08:42:05 crc kubenswrapper[4903]: I0320 08:42:05.901254 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-tz62l" Mar 20 08:42:05 crc kubenswrapper[4903]: I0320 08:42:05.903682 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-rgwvj" Mar 20 08:42:05 crc kubenswrapper[4903]: I0320 08:42:05.905373 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-6n6jl" Mar 20 08:42:05 crc kubenswrapper[4903]: I0320 08:42:05.915597 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-44xwc\" (UniqueName: \"kubernetes.io/projected/52c17e50-83c9-46ae-8804-aba50e3ff916-kube-api-access-44xwc\") pod \"manila-operator-controller-manager-55f864c847-7hg7s\" (UID: \"52c17e50-83c9-46ae-8804-aba50e3ff916\") " pod="openstack-operators/manila-operator-controller-manager-55f864c847-7hg7s" Mar 20 08:42:05 crc kubenswrapper[4903]: I0320 08:42:05.915673 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w8bw4\" (UniqueName: \"kubernetes.io/projected/41774efc-3c12-43fb-b4a3-023e5e4811f5-kube-api-access-w8bw4\") pod \"glance-operator-controller-manager-79df6bcc97-f4b4f\" (UID: \"41774efc-3c12-43fb-b4a3-023e5e4811f5\") " pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-f4b4f" Mar 20 08:42:05 crc kubenswrapper[4903]: I0320 08:42:05.915698 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v7n6d\" (UniqueName: \"kubernetes.io/projected/fbb60c8b-9933-40bd-9a01-3463fa38fd41-kube-api-access-v7n6d\") pod \"keystone-operator-controller-manager-768b96df4c-pgssm\" (UID: \"fbb60c8b-9933-40bd-9a01-3463fa38fd41\") " pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-pgssm" Mar 20 08:42:05 crc kubenswrapper[4903]: I0320 08:42:05.915720 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c938f9a1-4273-4d7a-91f1-e430e43ef704-cert\") pod \"infra-operator-controller-manager-7b9c774f96-65sn9\" (UID: \"c938f9a1-4273-4d7a-91f1-e430e43ef704\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-65sn9" Mar 20 08:42:05 crc kubenswrapper[4903]: I0320 08:42:05.915740 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-478qj\" (UniqueName: \"kubernetes.io/projected/d103b53c-f076-4441-8ff4-c6a3be6ac200-kube-api-access-478qj\") pod \"ironic-operator-controller-manager-6f787dddc9-bbz45\" (UID: \"d103b53c-f076-4441-8ff4-c6a3be6ac200\") " pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-bbz45" Mar 20 08:42:05 crc kubenswrapper[4903]: I0320 08:42:05.915760 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v8jr8\" (UniqueName: \"kubernetes.io/projected/c938f9a1-4273-4d7a-91f1-e430e43ef704-kube-api-access-v8jr8\") pod \"infra-operator-controller-manager-7b9c774f96-65sn9\" (UID: \"c938f9a1-4273-4d7a-91f1-e430e43ef704\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-65sn9" Mar 20 08:42:05 crc kubenswrapper[4903]: I0320 08:42:05.915800 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s6dvb\" (UniqueName: \"kubernetes.io/projected/dbc3863b-0f31-47c1-af79-58e6387d5a18-kube-api-access-s6dvb\") pod \"heat-operator-controller-manager-67dd5f86f5-cnjpn\" (UID: \"dbc3863b-0f31-47c1-af79-58e6387d5a18\") " 
pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-cnjpn" Mar 20 08:42:05 crc kubenswrapper[4903]: I0320 08:42:05.915822 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wsmc2\" (UniqueName: \"kubernetes.io/projected/c9fa68a8-9b69-40dc-a614-a7d85a9473f8-kube-api-access-wsmc2\") pod \"horizon-operator-controller-manager-8464cc45fb-z9fqs\" (UID: \"c9fa68a8-9b69-40dc-a614-a7d85a9473f8\") " pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-z9fqs" Mar 20 08:42:05 crc kubenswrapper[4903]: I0320 08:42:05.947193 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wsmc2\" (UniqueName: \"kubernetes.io/projected/c9fa68a8-9b69-40dc-a614-a7d85a9473f8-kube-api-access-wsmc2\") pod \"horizon-operator-controller-manager-8464cc45fb-z9fqs\" (UID: \"c9fa68a8-9b69-40dc-a614-a7d85a9473f8\") " pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-z9fqs" Mar 20 08:42:05 crc kubenswrapper[4903]: I0320 08:42:05.949447 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67ccfc9778-lhp57"] Mar 20 08:42:05 crc kubenswrapper[4903]: I0320 08:42:05.953969 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-767865f676-gccx7"] Mar 20 08:42:05 crc kubenswrapper[4903]: I0320 08:42:05.954083 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w8bw4\" (UniqueName: \"kubernetes.io/projected/41774efc-3c12-43fb-b4a3-023e5e4811f5-kube-api-access-w8bw4\") pod \"glance-operator-controller-manager-79df6bcc97-f4b4f\" (UID: \"41774efc-3c12-43fb-b4a3-023e5e4811f5\") " pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-f4b4f" Mar 20 08:42:05 crc kubenswrapper[4903]: I0320 08:42:05.954577 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s6dvb\" (UniqueName: \"kubernetes.io/projected/dbc3863b-0f31-47c1-af79-58e6387d5a18-kube-api-access-s6dvb\") pod \"heat-operator-controller-manager-67dd5f86f5-cnjpn\" (UID: \"dbc3863b-0f31-47c1-af79-58e6387d5a18\") " pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-cnjpn" Mar 20 08:42:05 crc kubenswrapper[4903]: I0320 08:42:05.969922 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-f4b4f" Mar 20 08:42:05 crc kubenswrapper[4903]: I0320 08:42:05.970639 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5b9f45d989-9sv94"] Mar 20 08:42:05 crc kubenswrapper[4903]: I0320 08:42:05.971700 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-9sv94" Mar 20 08:42:05 crc kubenswrapper[4903]: I0320 08:42:05.979265 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-5d488d59fb-rgwvj"] Mar 20 08:42:05 crc kubenswrapper[4903]: I0320 08:42:05.984047 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-cnjpn" Mar 20 08:42:05 crc kubenswrapper[4903]: I0320 08:42:05.992307 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-8t64m" Mar 20 08:42:05 crc kubenswrapper[4903]: I0320 08:42:05.993221 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-884679f54-fp9lr"] Mar 20 08:42:05 crc kubenswrapper[4903]: I0320 08:42:05.994338 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-884679f54-fp9lr" Mar 20 08:42:05 crc kubenswrapper[4903]: I0320 08:42:05.996196 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-z9fqs" Mar 20 08:42:06 crc kubenswrapper[4903]: I0320 08:42:06.000661 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-kdrp7" Mar 20 08:42:06 crc kubenswrapper[4903]: I0320 08:42:06.000823 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-bsq62"] Mar 20 08:42:06 crc kubenswrapper[4903]: I0320 08:42:06.001901 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-bsq62" Mar 20 08:42:06 crc kubenswrapper[4903]: I0320 08:42:06.003884 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Mar 20 08:42:06 crc kubenswrapper[4903]: I0320 08:42:06.004007 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-gnzl9" Mar 20 08:42:06 crc kubenswrapper[4903]: I0320 08:42:06.005505 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5b9f45d989-9sv94"] Mar 20 08:42:06 crc kubenswrapper[4903]: I0320 08:42:06.019633 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-44xwc\" (UniqueName: \"kubernetes.io/projected/52c17e50-83c9-46ae-8804-aba50e3ff916-kube-api-access-44xwc\") pod \"manila-operator-controller-manager-55f864c847-7hg7s\" (UID: \"52c17e50-83c9-46ae-8804-aba50e3ff916\") " pod="openstack-operators/manila-operator-controller-manager-55f864c847-7hg7s" Mar 20 08:42:06 crc kubenswrapper[4903]: I0320 08:42:06.019706 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-99fbh\" (UniqueName: \"kubernetes.io/projected/b5fca5e8-3b5c-49f5-aae9-f13e1fef0111-kube-api-access-99fbh\") pod \"neutron-operator-controller-manager-767865f676-gccx7\" (UID: \"b5fca5e8-3b5c-49f5-aae9-f13e1fef0111\") " pod="openstack-operators/neutron-operator-controller-manager-767865f676-gccx7" Mar 20 08:42:06 crc kubenswrapper[4903]: I0320 08:42:06.019754 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v7n6d\" (UniqueName: \"kubernetes.io/projected/fbb60c8b-9933-40bd-9a01-3463fa38fd41-kube-api-access-v7n6d\") pod \"keystone-operator-controller-manager-768b96df4c-pgssm\" (UID: \"fbb60c8b-9933-40bd-9a01-3463fa38fd41\") " 
pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-pgssm" Mar 20 08:42:06 crc kubenswrapper[4903]: I0320 08:42:06.019776 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c938f9a1-4273-4d7a-91f1-e430e43ef704-cert\") pod \"infra-operator-controller-manager-7b9c774f96-65sn9\" (UID: \"c938f9a1-4273-4d7a-91f1-e430e43ef704\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-65sn9" Mar 20 08:42:06 crc kubenswrapper[4903]: E0320 08:42:06.019879 4903 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 20 08:42:06 crc kubenswrapper[4903]: I0320 08:42:06.019896 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-478qj\" (UniqueName: \"kubernetes.io/projected/d103b53c-f076-4441-8ff4-c6a3be6ac200-kube-api-access-478qj\") pod \"ironic-operator-controller-manager-6f787dddc9-bbz45\" (UID: \"d103b53c-f076-4441-8ff4-c6a3be6ac200\") " pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-bbz45" Mar 20 08:42:06 crc kubenswrapper[4903]: E0320 08:42:06.019966 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c938f9a1-4273-4d7a-91f1-e430e43ef704-cert podName:c938f9a1-4273-4d7a-91f1-e430e43ef704 nodeName:}" failed. No retries permitted until 2026-03-20 08:42:06.519939341 +0000 UTC m=+1151.736839646 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c938f9a1-4273-4d7a-91f1-e430e43ef704-cert") pod "infra-operator-controller-manager-7b9c774f96-65sn9" (UID: "c938f9a1-4273-4d7a-91f1-e430e43ef704") : secret "infra-operator-webhook-server-cert" not found Mar 20 08:42:06 crc kubenswrapper[4903]: I0320 08:42:06.019987 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v8jr8\" (UniqueName: \"kubernetes.io/projected/c938f9a1-4273-4d7a-91f1-e430e43ef704-kube-api-access-v8jr8\") pod \"infra-operator-controller-manager-7b9c774f96-65sn9\" (UID: \"c938f9a1-4273-4d7a-91f1-e430e43ef704\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-65sn9" Mar 20 08:42:06 crc kubenswrapper[4903]: I0320 08:42:06.024554 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-768l5\" (UniqueName: \"kubernetes.io/projected/db9e032d-63e4-44e3-99d6-55c13c900127-kube-api-access-768l5\") pod \"nova-operator-controller-manager-5d488d59fb-rgwvj\" (UID: \"db9e032d-63e4-44e3-99d6-55c13c900127\") " pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-rgwvj" Mar 20 08:42:06 crc kubenswrapper[4903]: I0320 08:42:06.024631 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cj2vx\" (UniqueName: \"kubernetes.io/projected/a560c084-9049-431f-94bc-60bd2639b801-kube-api-access-cj2vx\") pod \"mariadb-operator-controller-manager-67ccfc9778-lhp57\" (UID: \"a560c084-9049-431f-94bc-60bd2639b801\") " pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-lhp57" Mar 20 08:42:06 crc kubenswrapper[4903]: I0320 08:42:06.039278 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-884679f54-fp9lr"] Mar 20 08:42:06 crc kubenswrapper[4903]: I0320 08:42:06.054259 4903 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-v7n6d\" (UniqueName: \"kubernetes.io/projected/fbb60c8b-9933-40bd-9a01-3463fa38fd41-kube-api-access-v7n6d\") pod \"keystone-operator-controller-manager-768b96df4c-pgssm\" (UID: \"fbb60c8b-9933-40bd-9a01-3463fa38fd41\") " pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-pgssm" Mar 20 08:42:06 crc kubenswrapper[4903]: I0320 08:42:06.079337 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-478qj\" (UniqueName: \"kubernetes.io/projected/d103b53c-f076-4441-8ff4-c6a3be6ac200-kube-api-access-478qj\") pod \"ironic-operator-controller-manager-6f787dddc9-bbz45\" (UID: \"d103b53c-f076-4441-8ff4-c6a3be6ac200\") " pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-bbz45" Mar 20 08:42:06 crc kubenswrapper[4903]: I0320 08:42:06.094545 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-44xwc\" (UniqueName: \"kubernetes.io/projected/52c17e50-83c9-46ae-8804-aba50e3ff916-kube-api-access-44xwc\") pod \"manila-operator-controller-manager-55f864c847-7hg7s\" (UID: \"52c17e50-83c9-46ae-8804-aba50e3ff916\") " pod="openstack-operators/manila-operator-controller-manager-55f864c847-7hg7s" Mar 20 08:42:06 crc kubenswrapper[4903]: I0320 08:42:06.097448 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-5784578c99-9n5gb"] Mar 20 08:42:06 crc kubenswrapper[4903]: I0320 08:42:06.104852 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v8jr8\" (UniqueName: \"kubernetes.io/projected/c938f9a1-4273-4d7a-91f1-e430e43ef704-kube-api-access-v8jr8\") pod \"infra-operator-controller-manager-7b9c774f96-65sn9\" (UID: \"c938f9a1-4273-4d7a-91f1-e430e43ef704\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-65sn9" Mar 20 08:42:06 crc kubenswrapper[4903]: I0320 08:42:06.112980 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-5784578c99-9n5gb" Mar 20 08:42:06 crc kubenswrapper[4903]: I0320 08:42:06.118357 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-qjvq2" Mar 20 08:42:06 crc kubenswrapper[4903]: I0320 08:42:06.127439 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tdzmb\" (UniqueName: \"kubernetes.io/projected/ae26b736-2e3e-4a77-83e1-7df0a04cd02b-kube-api-access-tdzmb\") pod \"ovn-operator-controller-manager-884679f54-fp9lr\" (UID: \"ae26b736-2e3e-4a77-83e1-7df0a04cd02b\") " pod="openstack-operators/ovn-operator-controller-manager-884679f54-fp9lr" Mar 20 08:42:06 crc kubenswrapper[4903]: I0320 08:42:06.127503 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-768l5\" (UniqueName: \"kubernetes.io/projected/db9e032d-63e4-44e3-99d6-55c13c900127-kube-api-access-768l5\") pod \"nova-operator-controller-manager-5d488d59fb-rgwvj\" (UID: \"db9e032d-63e4-44e3-99d6-55c13c900127\") " pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-rgwvj" Mar 20 08:42:06 crc kubenswrapper[4903]: I0320 08:42:06.127534 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cj2vx\" (UniqueName: \"kubernetes.io/projected/a560c084-9049-431f-94bc-60bd2639b801-kube-api-access-cj2vx\") pod \"mariadb-operator-controller-manager-67ccfc9778-lhp57\" (UID: \"a560c084-9049-431f-94bc-60bd2639b801\") " pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-lhp57" Mar 20 08:42:06 crc kubenswrapper[4903]: I0320 08:42:06.127606 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dee70365-edd2-44fe-b49e-5b0cd67dd6df-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-bsq62\" (UID: \"dee70365-edd2-44fe-b49e-5b0cd67dd6df\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-bsq62" Mar 20 08:42:06 crc kubenswrapper[4903]: I0320 08:42:06.127657 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-99fbh\" (UniqueName: \"kubernetes.io/projected/b5fca5e8-3b5c-49f5-aae9-f13e1fef0111-kube-api-access-99fbh\") pod \"neutron-operator-controller-manager-767865f676-gccx7\" (UID: \"b5fca5e8-3b5c-49f5-aae9-f13e1fef0111\") " pod="openstack-operators/neutron-operator-controller-manager-767865f676-gccx7" Mar 20 08:42:06 crc kubenswrapper[4903]: I0320 08:42:06.127735 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8hjjj\" (UniqueName: \"kubernetes.io/projected/dee70365-edd2-44fe-b49e-5b0cd67dd6df-kube-api-access-8hjjj\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-bsq62\" (UID: \"dee70365-edd2-44fe-b49e-5b0cd67dd6df\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-bsq62" Mar 20 08:42:06 crc kubenswrapper[4903]: I0320 08:42:06.127771 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fvtpm\" (UniqueName: \"kubernetes.io/projected/73cea0b7-43b8-491a-b6f3-c8ec8563583f-kube-api-access-fvtpm\") pod \"octavia-operator-controller-manager-5b9f45d989-9sv94\" (UID: \"73cea0b7-43b8-491a-b6f3-c8ec8563583f\") " 
pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-9sv94" Mar 20 08:42:06 crc kubenswrapper[4903]: I0320 08:42:06.155232 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-c674c5965-fxsc4"] Mar 20 08:42:06 crc kubenswrapper[4903]: I0320 08:42:06.156606 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-c674c5965-fxsc4" Mar 20 08:42:06 crc kubenswrapper[4903]: I0320 08:42:06.160613 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-rwzl9" Mar 20 08:42:06 crc kubenswrapper[4903]: I0320 08:42:06.169426 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-cmp7w" Mar 20 08:42:06 crc kubenswrapper[4903]: I0320 08:42:06.169568 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-5784578c99-9n5gb"] Mar 20 08:42:06 crc kubenswrapper[4903]: I0320 08:42:06.172067 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-99fbh\" (UniqueName: \"kubernetes.io/projected/b5fca5e8-3b5c-49f5-aae9-f13e1fef0111-kube-api-access-99fbh\") pod \"neutron-operator-controller-manager-767865f676-gccx7\" (UID: \"b5fca5e8-3b5c-49f5-aae9-f13e1fef0111\") " pod="openstack-operators/neutron-operator-controller-manager-767865f676-gccx7" Mar 20 08:42:06 crc kubenswrapper[4903]: I0320 08:42:06.173593 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-768l5\" (UniqueName: \"kubernetes.io/projected/db9e032d-63e4-44e3-99d6-55c13c900127-kube-api-access-768l5\") pod \"nova-operator-controller-manager-5d488d59fb-rgwvj\" (UID: \"db9e032d-63e4-44e3-99d6-55c13c900127\") " pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-rgwvj" Mar 20 08:42:06 crc kubenswrapper[4903]: I0320 08:42:06.186919 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-pgssm" Mar 20 08:42:06 crc kubenswrapper[4903]: I0320 08:42:06.190699 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cj2vx\" (UniqueName: \"kubernetes.io/projected/a560c084-9049-431f-94bc-60bd2639b801-kube-api-access-cj2vx\") pod \"mariadb-operator-controller-manager-67ccfc9778-lhp57\" (UID: \"a560c084-9049-431f-94bc-60bd2639b801\") " pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-lhp57" Mar 20 08:42:06 crc kubenswrapper[4903]: I0320 08:42:06.215764 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-c674c5965-fxsc4"] Mar 20 08:42:06 crc kubenswrapper[4903]: I0320 08:42:06.219603 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-55f864c847-7hg7s" Mar 20 08:42:06 crc kubenswrapper[4903]: I0320 08:42:06.228932 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dee70365-edd2-44fe-b49e-5b0cd67dd6df-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-bsq62\" (UID: \"dee70365-edd2-44fe-b49e-5b0cd67dd6df\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-bsq62" Mar 20 08:42:06 crc kubenswrapper[4903]: I0320 08:42:06.229058 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8hjjj\" (UniqueName: \"kubernetes.io/projected/dee70365-edd2-44fe-b49e-5b0cd67dd6df-kube-api-access-8hjjj\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-bsq62\" (UID: \"dee70365-edd2-44fe-b49e-5b0cd67dd6df\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-bsq62" Mar 20 08:42:06 crc kubenswrapper[4903]: I0320 08:42:06.229091 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fvtpm\" (UniqueName: \"kubernetes.io/projected/73cea0b7-43b8-491a-b6f3-c8ec8563583f-kube-api-access-fvtpm\") pod \"octavia-operator-controller-manager-5b9f45d989-9sv94\" (UID: \"73cea0b7-43b8-491a-b6f3-c8ec8563583f\") " pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-9sv94" Mar 20 08:42:06 crc kubenswrapper[4903]: I0320 08:42:06.229123 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vj4j2\" (UniqueName: \"kubernetes.io/projected/c60316c6-cc32-497c-9955-d38de3103fdc-kube-api-access-vj4j2\") pod \"swift-operator-controller-manager-c674c5965-fxsc4\" (UID: \"c60316c6-cc32-497c-9955-d38de3103fdc\") " pod="openstack-operators/swift-operator-controller-manager-c674c5965-fxsc4" Mar 20 08:42:06 crc kubenswrapper[4903]: I0320 08:42:06.229146 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ztg52\" (UniqueName: \"kubernetes.io/projected/84b169f4-3d59-46de-b955-c8b2de1045f4-kube-api-access-ztg52\") pod \"placement-operator-controller-manager-5784578c99-9n5gb\" (UID: \"84b169f4-3d59-46de-b955-c8b2de1045f4\") " pod="openstack-operators/placement-operator-controller-manager-5784578c99-9n5gb" Mar 20 08:42:06 crc kubenswrapper[4903]: I0320 08:42:06.229176 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tdzmb\" (UniqueName: \"kubernetes.io/projected/ae26b736-2e3e-4a77-83e1-7df0a04cd02b-kube-api-access-tdzmb\") pod \"ovn-operator-controller-manager-884679f54-fp9lr\" (UID: \"ae26b736-2e3e-4a77-83e1-7df0a04cd02b\") " pod="openstack-operators/ovn-operator-controller-manager-884679f54-fp9lr" Mar 20 08:42:06 crc kubenswrapper[4903]: E0320 08:42:06.229662 4903 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 08:42:06 crc kubenswrapper[4903]: E0320 08:42:06.229721 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dee70365-edd2-44fe-b49e-5b0cd67dd6df-cert podName:dee70365-edd2-44fe-b49e-5b0cd67dd6df nodeName:}" failed. No retries permitted until 2026-03-20 08:42:06.729703305 +0000 UTC m=+1151.946603620 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/dee70365-edd2-44fe-b49e-5b0cd67dd6df-cert") pod "openstack-baremetal-operator-controller-manager-89d64c458-bsq62" (UID: "dee70365-edd2-44fe-b49e-5b0cd67dd6df") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 08:42:06 crc kubenswrapper[4903]: I0320 08:42:06.241977 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-bsq62"] Mar 20 08:42:06 crc kubenswrapper[4903]: I0320 08:42:06.244922 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-lhp57" Mar 20 08:42:06 crc kubenswrapper[4903]: I0320 08:42:06.259305 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8hjjj\" (UniqueName: \"kubernetes.io/projected/dee70365-edd2-44fe-b49e-5b0cd67dd6df-kube-api-access-8hjjj\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-bsq62\" (UID: \"dee70365-edd2-44fe-b49e-5b0cd67dd6df\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-bsq62" Mar 20 08:42:06 crc kubenswrapper[4903]: I0320 08:42:06.261671 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tdzmb\" (UniqueName: \"kubernetes.io/projected/ae26b736-2e3e-4a77-83e1-7df0a04cd02b-kube-api-access-tdzmb\") pod \"ovn-operator-controller-manager-884679f54-fp9lr\" (UID: \"ae26b736-2e3e-4a77-83e1-7df0a04cd02b\") " pod="openstack-operators/ovn-operator-controller-manager-884679f54-fp9lr" Mar 20 08:42:06 crc kubenswrapper[4903]: I0320 08:42:06.261745 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-d6b694c5-77bp7"] Mar 20 08:42:06 crc kubenswrapper[4903]: I0320 08:42:06.262852 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-77bp7" Mar 20 08:42:06 crc kubenswrapper[4903]: I0320 08:42:06.269177 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-767865f676-gccx7" Mar 20 08:42:06 crc kubenswrapper[4903]: I0320 08:42:06.281171 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-l2ns4" Mar 20 08:42:06 crc kubenswrapper[4903]: I0320 08:42:06.287300 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fvtpm\" (UniqueName: \"kubernetes.io/projected/73cea0b7-43b8-491a-b6f3-c8ec8563583f-kube-api-access-fvtpm\") pod \"octavia-operator-controller-manager-5b9f45d989-9sv94\" (UID: \"73cea0b7-43b8-491a-b6f3-c8ec8563583f\") " pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-9sv94" Mar 20 08:42:06 crc kubenswrapper[4903]: I0320 08:42:06.293685 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-rgwvj" Mar 20 08:42:06 crc kubenswrapper[4903]: I0320 08:42:06.306652 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-d6b694c5-77bp7"] Mar 20 08:42:06 crc kubenswrapper[4903]: I0320 08:42:06.324227 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-lntlm"] Mar 20 08:42:06 crc kubenswrapper[4903]: I0320 08:42:06.325492 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-lntlm" Mar 20 08:42:06 crc kubenswrapper[4903]: I0320 08:42:06.334047 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-phz9b" Mar 20 08:42:06 crc kubenswrapper[4903]: I0320 08:42:06.334461 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vj4j2\" (UniqueName: \"kubernetes.io/projected/c60316c6-cc32-497c-9955-d38de3103fdc-kube-api-access-vj4j2\") pod \"swift-operator-controller-manager-c674c5965-fxsc4\" (UID: \"c60316c6-cc32-497c-9955-d38de3103fdc\") " pod="openstack-operators/swift-operator-controller-manager-c674c5965-fxsc4" Mar 20 08:42:06 crc kubenswrapper[4903]: I0320 08:42:06.334509 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ztg52\" (UniqueName: \"kubernetes.io/projected/84b169f4-3d59-46de-b955-c8b2de1045f4-kube-api-access-ztg52\") pod \"placement-operator-controller-manager-5784578c99-9n5gb\" (UID: \"84b169f4-3d59-46de-b955-c8b2de1045f4\") " pod="openstack-operators/placement-operator-controller-manager-5784578c99-9n5gb" Mar 20 08:42:06 crc kubenswrapper[4903]: I0320 08:42:06.334553 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9sljq\" (UniqueName: \"kubernetes.io/projected/8ed79f48-8d64-4133-9a1f-1aad870f1767-kube-api-access-9sljq\") pod \"telemetry-operator-controller-manager-d6b694c5-77bp7\" (UID: \"8ed79f48-8d64-4133-9a1f-1aad870f1767\") " pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-77bp7" Mar 20 08:42:06 crc kubenswrapper[4903]: I0320 08:42:06.334756 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-9sv94" Mar 20 08:42:06 crc kubenswrapper[4903]: I0320 08:42:06.338748 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-lntlm"] Mar 20 08:42:06 crc kubenswrapper[4903]: I0320 08:42:06.372227 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ztg52\" (UniqueName: \"kubernetes.io/projected/84b169f4-3d59-46de-b955-c8b2de1045f4-kube-api-access-ztg52\") pod \"placement-operator-controller-manager-5784578c99-9n5gb\" (UID: \"84b169f4-3d59-46de-b955-c8b2de1045f4\") " pod="openstack-operators/placement-operator-controller-manager-5784578c99-9n5gb" Mar 20 08:42:06 crc kubenswrapper[4903]: I0320 08:42:06.374382 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-bbz45" Mar 20 08:42:06 crc kubenswrapper[4903]: I0320 08:42:06.378086 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-56pwc"] Mar 20 08:42:06 crc kubenswrapper[4903]: I0320 08:42:06.383400 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-56pwc" Mar 20 08:42:06 crc kubenswrapper[4903]: I0320 08:42:06.385740 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-g27zj" Mar 20 08:42:06 crc kubenswrapper[4903]: I0320 08:42:06.388187 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-56pwc"] Mar 20 08:42:06 crc kubenswrapper[4903]: I0320 08:42:06.397407 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vj4j2\" (UniqueName: \"kubernetes.io/projected/c60316c6-cc32-497c-9955-d38de3103fdc-kube-api-access-vj4j2\") pod \"swift-operator-controller-manager-c674c5965-fxsc4\" (UID: \"c60316c6-cc32-497c-9955-d38de3103fdc\") " pod="openstack-operators/swift-operator-controller-manager-c674c5965-fxsc4" Mar 20 08:42:06 crc kubenswrapper[4903]: I0320 08:42:06.397484 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-884679f54-fp9lr" Mar 20 08:42:06 crc kubenswrapper[4903]: I0320 08:42:06.419441 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-85788d4595-jxnj2"] Mar 20 08:42:06 crc kubenswrapper[4903]: I0320 08:42:06.420729 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-85788d4595-jxnj2" Mar 20 08:42:06 crc kubenswrapper[4903]: I0320 08:42:06.425375 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Mar 20 08:42:06 crc kubenswrapper[4903]: I0320 08:42:06.425504 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-pzftp" Mar 20 08:42:06 crc kubenswrapper[4903]: I0320 08:42:06.426986 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Mar 20 08:42:06 crc kubenswrapper[4903]: I0320 08:42:06.429578 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-85788d4595-jxnj2"] Mar 20 08:42:06 crc kubenswrapper[4903]: I0320 08:42:06.435461 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-6hjsx"] Mar 20 08:42:06 crc kubenswrapper[4903]: I0320 08:42:06.435689 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9sljq\" (UniqueName: \"kubernetes.io/projected/8ed79f48-8d64-4133-9a1f-1aad870f1767-kube-api-access-9sljq\") pod \"telemetry-operator-controller-manager-d6b694c5-77bp7\" (UID: \"8ed79f48-8d64-4133-9a1f-1aad870f1767\") " pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-77bp7" Mar 20 08:42:06 crc kubenswrapper[4903]: I0320 08:42:06.435809 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mw6wp\" (UniqueName: \"kubernetes.io/projected/79ef28ec-b069-4ee3-947b-92a5605c8d73-kube-api-access-mw6wp\") pod \"test-operator-controller-manager-5c5cb9c4d7-lntlm\" (UID: \"79ef28ec-b069-4ee3-947b-92a5605c8d73\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-lntlm" Mar 20 08:42:06 crc kubenswrapper[4903]: I0320 08:42:06.435855 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-txvs2\" (UniqueName: \"kubernetes.io/projected/a213486d-f613-4f65-a866-9e6bc349a1a9-kube-api-access-txvs2\") pod \"watcher-operator-controller-manager-6c4d75f7f9-56pwc\" (UID: \"a213486d-f613-4f65-a866-9e6bc349a1a9\") " pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-56pwc" Mar 20 08:42:06 crc kubenswrapper[4903]: I0320 08:42:06.436904 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-6hjsx" Mar 20 08:42:06 crc kubenswrapper[4903]: I0320 08:42:06.442209 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-8hnxg" Mar 20 08:42:06 crc kubenswrapper[4903]: I0320 08:42:06.455196 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-6hjsx"] Mar 20 08:42:06 crc kubenswrapper[4903]: I0320 08:42:06.458505 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9sljq\" (UniqueName: \"kubernetes.io/projected/8ed79f48-8d64-4133-9a1f-1aad870f1767-kube-api-access-9sljq\") pod \"telemetry-operator-controller-manager-d6b694c5-77bp7\" (UID: \"8ed79f48-8d64-4133-9a1f-1aad870f1767\") " pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-77bp7" Mar 20 08:42:06 crc kubenswrapper[4903]: I0320 08:42:06.462377 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-5784578c99-9n5gb" Mar 20 08:42:06 crc kubenswrapper[4903]: I0320 08:42:06.496432 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-c674c5965-fxsc4" Mar 20 08:42:06 crc kubenswrapper[4903]: I0320 08:42:06.539155 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c938f9a1-4273-4d7a-91f1-e430e43ef704-cert\") pod \"infra-operator-controller-manager-7b9c774f96-65sn9\" (UID: \"c938f9a1-4273-4d7a-91f1-e430e43ef704\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-65sn9" Mar 20 08:42:06 crc kubenswrapper[4903]: I0320 08:42:06.539233 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cg7mg\" (UniqueName: \"kubernetes.io/projected/34a31f63-98e3-445f-a11c-92a0fb057a4b-kube-api-access-cg7mg\") pod \"openstack-operator-controller-manager-85788d4595-jxnj2\" (UID: \"34a31f63-98e3-445f-a11c-92a0fb057a4b\") " pod="openstack-operators/openstack-operator-controller-manager-85788d4595-jxnj2" Mar 20 08:42:06 crc kubenswrapper[4903]: I0320 08:42:06.539255 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/34a31f63-98e3-445f-a11c-92a0fb057a4b-webhook-certs\") pod \"openstack-operator-controller-manager-85788d4595-jxnj2\" (UID: \"34a31f63-98e3-445f-a11c-92a0fb057a4b\") " pod="openstack-operators/openstack-operator-controller-manager-85788d4595-jxnj2" Mar 20 08:42:06 crc kubenswrapper[4903]: I0320 08:42:06.539298 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/34a31f63-98e3-445f-a11c-92a0fb057a4b-metrics-certs\") pod \"openstack-operator-controller-manager-85788d4595-jxnj2\" (UID: \"34a31f63-98e3-445f-a11c-92a0fb057a4b\") " pod="openstack-operators/openstack-operator-controller-manager-85788d4595-jxnj2" Mar 20 08:42:06 crc kubenswrapper[4903]: I0320 08:42:06.539321 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8kptw\" (UniqueName: \"kubernetes.io/projected/fc915c58-af04-4aac-81d9-43d88136f7df-kube-api-access-8kptw\") pod 
\"rabbitmq-cluster-operator-manager-668c99d594-6hjsx\" (UID: \"fc915c58-af04-4aac-81d9-43d88136f7df\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-6hjsx" Mar 20 08:42:06 crc kubenswrapper[4903]: I0320 08:42:06.539378 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mw6wp\" (UniqueName: \"kubernetes.io/projected/79ef28ec-b069-4ee3-947b-92a5605c8d73-kube-api-access-mw6wp\") pod \"test-operator-controller-manager-5c5cb9c4d7-lntlm\" (UID: \"79ef28ec-b069-4ee3-947b-92a5605c8d73\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-lntlm" Mar 20 08:42:06 crc kubenswrapper[4903]: I0320 08:42:06.539425 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-txvs2\" (UniqueName: \"kubernetes.io/projected/a213486d-f613-4f65-a866-9e6bc349a1a9-kube-api-access-txvs2\") pod \"watcher-operator-controller-manager-6c4d75f7f9-56pwc\" (UID: \"a213486d-f613-4f65-a866-9e6bc349a1a9\") " pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-56pwc" Mar 20 08:42:06 crc kubenswrapper[4903]: E0320 08:42:06.540269 4903 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 20 08:42:06 crc kubenswrapper[4903]: E0320 08:42:06.540363 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c938f9a1-4273-4d7a-91f1-e430e43ef704-cert podName:c938f9a1-4273-4d7a-91f1-e430e43ef704 nodeName:}" failed. No retries permitted until 2026-03-20 08:42:07.540337327 +0000 UTC m=+1152.757237642 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c938f9a1-4273-4d7a-91f1-e430e43ef704-cert") pod "infra-operator-controller-manager-7b9c774f96-65sn9" (UID: "c938f9a1-4273-4d7a-91f1-e430e43ef704") : secret "infra-operator-webhook-server-cert" not found Mar 20 08:42:06 crc kubenswrapper[4903]: I0320 08:42:06.566082 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8d58dc466-9pxrs"] Mar 20 08:42:06 crc kubenswrapper[4903]: I0320 08:42:06.566189 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-txvs2\" (UniqueName: \"kubernetes.io/projected/a213486d-f613-4f65-a866-9e6bc349a1a9-kube-api-access-txvs2\") pod \"watcher-operator-controller-manager-6c4d75f7f9-56pwc\" (UID: \"a213486d-f613-4f65-a866-9e6bc349a1a9\") " pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-56pwc" Mar 20 08:42:06 crc kubenswrapper[4903]: I0320 08:42:06.569017 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mw6wp\" (UniqueName: \"kubernetes.io/projected/79ef28ec-b069-4ee3-947b-92a5605c8d73-kube-api-access-mw6wp\") pod \"test-operator-controller-manager-5c5cb9c4d7-lntlm\" (UID: \"79ef28ec-b069-4ee3-947b-92a5605c8d73\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-lntlm" Mar 20 08:42:06 crc kubenswrapper[4903]: I0320 08:42:06.616457 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-77bp7" Mar 20 08:42:06 crc kubenswrapper[4903]: I0320 08:42:06.641925 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cg7mg\" (UniqueName: \"kubernetes.io/projected/34a31f63-98e3-445f-a11c-92a0fb057a4b-kube-api-access-cg7mg\") pod \"openstack-operator-controller-manager-85788d4595-jxnj2\" (UID: \"34a31f63-98e3-445f-a11c-92a0fb057a4b\") " pod="openstack-operators/openstack-operator-controller-manager-85788d4595-jxnj2" Mar 20 08:42:06 crc kubenswrapper[4903]: I0320 08:42:06.641977 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/34a31f63-98e3-445f-a11c-92a0fb057a4b-webhook-certs\") pod \"openstack-operator-controller-manager-85788d4595-jxnj2\" (UID: \"34a31f63-98e3-445f-a11c-92a0fb057a4b\") " pod="openstack-operators/openstack-operator-controller-manager-85788d4595-jxnj2" Mar 20 08:42:06 crc kubenswrapper[4903]: I0320 08:42:06.642022 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/34a31f63-98e3-445f-a11c-92a0fb057a4b-metrics-certs\") pod \"openstack-operator-controller-manager-85788d4595-jxnj2\" (UID: \"34a31f63-98e3-445f-a11c-92a0fb057a4b\") " pod="openstack-operators/openstack-operator-controller-manager-85788d4595-jxnj2" Mar 20 08:42:06 crc kubenswrapper[4903]: I0320 08:42:06.642063 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8kptw\" (UniqueName: \"kubernetes.io/projected/fc915c58-af04-4aac-81d9-43d88136f7df-kube-api-access-8kptw\") pod \"rabbitmq-cluster-operator-manager-668c99d594-6hjsx\" (UID: \"fc915c58-af04-4aac-81d9-43d88136f7df\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-6hjsx" Mar 20 08:42:06 crc kubenswrapper[4903]: E0320 08:42:06.642658 4903 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 20 08:42:06 crc kubenswrapper[4903]: E0320 08:42:06.642753 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/34a31f63-98e3-445f-a11c-92a0fb057a4b-webhook-certs podName:34a31f63-98e3-445f-a11c-92a0fb057a4b nodeName:}" failed. No retries permitted until 2026-03-20 08:42:07.142720722 +0000 UTC m=+1152.359621147 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/34a31f63-98e3-445f-a11c-92a0fb057a4b-webhook-certs") pod "openstack-operator-controller-manager-85788d4595-jxnj2" (UID: "34a31f63-98e3-445f-a11c-92a0fb057a4b") : secret "webhook-server-cert" not found Mar 20 08:42:06 crc kubenswrapper[4903]: E0320 08:42:06.643089 4903 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 20 08:42:06 crc kubenswrapper[4903]: E0320 08:42:06.643149 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/34a31f63-98e3-445f-a11c-92a0fb057a4b-metrics-certs podName:34a31f63-98e3-445f-a11c-92a0fb057a4b nodeName:}" failed. No retries permitted until 2026-03-20 08:42:07.143126182 +0000 UTC m=+1152.360026497 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/34a31f63-98e3-445f-a11c-92a0fb057a4b-metrics-certs") pod "openstack-operator-controller-manager-85788d4595-jxnj2" (UID: "34a31f63-98e3-445f-a11c-92a0fb057a4b") : secret "metrics-server-cert" not found Mar 20 08:42:06 crc kubenswrapper[4903]: I0320 08:42:06.702899 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8kptw\" (UniqueName: \"kubernetes.io/projected/fc915c58-af04-4aac-81d9-43d88136f7df-kube-api-access-8kptw\") pod \"rabbitmq-cluster-operator-manager-668c99d594-6hjsx\" (UID: \"fc915c58-af04-4aac-81d9-43d88136f7df\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-6hjsx" Mar 20 08:42:06 crc kubenswrapper[4903]: I0320 08:42:06.712187 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-lntlm" Mar 20 08:42:06 crc kubenswrapper[4903]: I0320 08:42:06.730968 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-9pxrs" event={"ID":"246db1f4-c0bc-4152-9275-dec8e8ca6233","Type":"ContainerStarted","Data":"bd3dc7e7dba1e924d75ffeccf9e0f887fa7a59b83e67e67ca570b385c5a14c57"} Mar 20 08:42:06 crc kubenswrapper[4903]: I0320 08:42:06.746769 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dee70365-edd2-44fe-b49e-5b0cd67dd6df-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-bsq62\" (UID: \"dee70365-edd2-44fe-b49e-5b0cd67dd6df\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-bsq62" Mar 20 08:42:06 crc kubenswrapper[4903]: E0320 08:42:06.748624 4903 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 08:42:06 crc kubenswrapper[4903]: E0320 08:42:06.764730 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dee70365-edd2-44fe-b49e-5b0cd67dd6df-cert podName:dee70365-edd2-44fe-b49e-5b0cd67dd6df nodeName:}" failed. No retries permitted until 2026-03-20 08:42:07.764680086 +0000 UTC m=+1152.981580401 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/dee70365-edd2-44fe-b49e-5b0cd67dd6df-cert") pod "openstack-baremetal-operator-controller-manager-89d64c458-bsq62" (UID: "dee70365-edd2-44fe-b49e-5b0cd67dd6df") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 08:42:06 crc kubenswrapper[4903]: I0320 08:42:06.779235 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cg7mg\" (UniqueName: \"kubernetes.io/projected/34a31f63-98e3-445f-a11c-92a0fb057a4b-kube-api-access-cg7mg\") pod \"openstack-operator-controller-manager-85788d4595-jxnj2\" (UID: \"34a31f63-98e3-445f-a11c-92a0fb057a4b\") " pod="openstack-operators/openstack-operator-controller-manager-85788d4595-jxnj2" Mar 20 08:42:06 crc kubenswrapper[4903]: I0320 08:42:06.796090 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-56pwc" Mar 20 08:42:06 crc kubenswrapper[4903]: I0320 08:42:06.840740 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-6hjsx" Mar 20 08:42:06 crc kubenswrapper[4903]: I0320 08:42:06.841257 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-59bc569d95-kzflj"] Mar 20 08:42:06 crc kubenswrapper[4903]: I0320 08:42:06.960978 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-67dd5f86f5-cnjpn"] Mar 20 08:42:06 crc kubenswrapper[4903]: I0320 08:42:06.969865 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-8464cc45fb-z9fqs"] Mar 20 08:42:06 crc kubenswrapper[4903]: I0320 08:42:06.998823 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-79df6bcc97-f4b4f"] Mar 20 08:42:07 crc kubenswrapper[4903]: W0320 08:42:07.015875 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod41774efc_3c12_43fb_b4a3_023e5e4811f5.slice/crio-c871c7b5aa3cb8725a394647ea16e866856bf812b804798c6afd8223cbc006f6 WatchSource:0}: Error finding container c871c7b5aa3cb8725a394647ea16e866856bf812b804798c6afd8223cbc006f6: Status 404 returned error can't find the container with id c871c7b5aa3cb8725a394647ea16e866856bf812b804798c6afd8223cbc006f6 Mar 20 08:42:07 crc kubenswrapper[4903]: I0320 08:42:07.175114 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/34a31f63-98e3-445f-a11c-92a0fb057a4b-webhook-certs\") pod \"openstack-operator-controller-manager-85788d4595-jxnj2\" (UID: \"34a31f63-98e3-445f-a11c-92a0fb057a4b\") " pod="openstack-operators/openstack-operator-controller-manager-85788d4595-jxnj2" Mar 20 08:42:07 crc kubenswrapper[4903]: I0320 08:42:07.175187 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/34a31f63-98e3-445f-a11c-92a0fb057a4b-metrics-certs\") pod \"openstack-operator-controller-manager-85788d4595-jxnj2\" (UID: \"34a31f63-98e3-445f-a11c-92a0fb057a4b\") " pod="openstack-operators/openstack-operator-controller-manager-85788d4595-jxnj2" Mar 20 08:42:07 crc kubenswrapper[4903]: E0320 08:42:07.175348 4903 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 20 08:42:07 crc kubenswrapper[4903]: E0320 08:42:07.175413 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/34a31f63-98e3-445f-a11c-92a0fb057a4b-metrics-certs podName:34a31f63-98e3-445f-a11c-92a0fb057a4b nodeName:}" failed. No retries permitted until 2026-03-20 08:42:08.175389025 +0000 UTC m=+1153.392289340 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/34a31f63-98e3-445f-a11c-92a0fb057a4b-metrics-certs") pod "openstack-operator-controller-manager-85788d4595-jxnj2" (UID: "34a31f63-98e3-445f-a11c-92a0fb057a4b") : secret "metrics-server-cert" not found Mar 20 08:42:07 crc kubenswrapper[4903]: E0320 08:42:07.175863 4903 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 20 08:42:07 crc kubenswrapper[4903]: E0320 08:42:07.175977 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/34a31f63-98e3-445f-a11c-92a0fb057a4b-webhook-certs podName:34a31f63-98e3-445f-a11c-92a0fb057a4b nodeName:}" failed. No retries permitted until 2026-03-20 08:42:08.175946308 +0000 UTC m=+1153.392846623 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/34a31f63-98e3-445f-a11c-92a0fb057a4b-webhook-certs") pod "openstack-operator-controller-manager-85788d4595-jxnj2" (UID: "34a31f63-98e3-445f-a11c-92a0fb057a4b") : secret "webhook-server-cert" not found Mar 20 08:42:07 crc kubenswrapper[4903]: I0320 08:42:07.241874 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-588d4d986b-rwzl9"] Mar 20 08:42:07 crc kubenswrapper[4903]: I0320 08:42:07.246976 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67ccfc9778-lhp57"] Mar 20 08:42:07 crc kubenswrapper[4903]: I0320 08:42:07.536150 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-768b96df4c-pgssm"] Mar 20 08:42:07 crc kubenswrapper[4903]: I0320 08:42:07.543007 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6f787dddc9-bbz45"] Mar 20 08:42:07 crc kubenswrapper[4903]: I0320 08:42:07.591363 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c938f9a1-4273-4d7a-91f1-e430e43ef704-cert\") pod \"infra-operator-controller-manager-7b9c774f96-65sn9\" (UID: \"c938f9a1-4273-4d7a-91f1-e430e43ef704\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-65sn9" Mar 20 08:42:07 crc kubenswrapper[4903]: E0320 08:42:07.592744 4903 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 20 08:42:07 crc kubenswrapper[4903]: E0320 08:42:07.592805 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c938f9a1-4273-4d7a-91f1-e430e43ef704-cert podName:c938f9a1-4273-4d7a-91f1-e430e43ef704 nodeName:}" failed. No retries permitted until 2026-03-20 08:42:09.592785851 +0000 UTC m=+1154.809686166 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c938f9a1-4273-4d7a-91f1-e430e43ef704-cert") pod "infra-operator-controller-manager-7b9c774f96-65sn9" (UID: "c938f9a1-4273-4d7a-91f1-e430e43ef704") : secret "infra-operator-webhook-server-cert" not found Mar 20 08:42:07 crc kubenswrapper[4903]: I0320 08:42:07.594242 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-767865f676-gccx7"] Mar 20 08:42:07 crc kubenswrapper[4903]: I0320 08:42:07.681899 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5b9f45d989-9sv94"] Mar 20 08:42:07 crc kubenswrapper[4903]: I0320 08:42:07.715405 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-5d488d59fb-rgwvj"] Mar 20 08:42:07 crc kubenswrapper[4903]: I0320 08:42:07.731732 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-55f864c847-7hg7s"] Mar 20 08:42:07 crc kubenswrapper[4903]: I0320 08:42:07.756139 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-z9fqs" event={"ID":"c9fa68a8-9b69-40dc-a614-a7d85a9473f8","Type":"ContainerStarted","Data":"7cbf827226caf67586cde225f9a3426ae748a7f0e9e494e29e4b36d87bdce570"} Mar 20 08:42:07 crc kubenswrapper[4903]: I0320 08:42:07.764853 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-bbz45" event={"ID":"d103b53c-f076-4441-8ff4-c6a3be6ac200","Type":"ContainerStarted","Data":"36e52f87b37c6a3d071de97461c2410c0cc331702f65246a550276aaa980a7fd"} Mar 20 08:42:07 crc kubenswrapper[4903]: I0320 08:42:07.770248 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-cnjpn" event={"ID":"dbc3863b-0f31-47c1-af79-58e6387d5a18","Type":"ContainerStarted","Data":"c12d95f88b2a634978e57fdb959f25ee40389f25318a2ebf37a895de4477f45a"} Mar 20 08:42:07 crc kubenswrapper[4903]: I0320 08:42:07.775237 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-lhp57" event={"ID":"a560c084-9049-431f-94bc-60bd2639b801","Type":"ContainerStarted","Data":"785dfd02e1750820c5ce342bea85d6879816545311b853290eba00b6f1e62a3f"} Mar 20 08:42:07 crc kubenswrapper[4903]: I0320 08:42:07.777065 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-rgwvj" event={"ID":"db9e032d-63e4-44e3-99d6-55c13c900127","Type":"ContainerStarted","Data":"fe20cd11750c33e32753d610a7aa55b035299f3b4f79a561d5a2acaf234d8d80"} Mar 20 08:42:07 crc kubenswrapper[4903]: I0320 08:42:07.778093 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-rwzl9" event={"ID":"e45265ca-9523-407d-b93a-16fc26817060","Type":"ContainerStarted","Data":"b24a75bd598564c40261fc8c616611a2fd3832081b35b49b34cdfb01a7f7eb65"} Mar 20 08:42:07 crc kubenswrapper[4903]: I0320 08:42:07.780897 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-kzflj" event={"ID":"ce3b9536-0686-4544-9ddb-c8e197b5d24a","Type":"ContainerStarted","Data":"bd889a538abb2f0d0735813c3da14cf0bbdf0da8b8930fc9a52ecdadade2d451"} Mar 20 08:42:07 crc 
kubenswrapper[4903]: I0320 08:42:07.786212 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-9sv94" event={"ID":"73cea0b7-43b8-491a-b6f3-c8ec8563583f","Type":"ContainerStarted","Data":"d40bed57f300ef5f2868b7eabe48419b1b05b3c90c8b9dbe94b94d0b76bbdb6b"} Mar 20 08:42:07 crc kubenswrapper[4903]: I0320 08:42:07.790229 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-f4b4f" event={"ID":"41774efc-3c12-43fb-b4a3-023e5e4811f5","Type":"ContainerStarted","Data":"c871c7b5aa3cb8725a394647ea16e866856bf812b804798c6afd8223cbc006f6"} Mar 20 08:42:07 crc kubenswrapper[4903]: I0320 08:42:07.801433 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dee70365-edd2-44fe-b49e-5b0cd67dd6df-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-bsq62\" (UID: \"dee70365-edd2-44fe-b49e-5b0cd67dd6df\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-bsq62" Mar 20 08:42:07 crc kubenswrapper[4903]: E0320 08:42:07.801666 4903 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 08:42:07 crc kubenswrapper[4903]: E0320 08:42:07.801728 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dee70365-edd2-44fe-b49e-5b0cd67dd6df-cert podName:dee70365-edd2-44fe-b49e-5b0cd67dd6df nodeName:}" failed. No retries permitted until 2026-03-20 08:42:09.801711745 +0000 UTC m=+1155.018612050 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/dee70365-edd2-44fe-b49e-5b0cd67dd6df-cert") pod "openstack-baremetal-operator-controller-manager-89d64c458-bsq62" (UID: "dee70365-edd2-44fe-b49e-5b0cd67dd6df") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 08:42:07 crc kubenswrapper[4903]: I0320 08:42:07.824147 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-6hjsx"] Mar 20 08:42:07 crc kubenswrapper[4903]: I0320 08:42:07.836681 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-5784578c99-9n5gb"] Mar 20 08:42:07 crc kubenswrapper[4903]: I0320 08:42:07.836722 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-pgssm" event={"ID":"fbb60c8b-9933-40bd-9a01-3463fa38fd41","Type":"ContainerStarted","Data":"49f8f96e95e65453cce14467c912c741796bc25305084b7d8ebe75c1c5e12d38"} Mar 20 08:42:07 crc kubenswrapper[4903]: I0320 08:42:07.842399 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-767865f676-gccx7" event={"ID":"b5fca5e8-3b5c-49f5-aae9-f13e1fef0111","Type":"ContainerStarted","Data":"861adc8cd2010dffd6ccd3e693119adc2e6663ac528aa7eca792f37460ca1b50"} Mar 20 08:42:07 crc kubenswrapper[4903]: I0320 08:42:07.849527 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-56pwc"] Mar 20 08:42:07 crc kubenswrapper[4903]: W0320 08:42:07.852296 4903 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc60316c6_cc32_497c_9955_d38de3103fdc.slice/crio-1d100bfbeee2895887dac9d4c1d5155da9f0fea71621b80461ebeef6bfa200fc WatchSource:0}: Error finding container 1d100bfbeee2895887dac9d4c1d5155da9f0fea71621b80461ebeef6bfa200fc: Status 404 returned error can't find the container with id 1d100bfbeee2895887dac9d4c1d5155da9f0fea71621b80461ebeef6bfa200fc Mar 20 08:42:07 crc kubenswrapper[4903]: E0320 08:42:07.854234 4903 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-8kptw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-6hjsx_openstack-operators(fc915c58-af04-4aac-81d9-43d88136f7df): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 20 08:42:07 crc kubenswrapper[4903]: E0320 08:42:07.855001 4903 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:866844c5b88e1e0518ceb7490cac9d093da3fb8b2f27ba7bd9bd89f946b9ee6e,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-vj4j2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-c674c5965-fxsc4_openstack-operators(c60316c6-cc32-497c-9955-d38de3103fdc): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 20 08:42:07 crc kubenswrapper[4903]: E0320 08:42:07.855604 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-6hjsx" podUID="fc915c58-af04-4aac-81d9-43d88136f7df" Mar 20 08:42:07 crc kubenswrapper[4903]: E0320 08:42:07.857851 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/swift-operator-controller-manager-c674c5965-fxsc4" podUID="c60316c6-cc32-497c-9955-d38de3103fdc" Mar 20 08:42:07 crc kubenswrapper[4903]: I0320 08:42:07.861629 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-884679f54-fp9lr"] Mar 20 08:42:07 crc kubenswrapper[4903]: I0320 08:42:07.866560 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-c674c5965-fxsc4"] Mar 20 08:42:07 crc kubenswrapper[4903]: E0320 08:42:07.870360 4903 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:bef93f71d3b42a72d8b96c69bdb4db4b8bd797c5093a0a719443d7a5c9aaab55,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 
-3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-tdzmb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-884679f54-fp9lr_openstack-operators(ae26b736-2e3e-4a77-83e1-7df0a04cd02b): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 20 08:42:07 crc kubenswrapper[4903]: E0320 08:42:07.872234 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/ovn-operator-controller-manager-884679f54-fp9lr" podUID="ae26b736-2e3e-4a77-83e1-7df0a04cd02b" Mar 20 08:42:07 crc kubenswrapper[4903]: I0320 08:42:07.948275 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-lntlm"] Mar 20 08:42:07 crc kubenswrapper[4903]: I0320 08:42:07.957410 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-d6b694c5-77bp7"] Mar 20 08:42:07 crc kubenswrapper[4903]: W0320 08:42:07.966872 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod79ef28ec_b069_4ee3_947b_92a5605c8d73.slice/crio-b07d5ca9f724c7da45e7dcf10c1655b455361b051b66fe272a2d0711234f7d78 WatchSource:0}: Error finding container b07d5ca9f724c7da45e7dcf10c1655b455361b051b66fe272a2d0711234f7d78: Status 404 returned error can't find the container with id b07d5ca9f724c7da45e7dcf10c1655b455361b051b66fe272a2d0711234f7d78 Mar 20 08:42:07 crc kubenswrapper[4903]: E0320 08:42:07.971189 4903 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:43bd420bc05b4789243740bc75f61e10c7aac7883fc2f82b2d4d50085bc96c42,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-mw6wp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5c5cb9c4d7-lntlm_openstack-operators(79ef28ec-b069-4ee3-947b-92a5605c8d73): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 20 08:42:07 crc kubenswrapper[4903]: E0320 08:42:07.972907 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-lntlm" podUID="79ef28ec-b069-4ee3-947b-92a5605c8d73" Mar 20 08:42:08 crc kubenswrapper[4903]: I0320 08:42:08.213750 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/34a31f63-98e3-445f-a11c-92a0fb057a4b-metrics-certs\") pod \"openstack-operator-controller-manager-85788d4595-jxnj2\" (UID: \"34a31f63-98e3-445f-a11c-92a0fb057a4b\") " pod="openstack-operators/openstack-operator-controller-manager-85788d4595-jxnj2" Mar 20 08:42:08 crc kubenswrapper[4903]: I0320 08:42:08.213960 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/34a31f63-98e3-445f-a11c-92a0fb057a4b-webhook-certs\") pod \"openstack-operator-controller-manager-85788d4595-jxnj2\" (UID: \"34a31f63-98e3-445f-a11c-92a0fb057a4b\") " pod="openstack-operators/openstack-operator-controller-manager-85788d4595-jxnj2" Mar 20 08:42:08 crc kubenswrapper[4903]: E0320 08:42:08.214015 4903 secret.go:188] Couldn't get secret 
openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 20 08:42:08 crc kubenswrapper[4903]: E0320 08:42:08.214192 4903 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 20 08:42:08 crc kubenswrapper[4903]: E0320 08:42:08.214203 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/34a31f63-98e3-445f-a11c-92a0fb057a4b-metrics-certs podName:34a31f63-98e3-445f-a11c-92a0fb057a4b nodeName:}" failed. No retries permitted until 2026-03-20 08:42:10.214131327 +0000 UTC m=+1155.431031642 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/34a31f63-98e3-445f-a11c-92a0fb057a4b-metrics-certs") pod "openstack-operator-controller-manager-85788d4595-jxnj2" (UID: "34a31f63-98e3-445f-a11c-92a0fb057a4b") : secret "metrics-server-cert" not found Mar 20 08:42:08 crc kubenswrapper[4903]: E0320 08:42:08.214301 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/34a31f63-98e3-445f-a11c-92a0fb057a4b-webhook-certs podName:34a31f63-98e3-445f-a11c-92a0fb057a4b nodeName:}" failed. No retries permitted until 2026-03-20 08:42:10.214266981 +0000 UTC m=+1155.431167316 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/34a31f63-98e3-445f-a11c-92a0fb057a4b-webhook-certs") pod "openstack-operator-controller-manager-85788d4595-jxnj2" (UID: "34a31f63-98e3-445f-a11c-92a0fb057a4b") : secret "webhook-server-cert" not found Mar 20 08:42:08 crc kubenswrapper[4903]: I0320 08:42:08.852575 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-56pwc" event={"ID":"a213486d-f613-4f65-a866-9e6bc349a1a9","Type":"ContainerStarted","Data":"716669d0143f57663db90d30978f1e9274c117c27388a35704c546d29496bc3d"} Mar 20 08:42:08 crc kubenswrapper[4903]: I0320 08:42:08.856827 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-5784578c99-9n5gb" event={"ID":"84b169f4-3d59-46de-b955-c8b2de1045f4","Type":"ContainerStarted","Data":"dbcf2d719b19256a4ecc51ed7daef67980d618200694edf0345a7bd10638fe42"} Mar 20 08:42:08 crc kubenswrapper[4903]: I0320 08:42:08.866529 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-lntlm" event={"ID":"79ef28ec-b069-4ee3-947b-92a5605c8d73","Type":"ContainerStarted","Data":"b07d5ca9f724c7da45e7dcf10c1655b455361b051b66fe272a2d0711234f7d78"} Mar 20 08:42:08 crc kubenswrapper[4903]: E0320 08:42:08.870442 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:43bd420bc05b4789243740bc75f61e10c7aac7883fc2f82b2d4d50085bc96c42\\\"\"" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-lntlm" podUID="79ef28ec-b069-4ee3-947b-92a5605c8d73" Mar 20 08:42:08 crc kubenswrapper[4903]: I0320 08:42:08.880822 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-884679f54-fp9lr" event={"ID":"ae26b736-2e3e-4a77-83e1-7df0a04cd02b","Type":"ContainerStarted","Data":"97e4764d993dc172759111987303bca09ec0e144c7927468ab8a4eb9840909a3"} Mar 20 08:42:08 crc kubenswrapper[4903]: I0320 08:42:08.884054 4903 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-6hjsx" event={"ID":"fc915c58-af04-4aac-81d9-43d88136f7df","Type":"ContainerStarted","Data":"b5f46800fc1214d42827cad7f550f3170a3017a074a67e2a1abf57878e789c59"} Mar 20 08:42:08 crc kubenswrapper[4903]: E0320 08:42:08.885702 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:bef93f71d3b42a72d8b96c69bdb4db4b8bd797c5093a0a719443d7a5c9aaab55\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-884679f54-fp9lr" podUID="ae26b736-2e3e-4a77-83e1-7df0a04cd02b" Mar 20 08:42:08 crc kubenswrapper[4903]: I0320 08:42:08.889274 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-c674c5965-fxsc4" event={"ID":"c60316c6-cc32-497c-9955-d38de3103fdc","Type":"ContainerStarted","Data":"1d100bfbeee2895887dac9d4c1d5155da9f0fea71621b80461ebeef6bfa200fc"} Mar 20 08:42:08 crc kubenswrapper[4903]: E0320 08:42:08.889798 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-6hjsx" podUID="fc915c58-af04-4aac-81d9-43d88136f7df" Mar 20 08:42:08 crc kubenswrapper[4903]: I0320 08:42:08.892969 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-77bp7" event={"ID":"8ed79f48-8d64-4133-9a1f-1aad870f1767","Type":"ContainerStarted","Data":"0a48ae9e0e4e16427c2ed376eb2144f92fd35c2a65420fec116fa22dce21b7f4"} Mar 20 08:42:08 crc kubenswrapper[4903]: E0320 08:42:08.894984 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:866844c5b88e1e0518ceb7490cac9d093da3fb8b2f27ba7bd9bd89f946b9ee6e\\\"\"" pod="openstack-operators/swift-operator-controller-manager-c674c5965-fxsc4" podUID="c60316c6-cc32-497c-9955-d38de3103fdc" Mar 20 08:42:08 crc kubenswrapper[4903]: I0320 08:42:08.896930 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-55f864c847-7hg7s" event={"ID":"52c17e50-83c9-46ae-8804-aba50e3ff916","Type":"ContainerStarted","Data":"14d7346807a6a51f34211148221fbe89ceda03a2773feaa06d5c7458f595d0f2"} Mar 20 08:42:09 crc kubenswrapper[4903]: I0320 08:42:09.641650 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c938f9a1-4273-4d7a-91f1-e430e43ef704-cert\") pod \"infra-operator-controller-manager-7b9c774f96-65sn9\" (UID: \"c938f9a1-4273-4d7a-91f1-e430e43ef704\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-65sn9" Mar 20 08:42:09 crc kubenswrapper[4903]: E0320 08:42:09.641908 4903 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 20 08:42:09 crc kubenswrapper[4903]: E0320 08:42:09.641976 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c938f9a1-4273-4d7a-91f1-e430e43ef704-cert 
podName:c938f9a1-4273-4d7a-91f1-e430e43ef704 nodeName:}" failed. No retries permitted until 2026-03-20 08:42:13.641955149 +0000 UTC m=+1158.858855464 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c938f9a1-4273-4d7a-91f1-e430e43ef704-cert") pod "infra-operator-controller-manager-7b9c774f96-65sn9" (UID: "c938f9a1-4273-4d7a-91f1-e430e43ef704") : secret "infra-operator-webhook-server-cert" not found Mar 20 08:42:09 crc kubenswrapper[4903]: I0320 08:42:09.844761 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dee70365-edd2-44fe-b49e-5b0cd67dd6df-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-bsq62\" (UID: \"dee70365-edd2-44fe-b49e-5b0cd67dd6df\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-bsq62" Mar 20 08:42:09 crc kubenswrapper[4903]: E0320 08:42:09.844969 4903 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 08:42:09 crc kubenswrapper[4903]: E0320 08:42:09.845053 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dee70365-edd2-44fe-b49e-5b0cd67dd6df-cert podName:dee70365-edd2-44fe-b49e-5b0cd67dd6df nodeName:}" failed. No retries permitted until 2026-03-20 08:42:13.845018777 +0000 UTC m=+1159.061919092 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/dee70365-edd2-44fe-b49e-5b0cd67dd6df-cert") pod "openstack-baremetal-operator-controller-manager-89d64c458-bsq62" (UID: "dee70365-edd2-44fe-b49e-5b0cd67dd6df") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 08:42:09 crc kubenswrapper[4903]: E0320 08:42:09.923950 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:bef93f71d3b42a72d8b96c69bdb4db4b8bd797c5093a0a719443d7a5c9aaab55\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-884679f54-fp9lr" podUID="ae26b736-2e3e-4a77-83e1-7df0a04cd02b" Mar 20 08:42:09 crc kubenswrapper[4903]: E0320 08:42:09.924316 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-6hjsx" podUID="fc915c58-af04-4aac-81d9-43d88136f7df" Mar 20 08:42:09 crc kubenswrapper[4903]: E0320 08:42:09.924531 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:43bd420bc05b4789243740bc75f61e10c7aac7883fc2f82b2d4d50085bc96c42\\\"\"" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-lntlm" podUID="79ef28ec-b069-4ee3-947b-92a5605c8d73" Mar 20 08:42:09 crc kubenswrapper[4903]: E0320 08:42:09.927921 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/swift-operator@sha256:866844c5b88e1e0518ceb7490cac9d093da3fb8b2f27ba7bd9bd89f946b9ee6e\\\"\"" pod="openstack-operators/swift-operator-controller-manager-c674c5965-fxsc4" podUID="c60316c6-cc32-497c-9955-d38de3103fdc" Mar 20 08:42:10 crc kubenswrapper[4903]: I0320 08:42:10.252606 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/34a31f63-98e3-445f-a11c-92a0fb057a4b-webhook-certs\") pod \"openstack-operator-controller-manager-85788d4595-jxnj2\" (UID: \"34a31f63-98e3-445f-a11c-92a0fb057a4b\") " pod="openstack-operators/openstack-operator-controller-manager-85788d4595-jxnj2" Mar 20 08:42:10 crc kubenswrapper[4903]: E0320 08:42:10.252895 4903 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 20 08:42:10 crc kubenswrapper[4903]: E0320 08:42:10.253122 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/34a31f63-98e3-445f-a11c-92a0fb057a4b-webhook-certs podName:34a31f63-98e3-445f-a11c-92a0fb057a4b nodeName:}" failed. No retries permitted until 2026-03-20 08:42:14.25309577 +0000 UTC m=+1159.469996085 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/34a31f63-98e3-445f-a11c-92a0fb057a4b-webhook-certs") pod "openstack-operator-controller-manager-85788d4595-jxnj2" (UID: "34a31f63-98e3-445f-a11c-92a0fb057a4b") : secret "webhook-server-cert" not found Mar 20 08:42:10 crc kubenswrapper[4903]: E0320 08:42:10.253217 4903 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 20 08:42:10 crc kubenswrapper[4903]: E0320 08:42:10.253343 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/34a31f63-98e3-445f-a11c-92a0fb057a4b-metrics-certs podName:34a31f63-98e3-445f-a11c-92a0fb057a4b nodeName:}" failed. No retries permitted until 2026-03-20 08:42:14.253318376 +0000 UTC m=+1159.470218691 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/34a31f63-98e3-445f-a11c-92a0fb057a4b-metrics-certs") pod "openstack-operator-controller-manager-85788d4595-jxnj2" (UID: "34a31f63-98e3-445f-a11c-92a0fb057a4b") : secret "metrics-server-cert" not found Mar 20 08:42:10 crc kubenswrapper[4903]: I0320 08:42:10.253459 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/34a31f63-98e3-445f-a11c-92a0fb057a4b-metrics-certs\") pod \"openstack-operator-controller-manager-85788d4595-jxnj2\" (UID: \"34a31f63-98e3-445f-a11c-92a0fb057a4b\") " pod="openstack-operators/openstack-operator-controller-manager-85788d4595-jxnj2" Mar 20 08:42:13 crc kubenswrapper[4903]: I0320 08:42:13.730704 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c938f9a1-4273-4d7a-91f1-e430e43ef704-cert\") pod \"infra-operator-controller-manager-7b9c774f96-65sn9\" (UID: \"c938f9a1-4273-4d7a-91f1-e430e43ef704\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-65sn9" Mar 20 08:42:13 crc kubenswrapper[4903]: E0320 08:42:13.730896 4903 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 20 08:42:13 crc kubenswrapper[4903]: E0320 08:42:13.731384 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c938f9a1-4273-4d7a-91f1-e430e43ef704-cert podName:c938f9a1-4273-4d7a-91f1-e430e43ef704 nodeName:}" failed. No retries permitted until 2026-03-20 08:42:21.731363211 +0000 UTC m=+1166.948263526 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c938f9a1-4273-4d7a-91f1-e430e43ef704-cert") pod "infra-operator-controller-manager-7b9c774f96-65sn9" (UID: "c938f9a1-4273-4d7a-91f1-e430e43ef704") : secret "infra-operator-webhook-server-cert" not found Mar 20 08:42:13 crc kubenswrapper[4903]: I0320 08:42:13.934771 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dee70365-edd2-44fe-b49e-5b0cd67dd6df-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-bsq62\" (UID: \"dee70365-edd2-44fe-b49e-5b0cd67dd6df\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-bsq62" Mar 20 08:42:13 crc kubenswrapper[4903]: E0320 08:42:13.934979 4903 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 08:42:13 crc kubenswrapper[4903]: E0320 08:42:13.935106 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dee70365-edd2-44fe-b49e-5b0cd67dd6df-cert podName:dee70365-edd2-44fe-b49e-5b0cd67dd6df nodeName:}" failed. No retries permitted until 2026-03-20 08:42:21.935085336 +0000 UTC m=+1167.151985651 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/dee70365-edd2-44fe-b49e-5b0cd67dd6df-cert") pod "openstack-baremetal-operator-controller-manager-89d64c458-bsq62" (UID: "dee70365-edd2-44fe-b49e-5b0cd67dd6df") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 08:42:14 crc kubenswrapper[4903]: I0320 08:42:14.341239 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/34a31f63-98e3-445f-a11c-92a0fb057a4b-metrics-certs\") pod \"openstack-operator-controller-manager-85788d4595-jxnj2\" (UID: \"34a31f63-98e3-445f-a11c-92a0fb057a4b\") " pod="openstack-operators/openstack-operator-controller-manager-85788d4595-jxnj2" Mar 20 08:42:14 crc kubenswrapper[4903]: I0320 08:42:14.341407 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/34a31f63-98e3-445f-a11c-92a0fb057a4b-webhook-certs\") pod \"openstack-operator-controller-manager-85788d4595-jxnj2\" (UID: \"34a31f63-98e3-445f-a11c-92a0fb057a4b\") " pod="openstack-operators/openstack-operator-controller-manager-85788d4595-jxnj2" Mar 20 08:42:14 crc kubenswrapper[4903]: E0320 08:42:14.341563 4903 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 20 08:42:14 crc kubenswrapper[4903]: E0320 08:42:14.341586 4903 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 20 08:42:14 crc kubenswrapper[4903]: E0320 08:42:14.341712 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/34a31f63-98e3-445f-a11c-92a0fb057a4b-metrics-certs podName:34a31f63-98e3-445f-a11c-92a0fb057a4b nodeName:}" failed. No retries permitted until 2026-03-20 08:42:22.341672622 +0000 UTC m=+1167.558572967 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/34a31f63-98e3-445f-a11c-92a0fb057a4b-metrics-certs") pod "openstack-operator-controller-manager-85788d4595-jxnj2" (UID: "34a31f63-98e3-445f-a11c-92a0fb057a4b") : secret "metrics-server-cert" not found Mar 20 08:42:14 crc kubenswrapper[4903]: E0320 08:42:14.341849 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/34a31f63-98e3-445f-a11c-92a0fb057a4b-webhook-certs podName:34a31f63-98e3-445f-a11c-92a0fb057a4b nodeName:}" failed. No retries permitted until 2026-03-20 08:42:22.341790615 +0000 UTC m=+1167.558690970 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/34a31f63-98e3-445f-a11c-92a0fb057a4b-webhook-certs") pod "openstack-operator-controller-manager-85788d4595-jxnj2" (UID: "34a31f63-98e3-445f-a11c-92a0fb057a4b") : secret "webhook-server-cert" not found Mar 20 08:42:19 crc kubenswrapper[4903]: E0320 08:42:19.421327 4903 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/heat-operator@sha256:c6ef5db244d874430a56c3cc9d27662e4bd57cdaa489e1f6059abcacf3aa0900" Mar 20 08:42:19 crc kubenswrapper[4903]: E0320 08:42:19.422202 4903 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/heat-operator@sha256:c6ef5db244d874430a56c3cc9d27662e4bd57cdaa489e1f6059abcacf3aa0900,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-s6dvb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod heat-operator-controller-manager-67dd5f86f5-cnjpn_openstack-operators(dbc3863b-0f31-47c1-af79-58e6387d5a18): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 08:42:19 crc kubenswrapper[4903]: E0320 08:42:19.423721 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-cnjpn" 
podUID="dbc3863b-0f31-47c1-af79-58e6387d5a18" Mar 20 08:42:20 crc kubenswrapper[4903]: E0320 08:42:20.027435 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/heat-operator@sha256:c6ef5db244d874430a56c3cc9d27662e4bd57cdaa489e1f6059abcacf3aa0900\\\"\"" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-cnjpn" podUID="dbc3863b-0f31-47c1-af79-58e6387d5a18" Mar 20 08:42:20 crc kubenswrapper[4903]: E0320 08:42:20.153941 4903 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/horizon-operator@sha256:703ad3a2b749bce100f1e2a445312b65dc3b8b45e8c8ba59f311d3f8f3368113" Mar 20 08:42:20 crc kubenswrapper[4903]: E0320 08:42:20.154271 4903 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/horizon-operator@sha256:703ad3a2b749bce100f1e2a445312b65dc3b8b45e8c8ba59f311d3f8f3368113,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-wsmc2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-operator-controller-manager-8464cc45fb-z9fqs_openstack-operators(c9fa68a8-9b69-40dc-a614-a7d85a9473f8): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 08:42:20 crc kubenswrapper[4903]: E0320 08:42:20.156192 4903 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-z9fqs" podUID="c9fa68a8-9b69-40dc-a614-a7d85a9473f8" Mar 20 08:42:20 crc kubenswrapper[4903]: I0320 08:42:20.493711 4903 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 08:42:20 crc kubenswrapper[4903]: E0320 08:42:20.782139 4903 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/telemetry-operator@sha256:c500fa7080b94105e85eeced772d8872e4168904e74ba02116e15ab66f522444" Mar 20 08:42:20 crc kubenswrapper[4903]: E0320 08:42:20.782365 4903 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:c500fa7080b94105e85eeced772d8872e4168904e74ba02116e15ab66f522444,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-9sljq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-d6b694c5-77bp7_openstack-operators(8ed79f48-8d64-4133-9a1f-1aad870f1767): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 08:42:20 crc kubenswrapper[4903]: E0320 08:42:20.785215 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: 
code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-77bp7" podUID="8ed79f48-8d64-4133-9a1f-1aad870f1767" Mar 20 08:42:20 crc kubenswrapper[4903]: I0320 08:42:20.833364 4903 patch_prober.go:28] interesting pod/machine-config-daemon-2ndsj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 08:42:20 crc kubenswrapper[4903]: I0320 08:42:20.833481 4903 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2ndsj" podUID="0e67af70-4211-4077-8f2b-0a00b8069e5a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 08:42:20 crc kubenswrapper[4903]: I0320 08:42:20.833597 4903 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2ndsj" Mar 20 08:42:20 crc kubenswrapper[4903]: I0320 08:42:20.834483 4903 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"0c24277c1ea9806e81aa4981e7afee5bd67d933c16e4a264dbdf97f39e69ac1c"} pod="openshift-machine-config-operator/machine-config-daemon-2ndsj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 08:42:20 crc kubenswrapper[4903]: I0320 08:42:20.834549 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2ndsj" podUID="0e67af70-4211-4077-8f2b-0a00b8069e5a" containerName="machine-config-daemon" containerID="cri-o://0c24277c1ea9806e81aa4981e7afee5bd67d933c16e4a264dbdf97f39e69ac1c" gracePeriod=600 Mar 20 08:42:21 crc kubenswrapper[4903]: I0320 08:42:21.025505 4903 generic.go:334] "Generic (PLEG): container finished" podID="0e67af70-4211-4077-8f2b-0a00b8069e5a" containerID="0c24277c1ea9806e81aa4981e7afee5bd67d933c16e4a264dbdf97f39e69ac1c" exitCode=0 Mar 20 08:42:21 crc kubenswrapper[4903]: I0320 08:42:21.025588 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2ndsj" event={"ID":"0e67af70-4211-4077-8f2b-0a00b8069e5a","Type":"ContainerDied","Data":"0c24277c1ea9806e81aa4981e7afee5bd67d933c16e4a264dbdf97f39e69ac1c"} Mar 20 08:42:21 crc kubenswrapper[4903]: I0320 08:42:21.025653 4903 scope.go:117] "RemoveContainer" containerID="5d4eaa665a94ad4629d1882b24e933477ef11d2d020096b0cd7d0be400cb4301" Mar 20 08:42:21 crc kubenswrapper[4903]: E0320 08:42:21.028240 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/horizon-operator@sha256:703ad3a2b749bce100f1e2a445312b65dc3b8b45e8c8ba59f311d3f8f3368113\\\"\"" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-z9fqs" podUID="c9fa68a8-9b69-40dc-a614-a7d85a9473f8" Mar 20 08:42:21 crc kubenswrapper[4903]: E0320 08:42:21.028738 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:c500fa7080b94105e85eeced772d8872e4168904e74ba02116e15ab66f522444\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-77bp7" podUID="8ed79f48-8d64-4133-9a1f-1aad870f1767" Mar 20 08:42:21 crc kubenswrapper[4903]: E0320 08:42:21.459150 4903 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/designate-operator@sha256:12841b27173f5f1beeb83112e057c8753f4cf411f583fba4f0610fac0f60b7ad" Mar 20 08:42:21 crc kubenswrapper[4903]: E0320 08:42:21.459381 4903 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/designate-operator@sha256:12841b27173f5f1beeb83112e057c8753f4cf411f583fba4f0610fac0f60b7ad,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-5l2sn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod designate-operator-controller-manager-588d4d986b-rwzl9_openstack-operators(e45265ca-9523-407d-b93a-16fc26817060): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 08:42:21 crc kubenswrapper[4903]: E0320 08:42:21.460823 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-rwzl9" 
podUID="e45265ca-9523-407d-b93a-16fc26817060" Mar 20 08:42:21 crc kubenswrapper[4903]: I0320 08:42:21.777204 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c938f9a1-4273-4d7a-91f1-e430e43ef704-cert\") pod \"infra-operator-controller-manager-7b9c774f96-65sn9\" (UID: \"c938f9a1-4273-4d7a-91f1-e430e43ef704\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-65sn9" Mar 20 08:42:21 crc kubenswrapper[4903]: I0320 08:42:21.784565 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c938f9a1-4273-4d7a-91f1-e430e43ef704-cert\") pod \"infra-operator-controller-manager-7b9c774f96-65sn9\" (UID: \"c938f9a1-4273-4d7a-91f1-e430e43ef704\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-65sn9" Mar 20 08:42:21 crc kubenswrapper[4903]: I0320 08:42:21.960459 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-65sn9" Mar 20 08:42:21 crc kubenswrapper[4903]: I0320 08:42:21.981068 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dee70365-edd2-44fe-b49e-5b0cd67dd6df-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-bsq62\" (UID: \"dee70365-edd2-44fe-b49e-5b0cd67dd6df\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-bsq62" Mar 20 08:42:21 crc kubenswrapper[4903]: I0320 08:42:21.984817 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dee70365-edd2-44fe-b49e-5b0cd67dd6df-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-bsq62\" (UID: \"dee70365-edd2-44fe-b49e-5b0cd67dd6df\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-bsq62" Mar 20 08:42:22 crc kubenswrapper[4903]: I0320 08:42:22.023197 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-bsq62" Mar 20 08:42:22 crc kubenswrapper[4903]: E0320 08:42:22.042829 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/designate-operator@sha256:12841b27173f5f1beeb83112e057c8753f4cf411f583fba4f0610fac0f60b7ad\\\"\"" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-rwzl9" podUID="e45265ca-9523-407d-b93a-16fc26817060" Mar 20 08:42:22 crc kubenswrapper[4903]: I0320 08:42:22.389092 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/34a31f63-98e3-445f-a11c-92a0fb057a4b-metrics-certs\") pod \"openstack-operator-controller-manager-85788d4595-jxnj2\" (UID: \"34a31f63-98e3-445f-a11c-92a0fb057a4b\") " pod="openstack-operators/openstack-operator-controller-manager-85788d4595-jxnj2" Mar 20 08:42:22 crc kubenswrapper[4903]: I0320 08:42:22.389332 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/34a31f63-98e3-445f-a11c-92a0fb057a4b-webhook-certs\") pod \"openstack-operator-controller-manager-85788d4595-jxnj2\" (UID: \"34a31f63-98e3-445f-a11c-92a0fb057a4b\") " pod="openstack-operators/openstack-operator-controller-manager-85788d4595-jxnj2" Mar 20 08:42:22 crc kubenswrapper[4903]: I0320 08:42:22.394846 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/34a31f63-98e3-445f-a11c-92a0fb057a4b-metrics-certs\") pod \"openstack-operator-controller-manager-85788d4595-jxnj2\" (UID: \"34a31f63-98e3-445f-a11c-92a0fb057a4b\") " pod="openstack-operators/openstack-operator-controller-manager-85788d4595-jxnj2" Mar 20 08:42:22 crc kubenswrapper[4903]: I0320 08:42:22.406012 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/34a31f63-98e3-445f-a11c-92a0fb057a4b-webhook-certs\") pod \"openstack-operator-controller-manager-85788d4595-jxnj2\" (UID: \"34a31f63-98e3-445f-a11c-92a0fb057a4b\") " pod="openstack-operators/openstack-operator-controller-manager-85788d4595-jxnj2" Mar 20 08:42:22 crc kubenswrapper[4903]: I0320 08:42:22.430453 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-85788d4595-jxnj2" Mar 20 08:42:23 crc kubenswrapper[4903]: E0320 08:42:23.683601 4903 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/barbican-operator@sha256:7562d3e09bdac17f447f4523c5bd784c5f5ab5ca9cb2370a03b86126d6d7301d" Mar 20 08:42:23 crc kubenswrapper[4903]: E0320 08:42:23.684245 4903 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/barbican-operator@sha256:7562d3e09bdac17f447f4523c5bd784c5f5ab5ca9cb2370a03b86126d6d7301d,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-mh67l,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-operator-controller-manager-59bc569d95-kzflj_openstack-operators(ce3b9536-0686-4544-9ddb-c8e197b5d24a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 08:42:23 crc kubenswrapper[4903]: E0320 08:42:23.687474 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-kzflj" podUID="ce3b9536-0686-4544-9ddb-c8e197b5d24a" Mar 20 08:42:24 crc kubenswrapper[4903]: E0320 08:42:24.064312 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/barbican-operator@sha256:7562d3e09bdac17f447f4523c5bd784c5f5ab5ca9cb2370a03b86126d6d7301d\\\"\"" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-kzflj" podUID="ce3b9536-0686-4544-9ddb-c8e197b5d24a" Mar 20 08:42:24 crc kubenswrapper[4903]: E0320 08:42:24.412129 4903 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/keystone-operator@sha256:ec36a9083657587022f8471c9d5a71b87a7895398496e7fc546c73aa1eae4b56" Mar 20 08:42:24 crc kubenswrapper[4903]: E0320 08:42:24.412452 4903 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/keystone-operator@sha256:ec36a9083657587022f8471c9d5a71b87a7895398496e7fc546c73aa1eae4b56,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-v7n6d,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-768b96df4c-pgssm_openstack-operators(fbb60c8b-9933-40bd-9a01-3463fa38fd41): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 08:42:24 crc kubenswrapper[4903]: E0320 08:42:24.413674 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-pgssm" podUID="fbb60c8b-9933-40bd-9a01-3463fa38fd41" Mar 20 08:42:24 crc kubenswrapper[4903]: E0320 08:42:24.963059 4903 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/nova-operator@sha256:7398eb8fa5a4844d3326a5dff759d17199870c389b3ce3011a038b27bf95512a" Mar 20 08:42:24 crc kubenswrapper[4903]: E0320 08:42:24.963275 4903 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:7398eb8fa5a4844d3326a5dff759d17199870c389b3ce3011a038b27bf95512a,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-768l5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-5d488d59fb-rgwvj_openstack-operators(db9e032d-63e4-44e3-99d6-55c13c900127): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 08:42:24 crc kubenswrapper[4903]: E0320 08:42:24.964639 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-rgwvj" podUID="db9e032d-63e4-44e3-99d6-55c13c900127" Mar 20 08:42:25 crc kubenswrapper[4903]: E0320 08:42:25.091651 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:ec36a9083657587022f8471c9d5a71b87a7895398496e7fc546c73aa1eae4b56\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-pgssm" podUID="fbb60c8b-9933-40bd-9a01-3463fa38fd41" Mar 20 08:42:25 crc kubenswrapper[4903]: E0320 08:42:25.098828 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:7398eb8fa5a4844d3326a5dff759d17199870c389b3ce3011a038b27bf95512a\\\"\"" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-rgwvj" podUID="db9e032d-63e4-44e3-99d6-55c13c900127" Mar 20 08:42:28 crc kubenswrapper[4903]: I0320 08:42:28.892967 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-85788d4595-jxnj2"] Mar 20 08:42:28 crc kubenswrapper[4903]: W0320 08:42:28.908890 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod34a31f63_98e3_445f_a11c_92a0fb057a4b.slice/crio-44074ef19d3aca4d6b530858d5c905d9558367aef85c97441f6764e18ae4e622 WatchSource:0}: Error finding container 44074ef19d3aca4d6b530858d5c905d9558367aef85c97441f6764e18ae4e622: Status 404 returned error can't find the container with id 44074ef19d3aca4d6b530858d5c905d9558367aef85c97441f6764e18ae4e622 Mar 20 08:42:28 crc kubenswrapper[4903]: I0320 08:42:28.982849 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-7b9c774f96-65sn9"] Mar 20 08:42:28 crc kubenswrapper[4903]: I0320 08:42:28.992391 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-bsq62"] Mar 20 08:42:29 crc kubenswrapper[4903]: W0320 08:42:29.007457 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc938f9a1_4273_4d7a_91f1_e430e43ef704.slice/crio-f49fc88d1b8d9c595c6cea5b091ee20a5dc006f39c75a130e33259314b3d0e92 WatchSource:0}: Error finding container f49fc88d1b8d9c595c6cea5b091ee20a5dc006f39c75a130e33259314b3d0e92: Status 404 returned error can't find the container with id f49fc88d1b8d9c595c6cea5b091ee20a5dc006f39c75a130e33259314b3d0e92 Mar 20 08:42:29 crc kubenswrapper[4903]: W0320 08:42:29.020132 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddee70365_edd2_44fe_b49e_5b0cd67dd6df.slice/crio-413f647f540f108d8796959bdcc9a27aff3ee203c7e43a58342260c80bfab44f WatchSource:0}: Error finding container 413f647f540f108d8796959bdcc9a27aff3ee203c7e43a58342260c80bfab44f: Status 404 returned error can't find the container with id 413f647f540f108d8796959bdcc9a27aff3ee203c7e43a58342260c80bfab44f Mar 20 08:42:29 crc kubenswrapper[4903]: I0320 08:42:29.172625 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-lhp57" event={"ID":"a560c084-9049-431f-94bc-60bd2639b801","Type":"ContainerStarted","Data":"367ef6054b00ba100cef2950ffa0bf00c1509493061ba50d6915e6f1271a51b9"} Mar 20 08:42:29 crc kubenswrapper[4903]: I0320 08:42:29.174165 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-lhp57" Mar 20 08:42:29 crc kubenswrapper[4903]: I0320 08:42:29.181449 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-f4b4f" event={"ID":"41774efc-3c12-43fb-b4a3-023e5e4811f5","Type":"ContainerStarted","Data":"f9907c3a262364feb51e18ae15c0659b41528391e6db3e0cb984db1be53d0dbb"} Mar 20 08:42:29 crc kubenswrapper[4903]: I0320 08:42:29.182256 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-f4b4f" Mar 20 08:42:29 crc kubenswrapper[4903]: I0320 08:42:29.185002 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-85788d4595-jxnj2" event={"ID":"34a31f63-98e3-445f-a11c-92a0fb057a4b","Type":"ContainerStarted","Data":"44074ef19d3aca4d6b530858d5c905d9558367aef85c97441f6764e18ae4e622"} Mar 20 08:42:29 crc kubenswrapper[4903]: I0320 08:42:29.187465 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-9pxrs" event={"ID":"246db1f4-c0bc-4152-9275-dec8e8ca6233","Type":"ContainerStarted","Data":"946bc9e197c77efe84822a65f90d26a8d71a1c0098eb6540ab814654acdda1dd"} Mar 20 08:42:29 crc kubenswrapper[4903]: I0320 08:42:29.187943 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-9pxrs" Mar 20 08:42:29 crc kubenswrapper[4903]: I0320 08:42:29.195622 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-bsq62" event={"ID":"dee70365-edd2-44fe-b49e-5b0cd67dd6df","Type":"ContainerStarted","Data":"413f647f540f108d8796959bdcc9a27aff3ee203c7e43a58342260c80bfab44f"} Mar 20 08:42:29 crc kubenswrapper[4903]: I0320 08:42:29.203559 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-lhp57" podStartSLOduration=6.480557862 podStartE2EDuration="24.203535705s" podCreationTimestamp="2026-03-20 08:42:05 +0000 UTC" firstStartedPulling="2026-03-20 08:42:07.247839063 +0000 UTC m=+1152.464739378" lastFinishedPulling="2026-03-20 08:42:24.970816906 +0000 UTC m=+1170.187717221" observedRunningTime="2026-03-20 08:42:29.201085804 +0000 UTC m=+1174.417986119" watchObservedRunningTime="2026-03-20 08:42:29.203535705 +0000 UTC m=+1174.420436010" Mar 20 08:42:29 crc kubenswrapper[4903]: I0320 08:42:29.219698 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-767865f676-gccx7" event={"ID":"b5fca5e8-3b5c-49f5-aae9-f13e1fef0111","Type":"ContainerStarted","Data":"1bf27ec4c330753abde467a58034ee6a9bf057485a3a5dc86348468556c9ab70"} Mar 20 08:42:29 crc kubenswrapper[4903]: I0320 08:42:29.220565 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-767865f676-gccx7" Mar 20 08:42:29 crc kubenswrapper[4903]: I0320 08:42:29.237814 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-55f864c847-7hg7s" event={"ID":"52c17e50-83c9-46ae-8804-aba50e3ff916","Type":"ContainerStarted","Data":"e070cfdf5140e152e7233d9ad4188af38167efdddc0716b10902bd60dbe5a72c"} Mar 20 08:42:29 crc kubenswrapper[4903]: I0320 08:42:29.237952 4903 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-55f864c847-7hg7s" Mar 20 08:42:29 crc kubenswrapper[4903]: I0320 08:42:29.240225 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-f4b4f" podStartSLOduration=6.288270935 podStartE2EDuration="24.240214701s" podCreationTimestamp="2026-03-20 08:42:05 +0000 UTC" firstStartedPulling="2026-03-20 08:42:07.019402433 +0000 UTC m=+1152.236302748" lastFinishedPulling="2026-03-20 08:42:24.971346199 +0000 UTC m=+1170.188246514" observedRunningTime="2026-03-20 08:42:29.234940309 +0000 UTC m=+1174.451840624" watchObservedRunningTime="2026-03-20 08:42:29.240214701 +0000 UTC m=+1174.457115016" Mar 20 08:42:29 crc kubenswrapper[4903]: I0320 08:42:29.250027 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-56pwc" event={"ID":"a213486d-f613-4f65-a866-9e6bc349a1a9","Type":"ContainerStarted","Data":"f5e2e0b0f209ecca048021afa962347f7997cc2a17580bf7ef090e0751bdc1c7"} Mar 20 08:42:29 crc kubenswrapper[4903]: I0320 08:42:29.251134 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-56pwc" Mar 20 08:42:29 crc kubenswrapper[4903]: I0320 08:42:29.265726 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-9pxrs" podStartSLOduration=5.8797406290000005 podStartE2EDuration="24.265708047s" podCreationTimestamp="2026-03-20 08:42:05 +0000 UTC" firstStartedPulling="2026-03-20 08:42:06.585353631 +0000 UTC m=+1151.802253946" lastFinishedPulling="2026-03-20 08:42:24.971321049 +0000 UTC m=+1170.188221364" observedRunningTime="2026-03-20 08:42:29.261589964 +0000 UTC m=+1174.478490279" watchObservedRunningTime="2026-03-20 08:42:29.265708047 +0000 UTC m=+1174.482608362" Mar 20 08:42:29 crc kubenswrapper[4903]: I0320 08:42:29.281489 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-65sn9" event={"ID":"c938f9a1-4273-4d7a-91f1-e430e43ef704","Type":"ContainerStarted","Data":"f49fc88d1b8d9c595c6cea5b091ee20a5dc006f39c75a130e33259314b3d0e92"} Mar 20 08:42:29 crc kubenswrapper[4903]: I0320 08:42:29.290179 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-767865f676-gccx7" podStartSLOduration=6.9181100220000005 podStartE2EDuration="24.290154877s" podCreationTimestamp="2026-03-20 08:42:05 +0000 UTC" firstStartedPulling="2026-03-20 08:42:07.610801211 +0000 UTC m=+1152.827701526" lastFinishedPulling="2026-03-20 08:42:24.982846066 +0000 UTC m=+1170.199746381" observedRunningTime="2026-03-20 08:42:29.284487476 +0000 UTC m=+1174.501387791" watchObservedRunningTime="2026-03-20 08:42:29.290154877 +0000 UTC m=+1174.507055192" Mar 20 08:42:29 crc kubenswrapper[4903]: I0320 08:42:29.305554 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-9sv94" event={"ID":"73cea0b7-43b8-491a-b6f3-c8ec8563583f","Type":"ContainerStarted","Data":"1fdf4ccfb27e2bed060de6a10096dd27708fa5e14a0767365984b3254e23c80e"} Mar 20 08:42:29 crc kubenswrapper[4903]: I0320 08:42:29.305945 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-9sv94" Mar 20 08:42:29 crc kubenswrapper[4903]: I0320 08:42:29.318740 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-c674c5965-fxsc4" event={"ID":"c60316c6-cc32-497c-9955-d38de3103fdc","Type":"ContainerStarted","Data":"a263aa01e66f74bba03b7d11394f0b5d338f0c3376ff7471356dcb913328e5a9"} Mar 20 08:42:29 crc kubenswrapper[4903]: I0320 08:42:29.319667 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-c674c5965-fxsc4" Mar 20 08:42:29 crc kubenswrapper[4903]: I0320 08:42:29.342948 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-56pwc" podStartSLOduration=6.220792044 podStartE2EDuration="23.342925023s" podCreationTimestamp="2026-03-20 08:42:06 +0000 UTC" firstStartedPulling="2026-03-20 08:42:07.849176379 +0000 UTC m=+1153.066076704" lastFinishedPulling="2026-03-20 08:42:24.971309368 +0000 UTC m=+1170.188209683" observedRunningTime="2026-03-20 08:42:29.319440758 +0000 UTC m=+1174.536341073" watchObservedRunningTime="2026-03-20 08:42:29.342925023 +0000 UTC m=+1174.559825338" Mar 20 08:42:29 crc kubenswrapper[4903]: I0320 08:42:29.345497 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2ndsj" event={"ID":"0e67af70-4211-4077-8f2b-0a00b8069e5a","Type":"ContainerStarted","Data":"7094e5f77c270dc626be780c469f09df2b6b5e5f309bca7fa5e8149bdd6f3199"} Mar 20 08:42:29 crc kubenswrapper[4903]: I0320 08:42:29.353250 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-55f864c847-7hg7s" podStartSLOduration=7.213173735 podStartE2EDuration="24.353224991s" podCreationTimestamp="2026-03-20 08:42:05 +0000 UTC" firstStartedPulling="2026-03-20 08:42:07.831408756 +0000 UTC m=+1153.048309071" lastFinishedPulling="2026-03-20 08:42:24.971460012 +0000 UTC m=+1170.188360327" observedRunningTime="2026-03-20 08:42:29.351048716 +0000 UTC m=+1174.567949031" watchObservedRunningTime="2026-03-20 08:42:29.353224991 +0000 UTC m=+1174.570125306" Mar 20 08:42:29 crc kubenswrapper[4903]: I0320 08:42:29.357141 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-bbz45" event={"ID":"d103b53c-f076-4441-8ff4-c6a3be6ac200","Type":"ContainerStarted","Data":"495f5e791c496553bd924ce36862a197323675fb5b3acfc49dd6e5a4af4a5194"} Mar 20 08:42:29 crc kubenswrapper[4903]: I0320 08:42:29.357925 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-bbz45" Mar 20 08:42:29 crc kubenswrapper[4903]: I0320 08:42:29.364612 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-5784578c99-9n5gb" event={"ID":"84b169f4-3d59-46de-b955-c8b2de1045f4","Type":"ContainerStarted","Data":"168833f88c96907cf217b0451ec17bad0dabe55b9aac7130759f148390bec989"} Mar 20 08:42:29 crc kubenswrapper[4903]: I0320 08:42:29.365645 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-5784578c99-9n5gb" Mar 20 08:42:29 crc kubenswrapper[4903]: I0320 08:42:29.395910 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/swift-operator-controller-manager-c674c5965-fxsc4" podStartSLOduration=3.570847838 podStartE2EDuration="24.395881165s" podCreationTimestamp="2026-03-20 08:42:05 +0000 UTC" firstStartedPulling="2026-03-20 08:42:07.854862551 +0000 UTC m=+1153.071762866" lastFinishedPulling="2026-03-20 08:42:28.679895868 +0000 UTC m=+1173.896796193" observedRunningTime="2026-03-20 08:42:29.391388433 +0000 UTC m=+1174.608288748" watchObservedRunningTime="2026-03-20 08:42:29.395881165 +0000 UTC m=+1174.612781480" Mar 20 08:42:29 crc kubenswrapper[4903]: I0320 08:42:29.462717 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-9sv94" podStartSLOduration=7.207629957 podStartE2EDuration="24.462690583s" podCreationTimestamp="2026-03-20 08:42:05 +0000 UTC" firstStartedPulling="2026-03-20 08:42:07.714496769 +0000 UTC m=+1152.931397084" lastFinishedPulling="2026-03-20 08:42:24.969557395 +0000 UTC m=+1170.186457710" observedRunningTime="2026-03-20 08:42:29.44215731 +0000 UTC m=+1174.659057635" watchObservedRunningTime="2026-03-20 08:42:29.462690583 +0000 UTC m=+1174.679590898" Mar 20 08:42:29 crc kubenswrapper[4903]: I0320 08:42:29.463740 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-bbz45" podStartSLOduration=7.103385505 podStartE2EDuration="24.463732149s" podCreationTimestamp="2026-03-20 08:42:05 +0000 UTC" firstStartedPulling="2026-03-20 08:42:07.610310319 +0000 UTC m=+1152.827210634" lastFinishedPulling="2026-03-20 08:42:24.970656963 +0000 UTC m=+1170.187557278" observedRunningTime="2026-03-20 08:42:29.459950624 +0000 UTC m=+1174.676850939" watchObservedRunningTime="2026-03-20 08:42:29.463732149 +0000 UTC m=+1174.680632464" Mar 20 08:42:29 crc kubenswrapper[4903]: I0320 08:42:29.492064 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-5784578c99-9n5gb" podStartSLOduration=7.352135712 podStartE2EDuration="24.492017314s" podCreationTimestamp="2026-03-20 08:42:05 +0000 UTC" firstStartedPulling="2026-03-20 08:42:07.831347074 +0000 UTC m=+1153.048247389" lastFinishedPulling="2026-03-20 08:42:24.971228676 +0000 UTC m=+1170.188128991" observedRunningTime="2026-03-20 08:42:29.480684232 +0000 UTC m=+1174.697584557" watchObservedRunningTime="2026-03-20 08:42:29.492017314 +0000 UTC m=+1174.708917629" Mar 20 08:42:30 crc kubenswrapper[4903]: I0320 08:42:30.373661 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-6hjsx" event={"ID":"fc915c58-af04-4aac-81d9-43d88136f7df","Type":"ContainerStarted","Data":"3682806ac9f51cf3997991d331d1e28b5fb556f1f1c6d2f5ec18188dcc597177"} Mar 20 08:42:30 crc kubenswrapper[4903]: I0320 08:42:30.390313 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-85788d4595-jxnj2" event={"ID":"34a31f63-98e3-445f-a11c-92a0fb057a4b","Type":"ContainerStarted","Data":"0169687957cb65ac2d4597b731a306173c1db050382803ce995f12c1714879d9"} Mar 20 08:42:30 crc kubenswrapper[4903]: I0320 08:42:30.390501 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-85788d4595-jxnj2" Mar 20 08:42:30 crc kubenswrapper[4903]: I0320 08:42:30.398148 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-lntlm" event={"ID":"79ef28ec-b069-4ee3-947b-92a5605c8d73","Type":"ContainerStarted","Data":"e48e63093181394cedc1e85491cb79fb13e6ff0b0c4f150714598a504a1ebf77"} Mar 20 08:42:30 crc kubenswrapper[4903]: I0320 08:42:30.398307 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-lntlm" Mar 20 08:42:30 crc kubenswrapper[4903]: I0320 08:42:30.400180 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-884679f54-fp9lr" event={"ID":"ae26b736-2e3e-4a77-83e1-7df0a04cd02b","Type":"ContainerStarted","Data":"68582f8705b81168e750fa51eb16a2a5fd13bf5fe155362ff6477e3267948268"} Mar 20 08:42:30 crc kubenswrapper[4903]: I0320 08:42:30.400549 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-884679f54-fp9lr" Mar 20 08:42:30 crc kubenswrapper[4903]: I0320 08:42:30.402147 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-6hjsx" podStartSLOduration=3.575687006 podStartE2EDuration="24.402137167s" podCreationTimestamp="2026-03-20 08:42:06 +0000 UTC" firstStartedPulling="2026-03-20 08:42:07.853945959 +0000 UTC m=+1153.070846274" lastFinishedPulling="2026-03-20 08:42:28.68039612 +0000 UTC m=+1173.897296435" observedRunningTime="2026-03-20 08:42:30.398955218 +0000 UTC m=+1175.615855533" watchObservedRunningTime="2026-03-20 08:42:30.402137167 +0000 UTC m=+1175.619037482" Mar 20 08:42:30 crc kubenswrapper[4903]: I0320 08:42:30.436442 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-lntlm" podStartSLOduration=4.853555379 podStartE2EDuration="25.436424432s" podCreationTimestamp="2026-03-20 08:42:05 +0000 UTC" firstStartedPulling="2026-03-20 08:42:07.970887147 +0000 UTC m=+1153.187787462" lastFinishedPulling="2026-03-20 08:42:28.5537562 +0000 UTC m=+1173.770656515" observedRunningTime="2026-03-20 08:42:30.429021138 +0000 UTC m=+1175.645921453" watchObservedRunningTime="2026-03-20 08:42:30.436424432 +0000 UTC m=+1175.653324737" Mar 20 08:42:30 crc kubenswrapper[4903]: I0320 08:42:30.496518 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-85788d4595-jxnj2" podStartSLOduration=24.496492511 podStartE2EDuration="24.496492511s" podCreationTimestamp="2026-03-20 08:42:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:42:30.457428076 +0000 UTC m=+1175.674328391" watchObservedRunningTime="2026-03-20 08:42:30.496492511 +0000 UTC m=+1175.713392816" Mar 20 08:42:30 crc kubenswrapper[4903]: I0320 08:42:30.515101 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-884679f54-fp9lr" podStartSLOduration=4.834906044 podStartE2EDuration="25.515063614s" podCreationTimestamp="2026-03-20 08:42:05 +0000 UTC" firstStartedPulling="2026-03-20 08:42:07.870098322 +0000 UTC m=+1153.086998637" lastFinishedPulling="2026-03-20 08:42:28.550255872 +0000 UTC m=+1173.767156207" observedRunningTime="2026-03-20 08:42:30.484453901 +0000 UTC m=+1175.701354216" watchObservedRunningTime="2026-03-20 08:42:30.515063614 +0000 UTC 
m=+1175.731963929" Mar 20 08:42:31 crc kubenswrapper[4903]: I0320 08:42:31.411381 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-cnjpn" event={"ID":"dbc3863b-0f31-47c1-af79-58e6387d5a18","Type":"ContainerStarted","Data":"30eae5dece6fadf5b4604086919de9889a07d6cd4eda1c48c31fdba3f32df5a2"} Mar 20 08:42:33 crc kubenswrapper[4903]: I0320 08:42:33.425751 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-bsq62" event={"ID":"dee70365-edd2-44fe-b49e-5b0cd67dd6df","Type":"ContainerStarted","Data":"ba4140b7e045058e434a2d14371b813f8982e5b1719413a63fbc6cf97cbd2d74"} Mar 20 08:42:33 crc kubenswrapper[4903]: I0320 08:42:33.426393 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-bsq62" Mar 20 08:42:33 crc kubenswrapper[4903]: I0320 08:42:33.427806 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-65sn9" event={"ID":"c938f9a1-4273-4d7a-91f1-e430e43ef704","Type":"ContainerStarted","Data":"f30a7cc770f2b1336ae72a3030351e6cb409de58961b43da84e9297d59683698"} Mar 20 08:42:33 crc kubenswrapper[4903]: I0320 08:42:33.428016 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-65sn9" Mar 20 08:42:33 crc kubenswrapper[4903]: I0320 08:42:33.470168 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-bsq62" podStartSLOduration=24.909588995 podStartE2EDuration="28.47014828s" podCreationTimestamp="2026-03-20 08:42:05 +0000 UTC" firstStartedPulling="2026-03-20 08:42:29.022779795 +0000 UTC m=+1174.239680110" lastFinishedPulling="2026-03-20 08:42:32.58333907 +0000 UTC m=+1177.800239395" observedRunningTime="2026-03-20 08:42:33.466250343 +0000 UTC m=+1178.683150668" watchObservedRunningTime="2026-03-20 08:42:33.47014828 +0000 UTC m=+1178.687048605" Mar 20 08:42:33 crc kubenswrapper[4903]: I0320 08:42:33.472601 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-cnjpn" podStartSLOduration=4.609697994 podStartE2EDuration="28.472586731s" podCreationTimestamp="2026-03-20 08:42:05 +0000 UTC" firstStartedPulling="2026-03-20 08:42:07.034305364 +0000 UTC m=+1152.251205679" lastFinishedPulling="2026-03-20 08:42:30.897194101 +0000 UTC m=+1176.114094416" observedRunningTime="2026-03-20 08:42:31.434707835 +0000 UTC m=+1176.651608180" watchObservedRunningTime="2026-03-20 08:42:33.472586731 +0000 UTC m=+1178.689487076" Mar 20 08:42:33 crc kubenswrapper[4903]: I0320 08:42:33.493470 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-65sn9" podStartSLOduration=24.923244697 podStartE2EDuration="28.493446412s" podCreationTimestamp="2026-03-20 08:42:05 +0000 UTC" firstStartedPulling="2026-03-20 08:42:29.010650732 +0000 UTC m=+1174.227551047" lastFinishedPulling="2026-03-20 08:42:32.580852437 +0000 UTC m=+1177.797752762" observedRunningTime="2026-03-20 08:42:33.486050967 +0000 UTC m=+1178.702951282" watchObservedRunningTime="2026-03-20 08:42:33.493446412 +0000 UTC m=+1178.710346727" Mar 20 08:42:34 crc kubenswrapper[4903]: I0320 08:42:34.447251 
4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-rwzl9" event={"ID":"e45265ca-9523-407d-b93a-16fc26817060","Type":"ContainerStarted","Data":"46d050f95ce61fed6833e1e542a15b5d02e1658e7ba318ac3339ed265f841b63"} Mar 20 08:42:34 crc kubenswrapper[4903]: I0320 08:42:34.448124 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-rwzl9" Mar 20 08:42:34 crc kubenswrapper[4903]: I0320 08:42:34.475150 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-rwzl9" podStartSLOduration=2.82509041 podStartE2EDuration="29.47511961s" podCreationTimestamp="2026-03-20 08:42:05 +0000 UTC" firstStartedPulling="2026-03-20 08:42:07.253418223 +0000 UTC m=+1152.470318538" lastFinishedPulling="2026-03-20 08:42:33.903447413 +0000 UTC m=+1179.120347738" observedRunningTime="2026-03-20 08:42:34.463765727 +0000 UTC m=+1179.680666082" watchObservedRunningTime="2026-03-20 08:42:34.47511961 +0000 UTC m=+1179.692019965" Mar 20 08:42:35 crc kubenswrapper[4903]: I0320 08:42:35.454666 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-77bp7" event={"ID":"8ed79f48-8d64-4133-9a1f-1aad870f1767","Type":"ContainerStarted","Data":"72481124dd9cdccaeb799956a4571ae47c918f8028a1d0c5a72af45e5d99ba48"} Mar 20 08:42:35 crc kubenswrapper[4903]: I0320 08:42:35.455921 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-77bp7" Mar 20 08:42:35 crc kubenswrapper[4903]: I0320 08:42:35.480473 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-77bp7" podStartSLOduration=3.452370472 podStartE2EDuration="30.480454428s" podCreationTimestamp="2026-03-20 08:42:05 +0000 UTC" firstStartedPulling="2026-03-20 08:42:07.966218921 +0000 UTC m=+1153.183119236" lastFinishedPulling="2026-03-20 08:42:34.994302877 +0000 UTC m=+1180.211203192" observedRunningTime="2026-03-20 08:42:35.475008202 +0000 UTC m=+1180.691908537" watchObservedRunningTime="2026-03-20 08:42:35.480454428 +0000 UTC m=+1180.697354743" Mar 20 08:42:35 crc kubenswrapper[4903]: I0320 08:42:35.800439 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-9pxrs" Mar 20 08:42:35 crc kubenswrapper[4903]: I0320 08:42:35.976026 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-f4b4f" Mar 20 08:42:35 crc kubenswrapper[4903]: I0320 08:42:35.984167 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-cnjpn" Mar 20 08:42:36 crc kubenswrapper[4903]: I0320 08:42:36.223700 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-55f864c847-7hg7s" Mar 20 08:42:36 crc kubenswrapper[4903]: I0320 08:42:36.248486 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-lhp57" Mar 20 08:42:36 crc kubenswrapper[4903]: I0320 08:42:36.284482 4903 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-767865f676-gccx7" Mar 20 08:42:36 crc kubenswrapper[4903]: I0320 08:42:36.346906 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-9sv94" Mar 20 08:42:36 crc kubenswrapper[4903]: I0320 08:42:36.390170 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-bbz45" Mar 20 08:42:36 crc kubenswrapper[4903]: I0320 08:42:36.399956 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-884679f54-fp9lr" Mar 20 08:42:36 crc kubenswrapper[4903]: I0320 08:42:36.468114 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-kzflj" event={"ID":"ce3b9536-0686-4544-9ddb-c8e197b5d24a","Type":"ContainerStarted","Data":"81ed0d6192150f674536e7c0383291694adad4c0a78fec28dbec27ec66e6526b"} Mar 20 08:42:36 crc kubenswrapper[4903]: I0320 08:42:36.471163 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-kzflj" Mar 20 08:42:36 crc kubenswrapper[4903]: I0320 08:42:36.471718 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-5784578c99-9n5gb" Mar 20 08:42:36 crc kubenswrapper[4903]: I0320 08:42:36.498197 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-z9fqs" event={"ID":"c9fa68a8-9b69-40dc-a614-a7d85a9473f8","Type":"ContainerStarted","Data":"2bf3ba97b906b563f4779ce415d88044e9e218a4e9b4e945ee2696f97e049639"} Mar 20 08:42:36 crc kubenswrapper[4903]: I0320 08:42:36.499720 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-kzflj" podStartSLOduration=2.192822061 podStartE2EDuration="31.499683014s" podCreationTimestamp="2026-03-20 08:42:05 +0000 UTC" firstStartedPulling="2026-03-20 08:42:06.932709989 +0000 UTC m=+1152.149610294" lastFinishedPulling="2026-03-20 08:42:36.239570922 +0000 UTC m=+1181.456471247" observedRunningTime="2026-03-20 08:42:36.494461163 +0000 UTC m=+1181.711361478" watchObservedRunningTime="2026-03-20 08:42:36.499683014 +0000 UTC m=+1181.716583329" Mar 20 08:42:36 crc kubenswrapper[4903]: I0320 08:42:36.512301 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-c674c5965-fxsc4" Mar 20 08:42:36 crc kubenswrapper[4903]: I0320 08:42:36.589511 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-z9fqs" podStartSLOduration=2.688603453 podStartE2EDuration="31.589489855s" podCreationTimestamp="2026-03-20 08:42:05 +0000 UTC" firstStartedPulling="2026-03-20 08:42:07.055064952 +0000 UTC m=+1152.271965267" lastFinishedPulling="2026-03-20 08:42:35.955951354 +0000 UTC m=+1181.172851669" observedRunningTime="2026-03-20 08:42:36.580357496 +0000 UTC m=+1181.797257811" watchObservedRunningTime="2026-03-20 08:42:36.589489855 +0000 UTC m=+1181.806390170" Mar 20 08:42:36 crc kubenswrapper[4903]: I0320 08:42:36.717577 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-lntlm" Mar 20 08:42:36 crc kubenswrapper[4903]: I0320 08:42:36.801754 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-56pwc" Mar 20 08:42:41 crc kubenswrapper[4903]: I0320 08:42:41.972652 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-65sn9" Mar 20 08:42:42 crc kubenswrapper[4903]: I0320 08:42:42.037955 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-bsq62" Mar 20 08:42:42 crc kubenswrapper[4903]: I0320 08:42:42.437905 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-85788d4595-jxnj2" Mar 20 08:42:45 crc kubenswrapper[4903]: I0320 08:42:45.787652 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-kzflj" Mar 20 08:42:45 crc kubenswrapper[4903]: I0320 08:42:45.988837 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-cnjpn" Mar 20 08:42:45 crc kubenswrapper[4903]: I0320 08:42:45.997709 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-z9fqs" Mar 20 08:42:45 crc kubenswrapper[4903]: I0320 08:42:45.999623 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-z9fqs" Mar 20 08:42:46 crc kubenswrapper[4903]: I0320 08:42:46.165587 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-rwzl9" Mar 20 08:42:46 crc kubenswrapper[4903]: I0320 08:42:46.586456 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-pgssm" event={"ID":"fbb60c8b-9933-40bd-9a01-3463fa38fd41","Type":"ContainerStarted","Data":"f2d31c7fdb24ec070b5a332d20cf51afd1bababc3daec12e14cce9ce144f74de"} Mar 20 08:42:46 crc kubenswrapper[4903]: I0320 08:42:46.587240 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-pgssm" Mar 20 08:42:46 crc kubenswrapper[4903]: I0320 08:42:46.588042 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-rgwvj" event={"ID":"db9e032d-63e4-44e3-99d6-55c13c900127","Type":"ContainerStarted","Data":"013478b1a0aa3615180c9ef03a6cff224b9133525635b134d5cecce584888a21"} Mar 20 08:42:46 crc kubenswrapper[4903]: I0320 08:42:46.588409 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-rgwvj" Mar 20 08:42:46 crc kubenswrapper[4903]: I0320 08:42:46.607191 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-pgssm" podStartSLOduration=3.334592194 podStartE2EDuration="41.607170255s" podCreationTimestamp="2026-03-20 08:42:05 +0000 UTC" firstStartedPulling="2026-03-20 08:42:07.597227212 +0000 UTC 
m=+1152.814127527" lastFinishedPulling="2026-03-20 08:42:45.869805273 +0000 UTC m=+1191.086705588" observedRunningTime="2026-03-20 08:42:46.602151719 +0000 UTC m=+1191.819052044" watchObservedRunningTime="2026-03-20 08:42:46.607170255 +0000 UTC m=+1191.824070570" Mar 20 08:42:46 crc kubenswrapper[4903]: I0320 08:42:46.621343 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-77bp7" Mar 20 08:42:46 crc kubenswrapper[4903]: I0320 08:42:46.641865 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-rgwvj" podStartSLOduration=3.491134699 podStartE2EDuration="41.641843519s" podCreationTimestamp="2026-03-20 08:42:05 +0000 UTC" firstStartedPulling="2026-03-20 08:42:07.720246022 +0000 UTC m=+1152.937146337" lastFinishedPulling="2026-03-20 08:42:45.870954822 +0000 UTC m=+1191.087855157" observedRunningTime="2026-03-20 08:42:46.625270576 +0000 UTC m=+1191.842170901" watchObservedRunningTime="2026-03-20 08:42:46.641843519 +0000 UTC m=+1191.858743834" Mar 20 08:42:56 crc kubenswrapper[4903]: I0320 08:42:56.192920 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-pgssm" Mar 20 08:42:56 crc kubenswrapper[4903]: I0320 08:42:56.300518 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-rgwvj" Mar 20 08:42:56 crc kubenswrapper[4903]: I0320 08:42:56.985754 4903 scope.go:117] "RemoveContainer" containerID="6cc5f19e3cf9425022c6a178e54158ffb3d2d133ce12c751ac2a935b4699cd8a" Mar 20 08:43:13 crc kubenswrapper[4903]: I0320 08:43:13.217759 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-jntgm"] Mar 20 08:43:13 crc kubenswrapper[4903]: I0320 08:43:13.219695 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-jntgm" Mar 20 08:43:13 crc kubenswrapper[4903]: I0320 08:43:13.222603 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Mar 20 08:43:13 crc kubenswrapper[4903]: I0320 08:43:13.222639 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Mar 20 08:43:13 crc kubenswrapper[4903]: I0320 08:43:13.222802 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Mar 20 08:43:13 crc kubenswrapper[4903]: I0320 08:43:13.226425 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-jqtz9" Mar 20 08:43:13 crc kubenswrapper[4903]: I0320 08:43:13.234596 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-jntgm"] Mar 20 08:43:13 crc kubenswrapper[4903]: I0320 08:43:13.251200 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v2p77\" (UniqueName: \"kubernetes.io/projected/31d47b29-48d4-4d99-83cd-91e4a383b108-kube-api-access-v2p77\") pod \"dnsmasq-dns-675f4bcbfc-jntgm\" (UID: \"31d47b29-48d4-4d99-83cd-91e4a383b108\") " pod="openstack/dnsmasq-dns-675f4bcbfc-jntgm" Mar 20 08:43:13 crc kubenswrapper[4903]: I0320 08:43:13.251292 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/31d47b29-48d4-4d99-83cd-91e4a383b108-config\") pod \"dnsmasq-dns-675f4bcbfc-jntgm\" (UID: \"31d47b29-48d4-4d99-83cd-91e4a383b108\") " pod="openstack/dnsmasq-dns-675f4bcbfc-jntgm" Mar 20 08:43:13 crc kubenswrapper[4903]: I0320 08:43:13.298782 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-59q7v"] Mar 20 08:43:13 crc kubenswrapper[4903]: I0320 08:43:13.300391 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-59q7v" Mar 20 08:43:13 crc kubenswrapper[4903]: I0320 08:43:13.302183 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Mar 20 08:43:13 crc kubenswrapper[4903]: I0320 08:43:13.318765 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-59q7v"] Mar 20 08:43:13 crc kubenswrapper[4903]: I0320 08:43:13.353191 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2064f5b2-4d54-41b2-a04d-bc35b586cb3c-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-59q7v\" (UID: \"2064f5b2-4d54-41b2-a04d-bc35b586cb3c\") " pod="openstack/dnsmasq-dns-78dd6ddcc-59q7v" Mar 20 08:43:13 crc kubenswrapper[4903]: I0320 08:43:13.353253 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v2p77\" (UniqueName: \"kubernetes.io/projected/31d47b29-48d4-4d99-83cd-91e4a383b108-kube-api-access-v2p77\") pod \"dnsmasq-dns-675f4bcbfc-jntgm\" (UID: \"31d47b29-48d4-4d99-83cd-91e4a383b108\") " pod="openstack/dnsmasq-dns-675f4bcbfc-jntgm" Mar 20 08:43:13 crc kubenswrapper[4903]: I0320 08:43:13.353292 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4cwch\" (UniqueName: \"kubernetes.io/projected/2064f5b2-4d54-41b2-a04d-bc35b586cb3c-kube-api-access-4cwch\") pod \"dnsmasq-dns-78dd6ddcc-59q7v\" (UID: \"2064f5b2-4d54-41b2-a04d-bc35b586cb3c\") " pod="openstack/dnsmasq-dns-78dd6ddcc-59q7v" Mar 20 08:43:13 crc kubenswrapper[4903]: I0320 08:43:13.353324 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2064f5b2-4d54-41b2-a04d-bc35b586cb3c-config\") pod \"dnsmasq-dns-78dd6ddcc-59q7v\" (UID: \"2064f5b2-4d54-41b2-a04d-bc35b586cb3c\") " pod="openstack/dnsmasq-dns-78dd6ddcc-59q7v" Mar 20 08:43:13 crc kubenswrapper[4903]: I0320 08:43:13.353363 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/31d47b29-48d4-4d99-83cd-91e4a383b108-config\") pod \"dnsmasq-dns-675f4bcbfc-jntgm\" (UID: \"31d47b29-48d4-4d99-83cd-91e4a383b108\") " pod="openstack/dnsmasq-dns-675f4bcbfc-jntgm" Mar 20 08:43:13 crc kubenswrapper[4903]: I0320 08:43:13.354358 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/31d47b29-48d4-4d99-83cd-91e4a383b108-config\") pod \"dnsmasq-dns-675f4bcbfc-jntgm\" (UID: \"31d47b29-48d4-4d99-83cd-91e4a383b108\") " pod="openstack/dnsmasq-dns-675f4bcbfc-jntgm" Mar 20 08:43:13 crc kubenswrapper[4903]: I0320 08:43:13.379083 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v2p77\" (UniqueName: \"kubernetes.io/projected/31d47b29-48d4-4d99-83cd-91e4a383b108-kube-api-access-v2p77\") pod \"dnsmasq-dns-675f4bcbfc-jntgm\" (UID: \"31d47b29-48d4-4d99-83cd-91e4a383b108\") " pod="openstack/dnsmasq-dns-675f4bcbfc-jntgm" Mar 20 08:43:13 crc kubenswrapper[4903]: I0320 08:43:13.455142 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2064f5b2-4d54-41b2-a04d-bc35b586cb3c-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-59q7v\" (UID: \"2064f5b2-4d54-41b2-a04d-bc35b586cb3c\") " pod="openstack/dnsmasq-dns-78dd6ddcc-59q7v" Mar 20 08:43:13 crc kubenswrapper[4903]: I0320 
08:43:13.455214 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4cwch\" (UniqueName: \"kubernetes.io/projected/2064f5b2-4d54-41b2-a04d-bc35b586cb3c-kube-api-access-4cwch\") pod \"dnsmasq-dns-78dd6ddcc-59q7v\" (UID: \"2064f5b2-4d54-41b2-a04d-bc35b586cb3c\") " pod="openstack/dnsmasq-dns-78dd6ddcc-59q7v" Mar 20 08:43:13 crc kubenswrapper[4903]: I0320 08:43:13.455238 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2064f5b2-4d54-41b2-a04d-bc35b586cb3c-config\") pod \"dnsmasq-dns-78dd6ddcc-59q7v\" (UID: \"2064f5b2-4d54-41b2-a04d-bc35b586cb3c\") " pod="openstack/dnsmasq-dns-78dd6ddcc-59q7v" Mar 20 08:43:13 crc kubenswrapper[4903]: I0320 08:43:13.456365 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2064f5b2-4d54-41b2-a04d-bc35b586cb3c-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-59q7v\" (UID: \"2064f5b2-4d54-41b2-a04d-bc35b586cb3c\") " pod="openstack/dnsmasq-dns-78dd6ddcc-59q7v" Mar 20 08:43:13 crc kubenswrapper[4903]: I0320 08:43:13.456411 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2064f5b2-4d54-41b2-a04d-bc35b586cb3c-config\") pod \"dnsmasq-dns-78dd6ddcc-59q7v\" (UID: \"2064f5b2-4d54-41b2-a04d-bc35b586cb3c\") " pod="openstack/dnsmasq-dns-78dd6ddcc-59q7v" Mar 20 08:43:13 crc kubenswrapper[4903]: I0320 08:43:13.473782 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4cwch\" (UniqueName: \"kubernetes.io/projected/2064f5b2-4d54-41b2-a04d-bc35b586cb3c-kube-api-access-4cwch\") pod \"dnsmasq-dns-78dd6ddcc-59q7v\" (UID: \"2064f5b2-4d54-41b2-a04d-bc35b586cb3c\") " pod="openstack/dnsmasq-dns-78dd6ddcc-59q7v" Mar 20 08:43:13 crc kubenswrapper[4903]: I0320 08:43:13.540861 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-jntgm" Mar 20 08:43:13 crc kubenswrapper[4903]: I0320 08:43:13.615349 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-59q7v" Mar 20 08:43:13 crc kubenswrapper[4903]: I0320 08:43:13.839699 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-jntgm"] Mar 20 08:43:13 crc kubenswrapper[4903]: I0320 08:43:13.867420 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-jntgm" event={"ID":"31d47b29-48d4-4d99-83cd-91e4a383b108","Type":"ContainerStarted","Data":"8f15fd9869489ff68ee5a3753b44f6ce99f1cbc8b027560e79839150efb20498"} Mar 20 08:43:13 crc kubenswrapper[4903]: I0320 08:43:13.906623 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-59q7v"] Mar 20 08:43:13 crc kubenswrapper[4903]: W0320 08:43:13.910262 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2064f5b2_4d54_41b2_a04d_bc35b586cb3c.slice/crio-d9b393449f6f367fd66f79089c7c2eda51750ed53d7c08bf1c64742743e8aa8b WatchSource:0}: Error finding container d9b393449f6f367fd66f79089c7c2eda51750ed53d7c08bf1c64742743e8aa8b: Status 404 returned error can't find the container with id d9b393449f6f367fd66f79089c7c2eda51750ed53d7c08bf1c64742743e8aa8b Mar 20 08:43:14 crc kubenswrapper[4903]: I0320 08:43:14.114749 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-jntgm"] Mar 20 08:43:14 crc kubenswrapper[4903]: I0320 08:43:14.144618 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-7wjsx"] Mar 20 08:43:14 crc kubenswrapper[4903]: I0320 08:43:14.145818 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-7wjsx" Mar 20 08:43:14 crc kubenswrapper[4903]: I0320 08:43:14.160308 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-7wjsx"] Mar 20 08:43:14 crc kubenswrapper[4903]: I0320 08:43:14.166374 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m4vm7\" (UniqueName: \"kubernetes.io/projected/57f5e312-d8c8-420d-8655-8ace1519bdda-kube-api-access-m4vm7\") pod \"dnsmasq-dns-666b6646f7-7wjsx\" (UID: \"57f5e312-d8c8-420d-8655-8ace1519bdda\") " pod="openstack/dnsmasq-dns-666b6646f7-7wjsx" Mar 20 08:43:14 crc kubenswrapper[4903]: I0320 08:43:14.166480 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/57f5e312-d8c8-420d-8655-8ace1519bdda-config\") pod \"dnsmasq-dns-666b6646f7-7wjsx\" (UID: \"57f5e312-d8c8-420d-8655-8ace1519bdda\") " pod="openstack/dnsmasq-dns-666b6646f7-7wjsx" Mar 20 08:43:14 crc kubenswrapper[4903]: I0320 08:43:14.166653 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/57f5e312-d8c8-420d-8655-8ace1519bdda-dns-svc\") pod \"dnsmasq-dns-666b6646f7-7wjsx\" (UID: \"57f5e312-d8c8-420d-8655-8ace1519bdda\") " pod="openstack/dnsmasq-dns-666b6646f7-7wjsx" Mar 20 08:43:14 crc kubenswrapper[4903]: I0320 08:43:14.267636 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/57f5e312-d8c8-420d-8655-8ace1519bdda-dns-svc\") pod \"dnsmasq-dns-666b6646f7-7wjsx\" (UID: \"57f5e312-d8c8-420d-8655-8ace1519bdda\") " pod="openstack/dnsmasq-dns-666b6646f7-7wjsx" Mar 20 08:43:14 crc kubenswrapper[4903]: I0320 
08:43:14.267721 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m4vm7\" (UniqueName: \"kubernetes.io/projected/57f5e312-d8c8-420d-8655-8ace1519bdda-kube-api-access-m4vm7\") pod \"dnsmasq-dns-666b6646f7-7wjsx\" (UID: \"57f5e312-d8c8-420d-8655-8ace1519bdda\") " pod="openstack/dnsmasq-dns-666b6646f7-7wjsx" Mar 20 08:43:14 crc kubenswrapper[4903]: I0320 08:43:14.267754 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/57f5e312-d8c8-420d-8655-8ace1519bdda-config\") pod \"dnsmasq-dns-666b6646f7-7wjsx\" (UID: \"57f5e312-d8c8-420d-8655-8ace1519bdda\") " pod="openstack/dnsmasq-dns-666b6646f7-7wjsx" Mar 20 08:43:14 crc kubenswrapper[4903]: I0320 08:43:14.268531 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/57f5e312-d8c8-420d-8655-8ace1519bdda-config\") pod \"dnsmasq-dns-666b6646f7-7wjsx\" (UID: \"57f5e312-d8c8-420d-8655-8ace1519bdda\") " pod="openstack/dnsmasq-dns-666b6646f7-7wjsx" Mar 20 08:43:14 crc kubenswrapper[4903]: I0320 08:43:14.268531 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/57f5e312-d8c8-420d-8655-8ace1519bdda-dns-svc\") pod \"dnsmasq-dns-666b6646f7-7wjsx\" (UID: \"57f5e312-d8c8-420d-8655-8ace1519bdda\") " pod="openstack/dnsmasq-dns-666b6646f7-7wjsx" Mar 20 08:43:14 crc kubenswrapper[4903]: I0320 08:43:14.292674 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m4vm7\" (UniqueName: \"kubernetes.io/projected/57f5e312-d8c8-420d-8655-8ace1519bdda-kube-api-access-m4vm7\") pod \"dnsmasq-dns-666b6646f7-7wjsx\" (UID: \"57f5e312-d8c8-420d-8655-8ace1519bdda\") " pod="openstack/dnsmasq-dns-666b6646f7-7wjsx" Mar 20 08:43:14 crc kubenswrapper[4903]: I0320 08:43:14.462173 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-7wjsx" Mar 20 08:43:14 crc kubenswrapper[4903]: I0320 08:43:14.717136 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-7wjsx"] Mar 20 08:43:14 crc kubenswrapper[4903]: W0320 08:43:14.736976 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod57f5e312_d8c8_420d_8655_8ace1519bdda.slice/crio-ccd168348dbdaf720e354fb128134a658f4e7161d666ec8684ae580a49d3876c WatchSource:0}: Error finding container ccd168348dbdaf720e354fb128134a658f4e7161d666ec8684ae580a49d3876c: Status 404 returned error can't find the container with id ccd168348dbdaf720e354fb128134a658f4e7161d666ec8684ae580a49d3876c Mar 20 08:43:14 crc kubenswrapper[4903]: I0320 08:43:14.887591 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-59q7v" event={"ID":"2064f5b2-4d54-41b2-a04d-bc35b586cb3c","Type":"ContainerStarted","Data":"d9b393449f6f367fd66f79089c7c2eda51750ed53d7c08bf1c64742743e8aa8b"} Mar 20 08:43:14 crc kubenswrapper[4903]: I0320 08:43:14.889278 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-7wjsx" event={"ID":"57f5e312-d8c8-420d-8655-8ace1519bdda","Type":"ContainerStarted","Data":"ccd168348dbdaf720e354fb128134a658f4e7161d666ec8684ae580a49d3876c"} Mar 20 08:43:14 crc kubenswrapper[4903]: I0320 08:43:14.978133 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-59q7v"] Mar 20 08:43:15 crc kubenswrapper[4903]: I0320 08:43:15.006564 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-s7lkc"] Mar 20 08:43:15 crc kubenswrapper[4903]: I0320 08:43:15.007882 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-s7lkc" Mar 20 08:43:15 crc kubenswrapper[4903]: I0320 08:43:15.021350 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-s7lkc"] Mar 20 08:43:15 crc kubenswrapper[4903]: I0320 08:43:15.180420 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4b2ae24d-7a7f-450d-b9f0-29773070bfba-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-s7lkc\" (UID: \"4b2ae24d-7a7f-450d-b9f0-29773070bfba\") " pod="openstack/dnsmasq-dns-57d769cc4f-s7lkc" Mar 20 08:43:15 crc kubenswrapper[4903]: I0320 08:43:15.180509 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tzv9j\" (UniqueName: \"kubernetes.io/projected/4b2ae24d-7a7f-450d-b9f0-29773070bfba-kube-api-access-tzv9j\") pod \"dnsmasq-dns-57d769cc4f-s7lkc\" (UID: \"4b2ae24d-7a7f-450d-b9f0-29773070bfba\") " pod="openstack/dnsmasq-dns-57d769cc4f-s7lkc" Mar 20 08:43:15 crc kubenswrapper[4903]: I0320 08:43:15.180532 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4b2ae24d-7a7f-450d-b9f0-29773070bfba-config\") pod \"dnsmasq-dns-57d769cc4f-s7lkc\" (UID: \"4b2ae24d-7a7f-450d-b9f0-29773070bfba\") " pod="openstack/dnsmasq-dns-57d769cc4f-s7lkc" Mar 20 08:43:15 crc kubenswrapper[4903]: I0320 08:43:15.281622 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4b2ae24d-7a7f-450d-b9f0-29773070bfba-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-s7lkc\" (UID: \"4b2ae24d-7a7f-450d-b9f0-29773070bfba\") " pod="openstack/dnsmasq-dns-57d769cc4f-s7lkc" Mar 20 08:43:15 crc kubenswrapper[4903]: I0320 08:43:15.282137 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tzv9j\" (UniqueName: \"kubernetes.io/projected/4b2ae24d-7a7f-450d-b9f0-29773070bfba-kube-api-access-tzv9j\") pod \"dnsmasq-dns-57d769cc4f-s7lkc\" (UID: \"4b2ae24d-7a7f-450d-b9f0-29773070bfba\") " pod="openstack/dnsmasq-dns-57d769cc4f-s7lkc" Mar 20 08:43:15 crc kubenswrapper[4903]: I0320 08:43:15.282178 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4b2ae24d-7a7f-450d-b9f0-29773070bfba-config\") pod \"dnsmasq-dns-57d769cc4f-s7lkc\" (UID: \"4b2ae24d-7a7f-450d-b9f0-29773070bfba\") " pod="openstack/dnsmasq-dns-57d769cc4f-s7lkc" Mar 20 08:43:15 crc kubenswrapper[4903]: I0320 08:43:15.283251 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4b2ae24d-7a7f-450d-b9f0-29773070bfba-config\") pod \"dnsmasq-dns-57d769cc4f-s7lkc\" (UID: \"4b2ae24d-7a7f-450d-b9f0-29773070bfba\") " pod="openstack/dnsmasq-dns-57d769cc4f-s7lkc" Mar 20 08:43:15 crc kubenswrapper[4903]: I0320 08:43:15.283757 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4b2ae24d-7a7f-450d-b9f0-29773070bfba-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-s7lkc\" (UID: \"4b2ae24d-7a7f-450d-b9f0-29773070bfba\") " pod="openstack/dnsmasq-dns-57d769cc4f-s7lkc" Mar 20 08:43:15 crc kubenswrapper[4903]: I0320 08:43:15.313813 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tzv9j\" (UniqueName: 
\"kubernetes.io/projected/4b2ae24d-7a7f-450d-b9f0-29773070bfba-kube-api-access-tzv9j\") pod \"dnsmasq-dns-57d769cc4f-s7lkc\" (UID: \"4b2ae24d-7a7f-450d-b9f0-29773070bfba\") " pod="openstack/dnsmasq-dns-57d769cc4f-s7lkc" Mar 20 08:43:15 crc kubenswrapper[4903]: I0320 08:43:15.352321 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-s7lkc" Mar 20 08:43:15 crc kubenswrapper[4903]: I0320 08:43:15.354651 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Mar 20 08:43:15 crc kubenswrapper[4903]: I0320 08:43:15.355860 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 20 08:43:15 crc kubenswrapper[4903]: I0320 08:43:15.360147 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Mar 20 08:43:15 crc kubenswrapper[4903]: I0320 08:43:15.360413 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-b97c6" Mar 20 08:43:15 crc kubenswrapper[4903]: I0320 08:43:15.360611 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Mar 20 08:43:15 crc kubenswrapper[4903]: I0320 08:43:15.360814 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Mar 20 08:43:15 crc kubenswrapper[4903]: I0320 08:43:15.360987 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Mar 20 08:43:15 crc kubenswrapper[4903]: I0320 08:43:15.363859 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Mar 20 08:43:15 crc kubenswrapper[4903]: I0320 08:43:15.364870 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Mar 20 08:43:15 crc kubenswrapper[4903]: I0320 08:43:15.388545 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 20 08:43:15 crc kubenswrapper[4903]: I0320 08:43:15.494704 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-server-0\" (UID: \"888a3fd9-01f8-47b3-b1bb-f2b8b6b96509\") " pod="openstack/rabbitmq-server-0" Mar 20 08:43:15 crc kubenswrapper[4903]: I0320 08:43:15.494767 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/888a3fd9-01f8-47b3-b1bb-f2b8b6b96509-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"888a3fd9-01f8-47b3-b1bb-f2b8b6b96509\") " pod="openstack/rabbitmq-server-0" Mar 20 08:43:15 crc kubenswrapper[4903]: I0320 08:43:15.494806 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/888a3fd9-01f8-47b3-b1bb-f2b8b6b96509-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"888a3fd9-01f8-47b3-b1bb-f2b8b6b96509\") " pod="openstack/rabbitmq-server-0" Mar 20 08:43:15 crc kubenswrapper[4903]: I0320 08:43:15.494829 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/888a3fd9-01f8-47b3-b1bb-f2b8b6b96509-server-conf\") pod \"rabbitmq-server-0\" (UID: \"888a3fd9-01f8-47b3-b1bb-f2b8b6b96509\") " 
pod="openstack/rabbitmq-server-0" Mar 20 08:43:15 crc kubenswrapper[4903]: I0320 08:43:15.494918 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/888a3fd9-01f8-47b3-b1bb-f2b8b6b96509-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"888a3fd9-01f8-47b3-b1bb-f2b8b6b96509\") " pod="openstack/rabbitmq-server-0" Mar 20 08:43:15 crc kubenswrapper[4903]: I0320 08:43:15.495024 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/888a3fd9-01f8-47b3-b1bb-f2b8b6b96509-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"888a3fd9-01f8-47b3-b1bb-f2b8b6b96509\") " pod="openstack/rabbitmq-server-0" Mar 20 08:43:15 crc kubenswrapper[4903]: I0320 08:43:15.495120 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c54mp\" (UniqueName: \"kubernetes.io/projected/888a3fd9-01f8-47b3-b1bb-f2b8b6b96509-kube-api-access-c54mp\") pod \"rabbitmq-server-0\" (UID: \"888a3fd9-01f8-47b3-b1bb-f2b8b6b96509\") " pod="openstack/rabbitmq-server-0" Mar 20 08:43:15 crc kubenswrapper[4903]: I0320 08:43:15.496656 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/888a3fd9-01f8-47b3-b1bb-f2b8b6b96509-config-data\") pod \"rabbitmq-server-0\" (UID: \"888a3fd9-01f8-47b3-b1bb-f2b8b6b96509\") " pod="openstack/rabbitmq-server-0" Mar 20 08:43:15 crc kubenswrapper[4903]: I0320 08:43:15.496715 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/888a3fd9-01f8-47b3-b1bb-f2b8b6b96509-pod-info\") pod \"rabbitmq-server-0\" (UID: \"888a3fd9-01f8-47b3-b1bb-f2b8b6b96509\") " pod="openstack/rabbitmq-server-0" Mar 20 08:43:15 crc kubenswrapper[4903]: I0320 08:43:15.496754 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/888a3fd9-01f8-47b3-b1bb-f2b8b6b96509-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"888a3fd9-01f8-47b3-b1bb-f2b8b6b96509\") " pod="openstack/rabbitmq-server-0" Mar 20 08:43:15 crc kubenswrapper[4903]: I0320 08:43:15.496869 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/888a3fd9-01f8-47b3-b1bb-f2b8b6b96509-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"888a3fd9-01f8-47b3-b1bb-f2b8b6b96509\") " pod="openstack/rabbitmq-server-0" Mar 20 08:43:15 crc kubenswrapper[4903]: I0320 08:43:15.598530 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/888a3fd9-01f8-47b3-b1bb-f2b8b6b96509-config-data\") pod \"rabbitmq-server-0\" (UID: \"888a3fd9-01f8-47b3-b1bb-f2b8b6b96509\") " pod="openstack/rabbitmq-server-0" Mar 20 08:43:15 crc kubenswrapper[4903]: I0320 08:43:15.598588 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/888a3fd9-01f8-47b3-b1bb-f2b8b6b96509-pod-info\") pod \"rabbitmq-server-0\" (UID: \"888a3fd9-01f8-47b3-b1bb-f2b8b6b96509\") " pod="openstack/rabbitmq-server-0" Mar 20 08:43:15 crc kubenswrapper[4903]: I0320 08:43:15.598615 4903 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/888a3fd9-01f8-47b3-b1bb-f2b8b6b96509-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"888a3fd9-01f8-47b3-b1bb-f2b8b6b96509\") " pod="openstack/rabbitmq-server-0" Mar 20 08:43:15 crc kubenswrapper[4903]: I0320 08:43:15.598659 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/888a3fd9-01f8-47b3-b1bb-f2b8b6b96509-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"888a3fd9-01f8-47b3-b1bb-f2b8b6b96509\") " pod="openstack/rabbitmq-server-0" Mar 20 08:43:15 crc kubenswrapper[4903]: I0320 08:43:15.598701 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-server-0\" (UID: \"888a3fd9-01f8-47b3-b1bb-f2b8b6b96509\") " pod="openstack/rabbitmq-server-0" Mar 20 08:43:15 crc kubenswrapper[4903]: I0320 08:43:15.598728 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/888a3fd9-01f8-47b3-b1bb-f2b8b6b96509-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"888a3fd9-01f8-47b3-b1bb-f2b8b6b96509\") " pod="openstack/rabbitmq-server-0" Mar 20 08:43:15 crc kubenswrapper[4903]: I0320 08:43:15.598750 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/888a3fd9-01f8-47b3-b1bb-f2b8b6b96509-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"888a3fd9-01f8-47b3-b1bb-f2b8b6b96509\") " pod="openstack/rabbitmq-server-0" Mar 20 08:43:15 crc kubenswrapper[4903]: I0320 08:43:15.598769 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/888a3fd9-01f8-47b3-b1bb-f2b8b6b96509-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"888a3fd9-01f8-47b3-b1bb-f2b8b6b96509\") " pod="openstack/rabbitmq-server-0" Mar 20 08:43:15 crc kubenswrapper[4903]: I0320 08:43:15.598790 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/888a3fd9-01f8-47b3-b1bb-f2b8b6b96509-server-conf\") pod \"rabbitmq-server-0\" (UID: \"888a3fd9-01f8-47b3-b1bb-f2b8b6b96509\") " pod="openstack/rabbitmq-server-0" Mar 20 08:43:15 crc kubenswrapper[4903]: I0320 08:43:15.598811 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/888a3fd9-01f8-47b3-b1bb-f2b8b6b96509-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"888a3fd9-01f8-47b3-b1bb-f2b8b6b96509\") " pod="openstack/rabbitmq-server-0" Mar 20 08:43:15 crc kubenswrapper[4903]: I0320 08:43:15.598856 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c54mp\" (UniqueName: \"kubernetes.io/projected/888a3fd9-01f8-47b3-b1bb-f2b8b6b96509-kube-api-access-c54mp\") pod \"rabbitmq-server-0\" (UID: \"888a3fd9-01f8-47b3-b1bb-f2b8b6b96509\") " pod="openstack/rabbitmq-server-0" Mar 20 08:43:15 crc kubenswrapper[4903]: I0320 08:43:15.599536 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/888a3fd9-01f8-47b3-b1bb-f2b8b6b96509-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: 
\"888a3fd9-01f8-47b3-b1bb-f2b8b6b96509\") " pod="openstack/rabbitmq-server-0" Mar 20 08:43:15 crc kubenswrapper[4903]: I0320 08:43:15.599696 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/888a3fd9-01f8-47b3-b1bb-f2b8b6b96509-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"888a3fd9-01f8-47b3-b1bb-f2b8b6b96509\") " pod="openstack/rabbitmq-server-0" Mar 20 08:43:15 crc kubenswrapper[4903]: I0320 08:43:15.600119 4903 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-server-0\" (UID: \"888a3fd9-01f8-47b3-b1bb-f2b8b6b96509\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/rabbitmq-server-0" Mar 20 08:43:15 crc kubenswrapper[4903]: I0320 08:43:15.600590 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/888a3fd9-01f8-47b3-b1bb-f2b8b6b96509-config-data\") pod \"rabbitmq-server-0\" (UID: \"888a3fd9-01f8-47b3-b1bb-f2b8b6b96509\") " pod="openstack/rabbitmq-server-0" Mar 20 08:43:15 crc kubenswrapper[4903]: I0320 08:43:15.601410 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/888a3fd9-01f8-47b3-b1bb-f2b8b6b96509-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"888a3fd9-01f8-47b3-b1bb-f2b8b6b96509\") " pod="openstack/rabbitmq-server-0" Mar 20 08:43:15 crc kubenswrapper[4903]: I0320 08:43:15.605456 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/888a3fd9-01f8-47b3-b1bb-f2b8b6b96509-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"888a3fd9-01f8-47b3-b1bb-f2b8b6b96509\") " pod="openstack/rabbitmq-server-0" Mar 20 08:43:15 crc kubenswrapper[4903]: I0320 08:43:15.608514 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/888a3fd9-01f8-47b3-b1bb-f2b8b6b96509-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"888a3fd9-01f8-47b3-b1bb-f2b8b6b96509\") " pod="openstack/rabbitmq-server-0" Mar 20 08:43:15 crc kubenswrapper[4903]: I0320 08:43:15.609260 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/888a3fd9-01f8-47b3-b1bb-f2b8b6b96509-pod-info\") pod \"rabbitmq-server-0\" (UID: \"888a3fd9-01f8-47b3-b1bb-f2b8b6b96509\") " pod="openstack/rabbitmq-server-0" Mar 20 08:43:15 crc kubenswrapper[4903]: I0320 08:43:15.609978 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/888a3fd9-01f8-47b3-b1bb-f2b8b6b96509-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"888a3fd9-01f8-47b3-b1bb-f2b8b6b96509\") " pod="openstack/rabbitmq-server-0" Mar 20 08:43:15 crc kubenswrapper[4903]: I0320 08:43:15.614103 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/888a3fd9-01f8-47b3-b1bb-f2b8b6b96509-server-conf\") pod \"rabbitmq-server-0\" (UID: \"888a3fd9-01f8-47b3-b1bb-f2b8b6b96509\") " pod="openstack/rabbitmq-server-0" Mar 20 08:43:15 crc kubenswrapper[4903]: I0320 08:43:15.622937 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c54mp\" (UniqueName: 
\"kubernetes.io/projected/888a3fd9-01f8-47b3-b1bb-f2b8b6b96509-kube-api-access-c54mp\") pod \"rabbitmq-server-0\" (UID: \"888a3fd9-01f8-47b3-b1bb-f2b8b6b96509\") " pod="openstack/rabbitmq-server-0" Mar 20 08:43:15 crc kubenswrapper[4903]: I0320 08:43:15.638249 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-server-0\" (UID: \"888a3fd9-01f8-47b3-b1bb-f2b8b6b96509\") " pod="openstack/rabbitmq-server-0" Mar 20 08:43:15 crc kubenswrapper[4903]: I0320 08:43:15.698424 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 20 08:43:16 crc kubenswrapper[4903]: I0320 08:43:16.039798 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-s7lkc"] Mar 20 08:43:16 crc kubenswrapper[4903]: I0320 08:43:16.140129 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 20 08:43:16 crc kubenswrapper[4903]: I0320 08:43:16.143584 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 20 08:43:16 crc kubenswrapper[4903]: I0320 08:43:16.146519 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Mar 20 08:43:16 crc kubenswrapper[4903]: I0320 08:43:16.147521 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Mar 20 08:43:16 crc kubenswrapper[4903]: I0320 08:43:16.147599 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Mar 20 08:43:16 crc kubenswrapper[4903]: I0320 08:43:16.147683 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Mar 20 08:43:16 crc kubenswrapper[4903]: I0320 08:43:16.147946 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Mar 20 08:43:16 crc kubenswrapper[4903]: I0320 08:43:16.147956 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Mar 20 08:43:16 crc kubenswrapper[4903]: I0320 08:43:16.149405 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-mtrzn" Mar 20 08:43:16 crc kubenswrapper[4903]: I0320 08:43:16.155591 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 20 08:43:16 crc kubenswrapper[4903]: I0320 08:43:16.314982 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/df937948-08c4-447c-9450-07221ce76552-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"df937948-08c4-447c-9450-07221ce76552\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 08:43:16 crc kubenswrapper[4903]: I0320 08:43:16.315023 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/df937948-08c4-447c-9450-07221ce76552-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"df937948-08c4-447c-9450-07221ce76552\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 08:43:16 crc kubenswrapper[4903]: I0320 08:43:16.315066 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" 
(UniqueName: \"kubernetes.io/downward-api/df937948-08c4-447c-9450-07221ce76552-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"df937948-08c4-447c-9450-07221ce76552\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 08:43:16 crc kubenswrapper[4903]: I0320 08:43:16.315089 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/df937948-08c4-447c-9450-07221ce76552-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"df937948-08c4-447c-9450-07221ce76552\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 08:43:16 crc kubenswrapper[4903]: I0320 08:43:16.315109 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-djs78\" (UniqueName: \"kubernetes.io/projected/df937948-08c4-447c-9450-07221ce76552-kube-api-access-djs78\") pod \"rabbitmq-cell1-server-0\" (UID: \"df937948-08c4-447c-9450-07221ce76552\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 08:43:16 crc kubenswrapper[4903]: I0320 08:43:16.315134 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"df937948-08c4-447c-9450-07221ce76552\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 08:43:16 crc kubenswrapper[4903]: I0320 08:43:16.315172 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/df937948-08c4-447c-9450-07221ce76552-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"df937948-08c4-447c-9450-07221ce76552\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 08:43:16 crc kubenswrapper[4903]: I0320 08:43:16.315201 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/df937948-08c4-447c-9450-07221ce76552-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"df937948-08c4-447c-9450-07221ce76552\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 08:43:16 crc kubenswrapper[4903]: I0320 08:43:16.315239 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/df937948-08c4-447c-9450-07221ce76552-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"df937948-08c4-447c-9450-07221ce76552\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 08:43:16 crc kubenswrapper[4903]: I0320 08:43:16.315255 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/df937948-08c4-447c-9450-07221ce76552-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"df937948-08c4-447c-9450-07221ce76552\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 08:43:16 crc kubenswrapper[4903]: I0320 08:43:16.315273 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/df937948-08c4-447c-9450-07221ce76552-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"df937948-08c4-447c-9450-07221ce76552\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 08:43:16 crc kubenswrapper[4903]: I0320 08:43:16.359700 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 
20 08:43:16 crc kubenswrapper[4903]: I0320 08:43:16.416631 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/df937948-08c4-447c-9450-07221ce76552-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"df937948-08c4-447c-9450-07221ce76552\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 08:43:16 crc kubenswrapper[4903]: I0320 08:43:16.416688 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/df937948-08c4-447c-9450-07221ce76552-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"df937948-08c4-447c-9450-07221ce76552\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 08:43:16 crc kubenswrapper[4903]: I0320 08:43:16.417192 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/df937948-08c4-447c-9450-07221ce76552-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"df937948-08c4-447c-9450-07221ce76552\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 08:43:16 crc kubenswrapper[4903]: I0320 08:43:16.417214 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/df937948-08c4-447c-9450-07221ce76552-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"df937948-08c4-447c-9450-07221ce76552\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 08:43:16 crc kubenswrapper[4903]: I0320 08:43:16.417238 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/df937948-08c4-447c-9450-07221ce76552-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"df937948-08c4-447c-9450-07221ce76552\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 08:43:16 crc kubenswrapper[4903]: I0320 08:43:16.417284 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/df937948-08c4-447c-9450-07221ce76552-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"df937948-08c4-447c-9450-07221ce76552\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 08:43:16 crc kubenswrapper[4903]: I0320 08:43:16.417305 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/df937948-08c4-447c-9450-07221ce76552-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"df937948-08c4-447c-9450-07221ce76552\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 08:43:16 crc kubenswrapper[4903]: I0320 08:43:16.417334 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/df937948-08c4-447c-9450-07221ce76552-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"df937948-08c4-447c-9450-07221ce76552\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 08:43:16 crc kubenswrapper[4903]: I0320 08:43:16.417358 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/df937948-08c4-447c-9450-07221ce76552-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"df937948-08c4-447c-9450-07221ce76552\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 08:43:16 crc kubenswrapper[4903]: I0320 08:43:16.417379 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-djs78\" (UniqueName: \"kubernetes.io/projected/df937948-08c4-447c-9450-07221ce76552-kube-api-access-djs78\") pod \"rabbitmq-cell1-server-0\" (UID: \"df937948-08c4-447c-9450-07221ce76552\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 08:43:16 crc kubenswrapper[4903]: I0320 08:43:16.417407 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"df937948-08c4-447c-9450-07221ce76552\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 08:43:16 crc kubenswrapper[4903]: I0320 08:43:16.417668 4903 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"df937948-08c4-447c-9450-07221ce76552\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/rabbitmq-cell1-server-0" Mar 20 08:43:16 crc kubenswrapper[4903]: I0320 08:43:16.417693 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/df937948-08c4-447c-9450-07221ce76552-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"df937948-08c4-447c-9450-07221ce76552\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 08:43:16 crc kubenswrapper[4903]: I0320 08:43:16.422422 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/df937948-08c4-447c-9450-07221ce76552-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"df937948-08c4-447c-9450-07221ce76552\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 08:43:16 crc kubenswrapper[4903]: I0320 08:43:16.423141 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/df937948-08c4-447c-9450-07221ce76552-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"df937948-08c4-447c-9450-07221ce76552\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 08:43:16 crc kubenswrapper[4903]: I0320 08:43:16.423975 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/df937948-08c4-447c-9450-07221ce76552-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"df937948-08c4-447c-9450-07221ce76552\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 08:43:16 crc kubenswrapper[4903]: I0320 08:43:16.427306 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/df937948-08c4-447c-9450-07221ce76552-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"df937948-08c4-447c-9450-07221ce76552\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 08:43:16 crc kubenswrapper[4903]: I0320 08:43:16.427695 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/df937948-08c4-447c-9450-07221ce76552-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"df937948-08c4-447c-9450-07221ce76552\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 08:43:16 crc kubenswrapper[4903]: I0320 08:43:16.428761 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/df937948-08c4-447c-9450-07221ce76552-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"df937948-08c4-447c-9450-07221ce76552\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 08:43:16 crc kubenswrapper[4903]: I0320 08:43:16.430635 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/df937948-08c4-447c-9450-07221ce76552-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"df937948-08c4-447c-9450-07221ce76552\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 08:43:16 crc kubenswrapper[4903]: I0320 08:43:16.447655 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/df937948-08c4-447c-9450-07221ce76552-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"df937948-08c4-447c-9450-07221ce76552\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 08:43:16 crc kubenswrapper[4903]: I0320 08:43:16.453518 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-djs78\" (UniqueName: \"kubernetes.io/projected/df937948-08c4-447c-9450-07221ce76552-kube-api-access-djs78\") pod \"rabbitmq-cell1-server-0\" (UID: \"df937948-08c4-447c-9450-07221ce76552\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 08:43:16 crc kubenswrapper[4903]: I0320 08:43:16.458605 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"df937948-08c4-447c-9450-07221ce76552\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 08:43:16 crc kubenswrapper[4903]: I0320 08:43:16.474713 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 20 08:43:16 crc kubenswrapper[4903]: I0320 08:43:16.917911 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"888a3fd9-01f8-47b3-b1bb-f2b8b6b96509","Type":"ContainerStarted","Data":"50cdcb1ab3eaee1006c1555da57117c90c23c9e1de732b5014ad14a0c2ce59cb"} Mar 20 08:43:16 crc kubenswrapper[4903]: I0320 08:43:16.919592 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-s7lkc" event={"ID":"4b2ae24d-7a7f-450d-b9f0-29773070bfba","Type":"ContainerStarted","Data":"4fa5272493ac50700db60e02bd4f9eeff67f5018dded459ea2823554e363b304"} Mar 20 08:43:17 crc kubenswrapper[4903]: I0320 08:43:17.036113 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 20 08:43:17 crc kubenswrapper[4903]: W0320 08:43:17.040196 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddf937948_08c4_447c_9450_07221ce76552.slice/crio-91715135ba864eae23740ffcb37edde4b941086e9d6b0fd4623a13a3b8ec6cf0 WatchSource:0}: Error finding container 91715135ba864eae23740ffcb37edde4b941086e9d6b0fd4623a13a3b8ec6cf0: Status 404 returned error can't find the container with id 91715135ba864eae23740ffcb37edde4b941086e9d6b0fd4623a13a3b8ec6cf0 Mar 20 08:43:17 crc kubenswrapper[4903]: I0320 08:43:17.525655 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Mar 20 08:43:17 crc kubenswrapper[4903]: I0320 08:43:17.528607 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Mar 20 08:43:17 crc kubenswrapper[4903]: I0320 08:43:17.531000 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Mar 20 08:43:17 crc kubenswrapper[4903]: I0320 08:43:17.531611 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-x6l5r" Mar 20 08:43:17 crc kubenswrapper[4903]: I0320 08:43:17.533664 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Mar 20 08:43:17 crc kubenswrapper[4903]: I0320 08:43:17.533719 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Mar 20 08:43:17 crc kubenswrapper[4903]: I0320 08:43:17.550788 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Mar 20 08:43:17 crc kubenswrapper[4903]: I0320 08:43:17.554268 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Mar 20 08:43:17 crc kubenswrapper[4903]: I0320 08:43:17.639716 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/96a68183-d440-4f89-887d-d2441d00c8e4-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"96a68183-d440-4f89-887d-d2441d00c8e4\") " pod="openstack/openstack-galera-0" Mar 20 08:43:17 crc kubenswrapper[4903]: I0320 08:43:17.639813 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96a68183-d440-4f89-887d-d2441d00c8e4-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"96a68183-d440-4f89-887d-d2441d00c8e4\") " pod="openstack/openstack-galera-0" Mar 20 08:43:17 crc kubenswrapper[4903]: I0320 08:43:17.639879 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/96a68183-d440-4f89-887d-d2441d00c8e4-config-data-default\") pod \"openstack-galera-0\" (UID: \"96a68183-d440-4f89-887d-d2441d00c8e4\") " pod="openstack/openstack-galera-0" Mar 20 08:43:17 crc kubenswrapper[4903]: I0320 08:43:17.639903 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/96a68183-d440-4f89-887d-d2441d00c8e4-kolla-config\") pod \"openstack-galera-0\" (UID: \"96a68183-d440-4f89-887d-d2441d00c8e4\") " pod="openstack/openstack-galera-0" Mar 20 08:43:17 crc kubenswrapper[4903]: I0320 08:43:17.639930 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/96a68183-d440-4f89-887d-d2441d00c8e4-operator-scripts\") pod \"openstack-galera-0\" (UID: \"96a68183-d440-4f89-887d-d2441d00c8e4\") " pod="openstack/openstack-galera-0" Mar 20 08:43:17 crc kubenswrapper[4903]: I0320 08:43:17.639970 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ptgpx\" (UniqueName: \"kubernetes.io/projected/96a68183-d440-4f89-887d-d2441d00c8e4-kube-api-access-ptgpx\") pod \"openstack-galera-0\" (UID: \"96a68183-d440-4f89-887d-d2441d00c8e4\") " pod="openstack/openstack-galera-0" Mar 20 08:43:17 crc kubenswrapper[4903]: I0320 08:43:17.640005 4903 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-galera-0\" (UID: \"96a68183-d440-4f89-887d-d2441d00c8e4\") " pod="openstack/openstack-galera-0" Mar 20 08:43:17 crc kubenswrapper[4903]: I0320 08:43:17.640126 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/96a68183-d440-4f89-887d-d2441d00c8e4-config-data-generated\") pod \"openstack-galera-0\" (UID: \"96a68183-d440-4f89-887d-d2441d00c8e4\") " pod="openstack/openstack-galera-0" Mar 20 08:43:17 crc kubenswrapper[4903]: I0320 08:43:17.741334 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/96a68183-d440-4f89-887d-d2441d00c8e4-config-data-generated\") pod \"openstack-galera-0\" (UID: \"96a68183-d440-4f89-887d-d2441d00c8e4\") " pod="openstack/openstack-galera-0" Mar 20 08:43:17 crc kubenswrapper[4903]: I0320 08:43:17.741407 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/96a68183-d440-4f89-887d-d2441d00c8e4-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"96a68183-d440-4f89-887d-d2441d00c8e4\") " pod="openstack/openstack-galera-0" Mar 20 08:43:17 crc kubenswrapper[4903]: I0320 08:43:17.741451 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96a68183-d440-4f89-887d-d2441d00c8e4-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"96a68183-d440-4f89-887d-d2441d00c8e4\") " pod="openstack/openstack-galera-0" Mar 20 08:43:17 crc kubenswrapper[4903]: I0320 08:43:17.741480 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/96a68183-d440-4f89-887d-d2441d00c8e4-config-data-default\") pod \"openstack-galera-0\" (UID: \"96a68183-d440-4f89-887d-d2441d00c8e4\") " pod="openstack/openstack-galera-0" Mar 20 08:43:17 crc kubenswrapper[4903]: I0320 08:43:17.741500 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/96a68183-d440-4f89-887d-d2441d00c8e4-kolla-config\") pod \"openstack-galera-0\" (UID: \"96a68183-d440-4f89-887d-d2441d00c8e4\") " pod="openstack/openstack-galera-0" Mar 20 08:43:17 crc kubenswrapper[4903]: I0320 08:43:17.741517 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/96a68183-d440-4f89-887d-d2441d00c8e4-operator-scripts\") pod \"openstack-galera-0\" (UID: \"96a68183-d440-4f89-887d-d2441d00c8e4\") " pod="openstack/openstack-galera-0" Mar 20 08:43:17 crc kubenswrapper[4903]: I0320 08:43:17.741536 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ptgpx\" (UniqueName: \"kubernetes.io/projected/96a68183-d440-4f89-887d-d2441d00c8e4-kube-api-access-ptgpx\") pod \"openstack-galera-0\" (UID: \"96a68183-d440-4f89-887d-d2441d00c8e4\") " pod="openstack/openstack-galera-0" Mar 20 08:43:17 crc kubenswrapper[4903]: I0320 08:43:17.741562 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod 
\"openstack-galera-0\" (UID: \"96a68183-d440-4f89-887d-d2441d00c8e4\") " pod="openstack/openstack-galera-0" Mar 20 08:43:17 crc kubenswrapper[4903]: I0320 08:43:17.741909 4903 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-galera-0\" (UID: \"96a68183-d440-4f89-887d-d2441d00c8e4\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/openstack-galera-0" Mar 20 08:43:17 crc kubenswrapper[4903]: I0320 08:43:17.743165 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/96a68183-d440-4f89-887d-d2441d00c8e4-kolla-config\") pod \"openstack-galera-0\" (UID: \"96a68183-d440-4f89-887d-d2441d00c8e4\") " pod="openstack/openstack-galera-0" Mar 20 08:43:17 crc kubenswrapper[4903]: I0320 08:43:17.743858 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/96a68183-d440-4f89-887d-d2441d00c8e4-operator-scripts\") pod \"openstack-galera-0\" (UID: \"96a68183-d440-4f89-887d-d2441d00c8e4\") " pod="openstack/openstack-galera-0" Mar 20 08:43:17 crc kubenswrapper[4903]: I0320 08:43:17.744336 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/96a68183-d440-4f89-887d-d2441d00c8e4-config-data-default\") pod \"openstack-galera-0\" (UID: \"96a68183-d440-4f89-887d-d2441d00c8e4\") " pod="openstack/openstack-galera-0" Mar 20 08:43:17 crc kubenswrapper[4903]: I0320 08:43:17.760578 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/96a68183-d440-4f89-887d-d2441d00c8e4-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"96a68183-d440-4f89-887d-d2441d00c8e4\") " pod="openstack/openstack-galera-0" Mar 20 08:43:17 crc kubenswrapper[4903]: I0320 08:43:17.760716 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/96a68183-d440-4f89-887d-d2441d00c8e4-config-data-generated\") pod \"openstack-galera-0\" (UID: \"96a68183-d440-4f89-887d-d2441d00c8e4\") " pod="openstack/openstack-galera-0" Mar 20 08:43:17 crc kubenswrapper[4903]: I0320 08:43:17.760754 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96a68183-d440-4f89-887d-d2441d00c8e4-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"96a68183-d440-4f89-887d-d2441d00c8e4\") " pod="openstack/openstack-galera-0" Mar 20 08:43:17 crc kubenswrapper[4903]: I0320 08:43:17.763596 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ptgpx\" (UniqueName: \"kubernetes.io/projected/96a68183-d440-4f89-887d-d2441d00c8e4-kube-api-access-ptgpx\") pod \"openstack-galera-0\" (UID: \"96a68183-d440-4f89-887d-d2441d00c8e4\") " pod="openstack/openstack-galera-0" Mar 20 08:43:17 crc kubenswrapper[4903]: I0320 08:43:17.769970 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-galera-0\" (UID: \"96a68183-d440-4f89-887d-d2441d00c8e4\") " pod="openstack/openstack-galera-0" Mar 20 08:43:17 crc kubenswrapper[4903]: I0320 08:43:17.863014 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Mar 20 08:43:17 crc kubenswrapper[4903]: I0320 08:43:17.936815 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"df937948-08c4-447c-9450-07221ce76552","Type":"ContainerStarted","Data":"91715135ba864eae23740ffcb37edde4b941086e9d6b0fd4623a13a3b8ec6cf0"} Mar 20 08:43:18 crc kubenswrapper[4903]: I0320 08:43:18.434090 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Mar 20 08:43:18 crc kubenswrapper[4903]: W0320 08:43:18.468380 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod96a68183_d440_4f89_887d_d2441d00c8e4.slice/crio-b564313f743e3cc52d8be5ffa4e55cfc64ecfd415feda7073b827864c2307ce8 WatchSource:0}: Error finding container b564313f743e3cc52d8be5ffa4e55cfc64ecfd415feda7073b827864c2307ce8: Status 404 returned error can't find the container with id b564313f743e3cc52d8be5ffa4e55cfc64ecfd415feda7073b827864c2307ce8 Mar 20 08:43:18 crc kubenswrapper[4903]: I0320 08:43:18.963498 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"96a68183-d440-4f89-887d-d2441d00c8e4","Type":"ContainerStarted","Data":"b564313f743e3cc52d8be5ffa4e55cfc64ecfd415feda7073b827864c2307ce8"} Mar 20 08:43:18 crc kubenswrapper[4903]: I0320 08:43:18.997428 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 20 08:43:19 crc kubenswrapper[4903]: I0320 08:43:19.002135 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 20 08:43:19 crc kubenswrapper[4903]: I0320 08:43:19.010958 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Mar 20 08:43:19 crc kubenswrapper[4903]: I0320 08:43:19.011012 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Mar 20 08:43:19 crc kubenswrapper[4903]: I0320 08:43:19.011097 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Mar 20 08:43:19 crc kubenswrapper[4903]: I0320 08:43:19.010962 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-kdtl4" Mar 20 08:43:19 crc kubenswrapper[4903]: I0320 08:43:19.046735 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 20 08:43:19 crc kubenswrapper[4903]: I0320 08:43:19.081250 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/6e4027bc-3929-4b8b-9538-ab67f779558c-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"6e4027bc-3929-4b8b-9538-ab67f779558c\") " pod="openstack/openstack-cell1-galera-0" Mar 20 08:43:19 crc kubenswrapper[4903]: I0320 08:43:19.081295 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dw5bc\" (UniqueName: \"kubernetes.io/projected/6e4027bc-3929-4b8b-9538-ab67f779558c-kube-api-access-dw5bc\") pod \"openstack-cell1-galera-0\" (UID: \"6e4027bc-3929-4b8b-9538-ab67f779558c\") " pod="openstack/openstack-cell1-galera-0" Mar 20 08:43:19 crc kubenswrapper[4903]: I0320 08:43:19.081360 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6e4027bc-3929-4b8b-9538-ab67f779558c-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"6e4027bc-3929-4b8b-9538-ab67f779558c\") " pod="openstack/openstack-cell1-galera-0" Mar 20 08:43:19 crc kubenswrapper[4903]: I0320 08:43:19.081460 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/6e4027bc-3929-4b8b-9538-ab67f779558c-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"6e4027bc-3929-4b8b-9538-ab67f779558c\") " pod="openstack/openstack-cell1-galera-0" Mar 20 08:43:19 crc kubenswrapper[4903]: I0320 08:43:19.081863 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-cell1-galera-0\" (UID: \"6e4027bc-3929-4b8b-9538-ab67f779558c\") " pod="openstack/openstack-cell1-galera-0" Mar 20 08:43:19 crc kubenswrapper[4903]: I0320 08:43:19.081908 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/6e4027bc-3929-4b8b-9538-ab67f779558c-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"6e4027bc-3929-4b8b-9538-ab67f779558c\") " pod="openstack/openstack-cell1-galera-0" Mar 20 08:43:19 crc kubenswrapper[4903]: I0320 08:43:19.081934 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/6e4027bc-3929-4b8b-9538-ab67f779558c-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"6e4027bc-3929-4b8b-9538-ab67f779558c\") " pod="openstack/openstack-cell1-galera-0" Mar 20 08:43:19 crc kubenswrapper[4903]: I0320 08:43:19.082072 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e4027bc-3929-4b8b-9538-ab67f779558c-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"6e4027bc-3929-4b8b-9538-ab67f779558c\") " pod="openstack/openstack-cell1-galera-0" Mar 20 08:43:19 crc kubenswrapper[4903]: I0320 08:43:19.184590 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/6e4027bc-3929-4b8b-9538-ab67f779558c-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"6e4027bc-3929-4b8b-9538-ab67f779558c\") " pod="openstack/openstack-cell1-galera-0" Mar 20 08:43:19 crc kubenswrapper[4903]: I0320 08:43:19.184666 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-cell1-galera-0\" (UID: \"6e4027bc-3929-4b8b-9538-ab67f779558c\") " pod="openstack/openstack-cell1-galera-0" Mar 20 08:43:19 crc kubenswrapper[4903]: I0320 08:43:19.184700 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/6e4027bc-3929-4b8b-9538-ab67f779558c-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"6e4027bc-3929-4b8b-9538-ab67f779558c\") " pod="openstack/openstack-cell1-galera-0" Mar 20 08:43:19 crc kubenswrapper[4903]: I0320 08:43:19.184724 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" 
(UniqueName: \"kubernetes.io/configmap/6e4027bc-3929-4b8b-9538-ab67f779558c-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"6e4027bc-3929-4b8b-9538-ab67f779558c\") " pod="openstack/openstack-cell1-galera-0" Mar 20 08:43:19 crc kubenswrapper[4903]: I0320 08:43:19.184781 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e4027bc-3929-4b8b-9538-ab67f779558c-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"6e4027bc-3929-4b8b-9538-ab67f779558c\") " pod="openstack/openstack-cell1-galera-0" Mar 20 08:43:19 crc kubenswrapper[4903]: I0320 08:43:19.184812 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/6e4027bc-3929-4b8b-9538-ab67f779558c-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"6e4027bc-3929-4b8b-9538-ab67f779558c\") " pod="openstack/openstack-cell1-galera-0" Mar 20 08:43:19 crc kubenswrapper[4903]: I0320 08:43:19.184828 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dw5bc\" (UniqueName: \"kubernetes.io/projected/6e4027bc-3929-4b8b-9538-ab67f779558c-kube-api-access-dw5bc\") pod \"openstack-cell1-galera-0\" (UID: \"6e4027bc-3929-4b8b-9538-ab67f779558c\") " pod="openstack/openstack-cell1-galera-0" Mar 20 08:43:19 crc kubenswrapper[4903]: I0320 08:43:19.184855 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6e4027bc-3929-4b8b-9538-ab67f779558c-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"6e4027bc-3929-4b8b-9538-ab67f779558c\") " pod="openstack/openstack-cell1-galera-0" Mar 20 08:43:19 crc kubenswrapper[4903]: I0320 08:43:19.188737 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Mar 20 08:43:19 crc kubenswrapper[4903]: I0320 08:43:19.189821 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Mar 20 08:43:19 crc kubenswrapper[4903]: I0320 08:43:19.190712 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/6e4027bc-3929-4b8b-9538-ab67f779558c-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"6e4027bc-3929-4b8b-9538-ab67f779558c\") " pod="openstack/openstack-cell1-galera-0" Mar 20 08:43:19 crc kubenswrapper[4903]: I0320 08:43:19.193336 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/6e4027bc-3929-4b8b-9538-ab67f779558c-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"6e4027bc-3929-4b8b-9538-ab67f779558c\") " pod="openstack/openstack-cell1-galera-0" Mar 20 08:43:19 crc kubenswrapper[4903]: I0320 08:43:19.198162 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e4027bc-3929-4b8b-9538-ab67f779558c-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"6e4027bc-3929-4b8b-9538-ab67f779558c\") " pod="openstack/openstack-cell1-galera-0" Mar 20 08:43:19 crc kubenswrapper[4903]: I0320 08:43:19.198607 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-ldvfc" Mar 20 08:43:19 crc kubenswrapper[4903]: I0320 08:43:19.198782 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Mar 20 08:43:19 crc kubenswrapper[4903]: I0320 08:43:19.199389 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Mar 20 08:43:19 crc kubenswrapper[4903]: I0320 08:43:19.199806 4903 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-cell1-galera-0\" (UID: \"6e4027bc-3929-4b8b-9538-ab67f779558c\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/openstack-cell1-galera-0" Mar 20 08:43:19 crc kubenswrapper[4903]: I0320 08:43:19.202944 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/6e4027bc-3929-4b8b-9538-ab67f779558c-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"6e4027bc-3929-4b8b-9538-ab67f779558c\") " pod="openstack/openstack-cell1-galera-0" Mar 20 08:43:19 crc kubenswrapper[4903]: I0320 08:43:19.214936 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6e4027bc-3929-4b8b-9538-ab67f779558c-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"6e4027bc-3929-4b8b-9538-ab67f779558c\") " pod="openstack/openstack-cell1-galera-0" Mar 20 08:43:19 crc kubenswrapper[4903]: I0320 08:43:19.221546 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Mar 20 08:43:19 crc kubenswrapper[4903]: I0320 08:43:19.224159 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dw5bc\" (UniqueName: \"kubernetes.io/projected/6e4027bc-3929-4b8b-9538-ab67f779558c-kube-api-access-dw5bc\") pod \"openstack-cell1-galera-0\" (UID: \"6e4027bc-3929-4b8b-9538-ab67f779558c\") " pod="openstack/openstack-cell1-galera-0" Mar 20 08:43:19 crc kubenswrapper[4903]: I0320 08:43:19.229287 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/6e4027bc-3929-4b8b-9538-ab67f779558c-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"6e4027bc-3929-4b8b-9538-ab67f779558c\") " pod="openstack/openstack-cell1-galera-0" Mar 20 08:43:19 crc kubenswrapper[4903]: I0320 08:43:19.243882 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-cell1-galera-0\" (UID: \"6e4027bc-3929-4b8b-9538-ab67f779558c\") " pod="openstack/openstack-cell1-galera-0" Mar 20 08:43:19 crc kubenswrapper[4903]: I0320 08:43:19.294683 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/34de9984-0547-4ba1-ae7d-5f8cc9196c26-kolla-config\") pod \"memcached-0\" (UID: \"34de9984-0547-4ba1-ae7d-5f8cc9196c26\") " pod="openstack/memcached-0" Mar 20 08:43:19 crc kubenswrapper[4903]: I0320 08:43:19.294730 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/34de9984-0547-4ba1-ae7d-5f8cc9196c26-config-data\") pod \"memcached-0\" (UID: \"34de9984-0547-4ba1-ae7d-5f8cc9196c26\") " pod="openstack/memcached-0" Mar 20 08:43:19 crc kubenswrapper[4903]: I0320 08:43:19.294763 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sfjw5\" (UniqueName: \"kubernetes.io/projected/34de9984-0547-4ba1-ae7d-5f8cc9196c26-kube-api-access-sfjw5\") pod \"memcached-0\" (UID: \"34de9984-0547-4ba1-ae7d-5f8cc9196c26\") " pod="openstack/memcached-0" Mar 20 08:43:19 crc kubenswrapper[4903]: I0320 08:43:19.294794 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34de9984-0547-4ba1-ae7d-5f8cc9196c26-combined-ca-bundle\") pod \"memcached-0\" (UID: \"34de9984-0547-4ba1-ae7d-5f8cc9196c26\") " pod="openstack/memcached-0" Mar 20 08:43:19 crc kubenswrapper[4903]: I0320 08:43:19.294846 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/34de9984-0547-4ba1-ae7d-5f8cc9196c26-memcached-tls-certs\") pod \"memcached-0\" (UID: \"34de9984-0547-4ba1-ae7d-5f8cc9196c26\") " pod="openstack/memcached-0" Mar 20 08:43:19 crc kubenswrapper[4903]: I0320 08:43:19.340848 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 20 08:43:19 crc kubenswrapper[4903]: I0320 08:43:19.396425 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/34de9984-0547-4ba1-ae7d-5f8cc9196c26-kolla-config\") pod \"memcached-0\" (UID: \"34de9984-0547-4ba1-ae7d-5f8cc9196c26\") " pod="openstack/memcached-0" Mar 20 08:43:19 crc kubenswrapper[4903]: I0320 08:43:19.396480 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/34de9984-0547-4ba1-ae7d-5f8cc9196c26-config-data\") pod \"memcached-0\" (UID: \"34de9984-0547-4ba1-ae7d-5f8cc9196c26\") " pod="openstack/memcached-0" Mar 20 08:43:19 crc kubenswrapper[4903]: I0320 08:43:19.396520 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sfjw5\" (UniqueName: \"kubernetes.io/projected/34de9984-0547-4ba1-ae7d-5f8cc9196c26-kube-api-access-sfjw5\") pod \"memcached-0\" (UID: \"34de9984-0547-4ba1-ae7d-5f8cc9196c26\") " pod="openstack/memcached-0" Mar 20 08:43:19 crc kubenswrapper[4903]: I0320 08:43:19.396560 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34de9984-0547-4ba1-ae7d-5f8cc9196c26-combined-ca-bundle\") pod \"memcached-0\" (UID: \"34de9984-0547-4ba1-ae7d-5f8cc9196c26\") " pod="openstack/memcached-0" Mar 20 08:43:19 crc kubenswrapper[4903]: I0320 08:43:19.396623 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/34de9984-0547-4ba1-ae7d-5f8cc9196c26-memcached-tls-certs\") pod \"memcached-0\" (UID: \"34de9984-0547-4ba1-ae7d-5f8cc9196c26\") " pod="openstack/memcached-0" Mar 20 08:43:19 crc kubenswrapper[4903]: I0320 08:43:19.398320 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/34de9984-0547-4ba1-ae7d-5f8cc9196c26-kolla-config\") pod \"memcached-0\" (UID: \"34de9984-0547-4ba1-ae7d-5f8cc9196c26\") " pod="openstack/memcached-0" Mar 20 08:43:19 crc kubenswrapper[4903]: I0320 08:43:19.400617 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/34de9984-0547-4ba1-ae7d-5f8cc9196c26-memcached-tls-certs\") pod \"memcached-0\" (UID: \"34de9984-0547-4ba1-ae7d-5f8cc9196c26\") " pod="openstack/memcached-0" Mar 20 08:43:19 crc kubenswrapper[4903]: I0320 08:43:19.401808 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34de9984-0547-4ba1-ae7d-5f8cc9196c26-combined-ca-bundle\") pod \"memcached-0\" (UID: \"34de9984-0547-4ba1-ae7d-5f8cc9196c26\") " pod="openstack/memcached-0" Mar 20 08:43:19 crc kubenswrapper[4903]: I0320 08:43:19.414215 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/34de9984-0547-4ba1-ae7d-5f8cc9196c26-config-data\") pod \"memcached-0\" (UID: \"34de9984-0547-4ba1-ae7d-5f8cc9196c26\") " pod="openstack/memcached-0" Mar 20 08:43:19 crc kubenswrapper[4903]: I0320 08:43:19.419719 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sfjw5\" (UniqueName: \"kubernetes.io/projected/34de9984-0547-4ba1-ae7d-5f8cc9196c26-kube-api-access-sfjw5\") pod \"memcached-0\" (UID: 
\"34de9984-0547-4ba1-ae7d-5f8cc9196c26\") " pod="openstack/memcached-0" Mar 20 08:43:19 crc kubenswrapper[4903]: I0320 08:43:19.601241 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Mar 20 08:43:21 crc kubenswrapper[4903]: I0320 08:43:21.608989 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Mar 20 08:43:21 crc kubenswrapper[4903]: I0320 08:43:21.610102 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 20 08:43:21 crc kubenswrapper[4903]: I0320 08:43:21.617829 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-pjzxg" Mar 20 08:43:21 crc kubenswrapper[4903]: I0320 08:43:21.618517 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 20 08:43:21 crc kubenswrapper[4903]: I0320 08:43:21.752562 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9wv26\" (UniqueName: \"kubernetes.io/projected/c68a4ea3-7086-430c-8a78-a4c06ed7280c-kube-api-access-9wv26\") pod \"kube-state-metrics-0\" (UID: \"c68a4ea3-7086-430c-8a78-a4c06ed7280c\") " pod="openstack/kube-state-metrics-0" Mar 20 08:43:21 crc kubenswrapper[4903]: I0320 08:43:21.854402 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9wv26\" (UniqueName: \"kubernetes.io/projected/c68a4ea3-7086-430c-8a78-a4c06ed7280c-kube-api-access-9wv26\") pod \"kube-state-metrics-0\" (UID: \"c68a4ea3-7086-430c-8a78-a4c06ed7280c\") " pod="openstack/kube-state-metrics-0" Mar 20 08:43:21 crc kubenswrapper[4903]: I0320 08:43:21.878759 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9wv26\" (UniqueName: \"kubernetes.io/projected/c68a4ea3-7086-430c-8a78-a4c06ed7280c-kube-api-access-9wv26\") pod \"kube-state-metrics-0\" (UID: \"c68a4ea3-7086-430c-8a78-a4c06ed7280c\") " pod="openstack/kube-state-metrics-0" Mar 20 08:43:21 crc kubenswrapper[4903]: I0320 08:43:21.951123 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 20 08:43:25 crc kubenswrapper[4903]: I0320 08:43:25.405773 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 20 08:43:25 crc kubenswrapper[4903]: I0320 08:43:25.408785 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 20 08:43:25 crc kubenswrapper[4903]: I0320 08:43:25.414420 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Mar 20 08:43:25 crc kubenswrapper[4903]: I0320 08:43:25.414619 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Mar 20 08:43:25 crc kubenswrapper[4903]: I0320 08:43:25.418497 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Mar 20 08:43:25 crc kubenswrapper[4903]: I0320 08:43:25.419068 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-7zgxt" Mar 20 08:43:25 crc kubenswrapper[4903]: I0320 08:43:25.421108 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 20 08:43:25 crc kubenswrapper[4903]: I0320 08:43:25.421815 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Mar 20 08:43:25 crc kubenswrapper[4903]: I0320 08:43:25.519932 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/8639b665-721c-4dda-afe9-6e84f6f8a574-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"8639b665-721c-4dda-afe9-6e84f6f8a574\") " pod="openstack/ovsdbserver-sb-0" Mar 20 08:43:25 crc kubenswrapper[4903]: I0320 08:43:25.520004 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8639b665-721c-4dda-afe9-6e84f6f8a574-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"8639b665-721c-4dda-afe9-6e84f6f8a574\") " pod="openstack/ovsdbserver-sb-0" Mar 20 08:43:25 crc kubenswrapper[4903]: I0320 08:43:25.520079 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"ovsdbserver-sb-0\" (UID: \"8639b665-721c-4dda-afe9-6e84f6f8a574\") " pod="openstack/ovsdbserver-sb-0" Mar 20 08:43:25 crc kubenswrapper[4903]: I0320 08:43:25.520121 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rk8ft\" (UniqueName: \"kubernetes.io/projected/8639b665-721c-4dda-afe9-6e84f6f8a574-kube-api-access-rk8ft\") pod \"ovsdbserver-sb-0\" (UID: \"8639b665-721c-4dda-afe9-6e84f6f8a574\") " pod="openstack/ovsdbserver-sb-0" Mar 20 08:43:25 crc kubenswrapper[4903]: I0320 08:43:25.520158 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8639b665-721c-4dda-afe9-6e84f6f8a574-config\") pod \"ovsdbserver-sb-0\" (UID: \"8639b665-721c-4dda-afe9-6e84f6f8a574\") " pod="openstack/ovsdbserver-sb-0" Mar 20 08:43:25 crc kubenswrapper[4903]: I0320 08:43:25.520208 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/8639b665-721c-4dda-afe9-6e84f6f8a574-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"8639b665-721c-4dda-afe9-6e84f6f8a574\") " pod="openstack/ovsdbserver-sb-0" Mar 20 08:43:25 crc kubenswrapper[4903]: I0320 08:43:25.520502 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/8639b665-721c-4dda-afe9-6e84f6f8a574-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"8639b665-721c-4dda-afe9-6e84f6f8a574\") " pod="openstack/ovsdbserver-sb-0" Mar 20 08:43:25 crc kubenswrapper[4903]: I0320 08:43:25.520708 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8639b665-721c-4dda-afe9-6e84f6f8a574-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"8639b665-721c-4dda-afe9-6e84f6f8a574\") " pod="openstack/ovsdbserver-sb-0" Mar 20 08:43:25 crc kubenswrapper[4903]: I0320 08:43:25.622408 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/8639b665-721c-4dda-afe9-6e84f6f8a574-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"8639b665-721c-4dda-afe9-6e84f6f8a574\") " pod="openstack/ovsdbserver-sb-0" Mar 20 08:43:25 crc kubenswrapper[4903]: I0320 08:43:25.622498 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/8639b665-721c-4dda-afe9-6e84f6f8a574-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"8639b665-721c-4dda-afe9-6e84f6f8a574\") " pod="openstack/ovsdbserver-sb-0" Mar 20 08:43:25 crc kubenswrapper[4903]: I0320 08:43:25.622554 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8639b665-721c-4dda-afe9-6e84f6f8a574-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"8639b665-721c-4dda-afe9-6e84f6f8a574\") " pod="openstack/ovsdbserver-sb-0" Mar 20 08:43:25 crc kubenswrapper[4903]: I0320 08:43:25.622607 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/8639b665-721c-4dda-afe9-6e84f6f8a574-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"8639b665-721c-4dda-afe9-6e84f6f8a574\") " pod="openstack/ovsdbserver-sb-0" Mar 20 08:43:25 crc kubenswrapper[4903]: I0320 08:43:25.622641 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8639b665-721c-4dda-afe9-6e84f6f8a574-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"8639b665-721c-4dda-afe9-6e84f6f8a574\") " pod="openstack/ovsdbserver-sb-0" Mar 20 08:43:25 crc kubenswrapper[4903]: I0320 08:43:25.622665 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"ovsdbserver-sb-0\" (UID: \"8639b665-721c-4dda-afe9-6e84f6f8a574\") " pod="openstack/ovsdbserver-sb-0" Mar 20 08:43:25 crc kubenswrapper[4903]: I0320 08:43:25.622681 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rk8ft\" (UniqueName: \"kubernetes.io/projected/8639b665-721c-4dda-afe9-6e84f6f8a574-kube-api-access-rk8ft\") pod \"ovsdbserver-sb-0\" (UID: \"8639b665-721c-4dda-afe9-6e84f6f8a574\") " pod="openstack/ovsdbserver-sb-0" Mar 20 08:43:25 crc kubenswrapper[4903]: I0320 08:43:25.622699 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8639b665-721c-4dda-afe9-6e84f6f8a574-config\") pod \"ovsdbserver-sb-0\" (UID: \"8639b665-721c-4dda-afe9-6e84f6f8a574\") " pod="openstack/ovsdbserver-sb-0" Mar 20 08:43:25 crc 
kubenswrapper[4903]: I0320 08:43:25.623599 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8639b665-721c-4dda-afe9-6e84f6f8a574-config\") pod \"ovsdbserver-sb-0\" (UID: \"8639b665-721c-4dda-afe9-6e84f6f8a574\") " pod="openstack/ovsdbserver-sb-0" Mar 20 08:43:25 crc kubenswrapper[4903]: I0320 08:43:25.624805 4903 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"ovsdbserver-sb-0\" (UID: \"8639b665-721c-4dda-afe9-6e84f6f8a574\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/ovsdbserver-sb-0" Mar 20 08:43:25 crc kubenswrapper[4903]: I0320 08:43:25.625273 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/8639b665-721c-4dda-afe9-6e84f6f8a574-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"8639b665-721c-4dda-afe9-6e84f6f8a574\") " pod="openstack/ovsdbserver-sb-0" Mar 20 08:43:25 crc kubenswrapper[4903]: I0320 08:43:25.625925 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8639b665-721c-4dda-afe9-6e84f6f8a574-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"8639b665-721c-4dda-afe9-6e84f6f8a574\") " pod="openstack/ovsdbserver-sb-0" Mar 20 08:43:25 crc kubenswrapper[4903]: I0320 08:43:25.639051 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/8639b665-721c-4dda-afe9-6e84f6f8a574-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"8639b665-721c-4dda-afe9-6e84f6f8a574\") " pod="openstack/ovsdbserver-sb-0" Mar 20 08:43:25 crc kubenswrapper[4903]: I0320 08:43:25.639616 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/8639b665-721c-4dda-afe9-6e84f6f8a574-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"8639b665-721c-4dda-afe9-6e84f6f8a574\") " pod="openstack/ovsdbserver-sb-0" Mar 20 08:43:25 crc kubenswrapper[4903]: I0320 08:43:25.643971 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8639b665-721c-4dda-afe9-6e84f6f8a574-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"8639b665-721c-4dda-afe9-6e84f6f8a574\") " pod="openstack/ovsdbserver-sb-0" Mar 20 08:43:25 crc kubenswrapper[4903]: I0320 08:43:25.681947 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rk8ft\" (UniqueName: \"kubernetes.io/projected/8639b665-721c-4dda-afe9-6e84f6f8a574-kube-api-access-rk8ft\") pod \"ovsdbserver-sb-0\" (UID: \"8639b665-721c-4dda-afe9-6e84f6f8a574\") " pod="openstack/ovsdbserver-sb-0" Mar 20 08:43:25 crc kubenswrapper[4903]: I0320 08:43:25.755315 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"ovsdbserver-sb-0\" (UID: \"8639b665-721c-4dda-afe9-6e84f6f8a574\") " pod="openstack/ovsdbserver-sb-0" Mar 20 08:43:26 crc kubenswrapper[4903]: I0320 08:43:26.044379 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 20 08:43:26 crc kubenswrapper[4903]: I0320 08:43:26.678773 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-wdtrn"] Mar 20 08:43:26 crc kubenswrapper[4903]: I0320 08:43:26.681993 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-wdtrn" Mar 20 08:43:26 crc kubenswrapper[4903]: I0320 08:43:26.684544 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-dcm5c" Mar 20 08:43:26 crc kubenswrapper[4903]: I0320 08:43:26.684797 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Mar 20 08:43:26 crc kubenswrapper[4903]: I0320 08:43:26.684922 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Mar 20 08:43:26 crc kubenswrapper[4903]: I0320 08:43:26.691102 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-wdtrn"] Mar 20 08:43:26 crc kubenswrapper[4903]: I0320 08:43:26.744237 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-chrhv"] Mar 20 08:43:26 crc kubenswrapper[4903]: I0320 08:43:26.747218 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-chrhv" Mar 20 08:43:26 crc kubenswrapper[4903]: I0320 08:43:26.770742 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-chrhv"] Mar 20 08:43:26 crc kubenswrapper[4903]: I0320 08:43:26.788345 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/7bbbd0a7-f915-4197-bde8-4f96590c454f-var-run\") pod \"ovn-controller-wdtrn\" (UID: \"7bbbd0a7-f915-4197-bde8-4f96590c454f\") " pod="openstack/ovn-controller-wdtrn" Mar 20 08:43:26 crc kubenswrapper[4903]: I0320 08:43:26.788430 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/7bbbd0a7-f915-4197-bde8-4f96590c454f-var-log-ovn\") pod \"ovn-controller-wdtrn\" (UID: \"7bbbd0a7-f915-4197-bde8-4f96590c454f\") " pod="openstack/ovn-controller-wdtrn" Mar 20 08:43:26 crc kubenswrapper[4903]: I0320 08:43:26.788502 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nv6r7\" (UniqueName: \"kubernetes.io/projected/7bbbd0a7-f915-4197-bde8-4f96590c454f-kube-api-access-nv6r7\") pod \"ovn-controller-wdtrn\" (UID: \"7bbbd0a7-f915-4197-bde8-4f96590c454f\") " pod="openstack/ovn-controller-wdtrn" Mar 20 08:43:26 crc kubenswrapper[4903]: I0320 08:43:26.788541 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/7bbbd0a7-f915-4197-bde8-4f96590c454f-ovn-controller-tls-certs\") pod \"ovn-controller-wdtrn\" (UID: \"7bbbd0a7-f915-4197-bde8-4f96590c454f\") " pod="openstack/ovn-controller-wdtrn" Mar 20 08:43:26 crc kubenswrapper[4903]: I0320 08:43:26.788568 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7bbbd0a7-f915-4197-bde8-4f96590c454f-scripts\") pod \"ovn-controller-wdtrn\" (UID: \"7bbbd0a7-f915-4197-bde8-4f96590c454f\") " pod="openstack/ovn-controller-wdtrn" Mar 20 08:43:26 crc 
kubenswrapper[4903]: I0320 08:43:26.788600 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/7bbbd0a7-f915-4197-bde8-4f96590c454f-var-run-ovn\") pod \"ovn-controller-wdtrn\" (UID: \"7bbbd0a7-f915-4197-bde8-4f96590c454f\") " pod="openstack/ovn-controller-wdtrn" Mar 20 08:43:26 crc kubenswrapper[4903]: I0320 08:43:26.788617 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7bbbd0a7-f915-4197-bde8-4f96590c454f-combined-ca-bundle\") pod \"ovn-controller-wdtrn\" (UID: \"7bbbd0a7-f915-4197-bde8-4f96590c454f\") " pod="openstack/ovn-controller-wdtrn" Mar 20 08:43:26 crc kubenswrapper[4903]: I0320 08:43:26.889910 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/7bbbd0a7-f915-4197-bde8-4f96590c454f-ovn-controller-tls-certs\") pod \"ovn-controller-wdtrn\" (UID: \"7bbbd0a7-f915-4197-bde8-4f96590c454f\") " pod="openstack/ovn-controller-wdtrn" Mar 20 08:43:26 crc kubenswrapper[4903]: I0320 08:43:26.889977 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d9v2l\" (UniqueName: \"kubernetes.io/projected/d69915e4-0df8-4d83-b096-962eadc1883f-kube-api-access-d9v2l\") pod \"ovn-controller-ovs-chrhv\" (UID: \"d69915e4-0df8-4d83-b096-962eadc1883f\") " pod="openstack/ovn-controller-ovs-chrhv" Mar 20 08:43:26 crc kubenswrapper[4903]: I0320 08:43:26.890017 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/7bbbd0a7-f915-4197-bde8-4f96590c454f-var-log-ovn\") pod \"ovn-controller-wdtrn\" (UID: \"7bbbd0a7-f915-4197-bde8-4f96590c454f\") " pod="openstack/ovn-controller-wdtrn" Mar 20 08:43:26 crc kubenswrapper[4903]: I0320 08:43:26.890057 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/d69915e4-0df8-4d83-b096-962eadc1883f-var-run\") pod \"ovn-controller-ovs-chrhv\" (UID: \"d69915e4-0df8-4d83-b096-962eadc1883f\") " pod="openstack/ovn-controller-ovs-chrhv" Mar 20 08:43:26 crc kubenswrapper[4903]: I0320 08:43:26.890094 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d69915e4-0df8-4d83-b096-962eadc1883f-scripts\") pod \"ovn-controller-ovs-chrhv\" (UID: \"d69915e4-0df8-4d83-b096-962eadc1883f\") " pod="openstack/ovn-controller-ovs-chrhv" Mar 20 08:43:26 crc kubenswrapper[4903]: I0320 08:43:26.890111 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/d69915e4-0df8-4d83-b096-962eadc1883f-var-log\") pod \"ovn-controller-ovs-chrhv\" (UID: \"d69915e4-0df8-4d83-b096-962eadc1883f\") " pod="openstack/ovn-controller-ovs-chrhv" Mar 20 08:43:26 crc kubenswrapper[4903]: I0320 08:43:26.890331 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/d69915e4-0df8-4d83-b096-962eadc1883f-var-lib\") pod \"ovn-controller-ovs-chrhv\" (UID: \"d69915e4-0df8-4d83-b096-962eadc1883f\") " pod="openstack/ovn-controller-ovs-chrhv" Mar 20 08:43:26 crc kubenswrapper[4903]: I0320 08:43:26.890416 4903 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7bbbd0a7-f915-4197-bde8-4f96590c454f-scripts\") pod \"ovn-controller-wdtrn\" (UID: \"7bbbd0a7-f915-4197-bde8-4f96590c454f\") " pod="openstack/ovn-controller-wdtrn" Mar 20 08:43:26 crc kubenswrapper[4903]: I0320 08:43:26.890471 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/d69915e4-0df8-4d83-b096-962eadc1883f-etc-ovs\") pod \"ovn-controller-ovs-chrhv\" (UID: \"d69915e4-0df8-4d83-b096-962eadc1883f\") " pod="openstack/ovn-controller-ovs-chrhv" Mar 20 08:43:26 crc kubenswrapper[4903]: I0320 08:43:26.890521 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/7bbbd0a7-f915-4197-bde8-4f96590c454f-var-run-ovn\") pod \"ovn-controller-wdtrn\" (UID: \"7bbbd0a7-f915-4197-bde8-4f96590c454f\") " pod="openstack/ovn-controller-wdtrn" Mar 20 08:43:26 crc kubenswrapper[4903]: I0320 08:43:26.890580 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/7bbbd0a7-f915-4197-bde8-4f96590c454f-var-log-ovn\") pod \"ovn-controller-wdtrn\" (UID: \"7bbbd0a7-f915-4197-bde8-4f96590c454f\") " pod="openstack/ovn-controller-wdtrn" Mar 20 08:43:26 crc kubenswrapper[4903]: I0320 08:43:26.890596 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7bbbd0a7-f915-4197-bde8-4f96590c454f-combined-ca-bundle\") pod \"ovn-controller-wdtrn\" (UID: \"7bbbd0a7-f915-4197-bde8-4f96590c454f\") " pod="openstack/ovn-controller-wdtrn" Mar 20 08:43:26 crc kubenswrapper[4903]: I0320 08:43:26.890691 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/7bbbd0a7-f915-4197-bde8-4f96590c454f-var-run\") pod \"ovn-controller-wdtrn\" (UID: \"7bbbd0a7-f915-4197-bde8-4f96590c454f\") " pod="openstack/ovn-controller-wdtrn" Mar 20 08:43:26 crc kubenswrapper[4903]: I0320 08:43:26.890863 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nv6r7\" (UniqueName: \"kubernetes.io/projected/7bbbd0a7-f915-4197-bde8-4f96590c454f-kube-api-access-nv6r7\") pod \"ovn-controller-wdtrn\" (UID: \"7bbbd0a7-f915-4197-bde8-4f96590c454f\") " pod="openstack/ovn-controller-wdtrn" Mar 20 08:43:26 crc kubenswrapper[4903]: I0320 08:43:26.891346 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/7bbbd0a7-f915-4197-bde8-4f96590c454f-var-run\") pod \"ovn-controller-wdtrn\" (UID: \"7bbbd0a7-f915-4197-bde8-4f96590c454f\") " pod="openstack/ovn-controller-wdtrn" Mar 20 08:43:26 crc kubenswrapper[4903]: I0320 08:43:26.891456 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/7bbbd0a7-f915-4197-bde8-4f96590c454f-var-run-ovn\") pod \"ovn-controller-wdtrn\" (UID: \"7bbbd0a7-f915-4197-bde8-4f96590c454f\") " pod="openstack/ovn-controller-wdtrn" Mar 20 08:43:26 crc kubenswrapper[4903]: I0320 08:43:26.893559 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7bbbd0a7-f915-4197-bde8-4f96590c454f-scripts\") pod \"ovn-controller-wdtrn\" (UID: \"7bbbd0a7-f915-4197-bde8-4f96590c454f\") " 
pod="openstack/ovn-controller-wdtrn" Mar 20 08:43:26 crc kubenswrapper[4903]: I0320 08:43:26.895658 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/7bbbd0a7-f915-4197-bde8-4f96590c454f-ovn-controller-tls-certs\") pod \"ovn-controller-wdtrn\" (UID: \"7bbbd0a7-f915-4197-bde8-4f96590c454f\") " pod="openstack/ovn-controller-wdtrn" Mar 20 08:43:26 crc kubenswrapper[4903]: I0320 08:43:26.896526 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7bbbd0a7-f915-4197-bde8-4f96590c454f-combined-ca-bundle\") pod \"ovn-controller-wdtrn\" (UID: \"7bbbd0a7-f915-4197-bde8-4f96590c454f\") " pod="openstack/ovn-controller-wdtrn" Mar 20 08:43:26 crc kubenswrapper[4903]: I0320 08:43:26.906330 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nv6r7\" (UniqueName: \"kubernetes.io/projected/7bbbd0a7-f915-4197-bde8-4f96590c454f-kube-api-access-nv6r7\") pod \"ovn-controller-wdtrn\" (UID: \"7bbbd0a7-f915-4197-bde8-4f96590c454f\") " pod="openstack/ovn-controller-wdtrn" Mar 20 08:43:26 crc kubenswrapper[4903]: I0320 08:43:26.992812 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d9v2l\" (UniqueName: \"kubernetes.io/projected/d69915e4-0df8-4d83-b096-962eadc1883f-kube-api-access-d9v2l\") pod \"ovn-controller-ovs-chrhv\" (UID: \"d69915e4-0df8-4d83-b096-962eadc1883f\") " pod="openstack/ovn-controller-ovs-chrhv" Mar 20 08:43:26 crc kubenswrapper[4903]: I0320 08:43:26.992882 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/d69915e4-0df8-4d83-b096-962eadc1883f-var-run\") pod \"ovn-controller-ovs-chrhv\" (UID: \"d69915e4-0df8-4d83-b096-962eadc1883f\") " pod="openstack/ovn-controller-ovs-chrhv" Mar 20 08:43:26 crc kubenswrapper[4903]: I0320 08:43:26.992924 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d69915e4-0df8-4d83-b096-962eadc1883f-scripts\") pod \"ovn-controller-ovs-chrhv\" (UID: \"d69915e4-0df8-4d83-b096-962eadc1883f\") " pod="openstack/ovn-controller-ovs-chrhv" Mar 20 08:43:26 crc kubenswrapper[4903]: I0320 08:43:26.992942 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/d69915e4-0df8-4d83-b096-962eadc1883f-var-log\") pod \"ovn-controller-ovs-chrhv\" (UID: \"d69915e4-0df8-4d83-b096-962eadc1883f\") " pod="openstack/ovn-controller-ovs-chrhv" Mar 20 08:43:26 crc kubenswrapper[4903]: I0320 08:43:26.992974 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/d69915e4-0df8-4d83-b096-962eadc1883f-var-lib\") pod \"ovn-controller-ovs-chrhv\" (UID: \"d69915e4-0df8-4d83-b096-962eadc1883f\") " pod="openstack/ovn-controller-ovs-chrhv" Mar 20 08:43:26 crc kubenswrapper[4903]: I0320 08:43:26.992999 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/d69915e4-0df8-4d83-b096-962eadc1883f-etc-ovs\") pod \"ovn-controller-ovs-chrhv\" (UID: \"d69915e4-0df8-4d83-b096-962eadc1883f\") " pod="openstack/ovn-controller-ovs-chrhv" Mar 20 08:43:26 crc kubenswrapper[4903]: I0320 08:43:26.993708 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" 
(UniqueName: \"kubernetes.io/host-path/d69915e4-0df8-4d83-b096-962eadc1883f-var-run\") pod \"ovn-controller-ovs-chrhv\" (UID: \"d69915e4-0df8-4d83-b096-962eadc1883f\") " pod="openstack/ovn-controller-ovs-chrhv" Mar 20 08:43:26 crc kubenswrapper[4903]: I0320 08:43:26.994323 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/d69915e4-0df8-4d83-b096-962eadc1883f-var-log\") pod \"ovn-controller-ovs-chrhv\" (UID: \"d69915e4-0df8-4d83-b096-962eadc1883f\") " pod="openstack/ovn-controller-ovs-chrhv" Mar 20 08:43:26 crc kubenswrapper[4903]: I0320 08:43:26.994566 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/d69915e4-0df8-4d83-b096-962eadc1883f-var-lib\") pod \"ovn-controller-ovs-chrhv\" (UID: \"d69915e4-0df8-4d83-b096-962eadc1883f\") " pod="openstack/ovn-controller-ovs-chrhv" Mar 20 08:43:26 crc kubenswrapper[4903]: I0320 08:43:26.995767 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d69915e4-0df8-4d83-b096-962eadc1883f-scripts\") pod \"ovn-controller-ovs-chrhv\" (UID: \"d69915e4-0df8-4d83-b096-962eadc1883f\") " pod="openstack/ovn-controller-ovs-chrhv" Mar 20 08:43:26 crc kubenswrapper[4903]: I0320 08:43:26.995880 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/d69915e4-0df8-4d83-b096-962eadc1883f-etc-ovs\") pod \"ovn-controller-ovs-chrhv\" (UID: \"d69915e4-0df8-4d83-b096-962eadc1883f\") " pod="openstack/ovn-controller-ovs-chrhv" Mar 20 08:43:27 crc kubenswrapper[4903]: I0320 08:43:27.005405 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-wdtrn" Mar 20 08:43:27 crc kubenswrapper[4903]: I0320 08:43:27.014948 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d9v2l\" (UniqueName: \"kubernetes.io/projected/d69915e4-0df8-4d83-b096-962eadc1883f-kube-api-access-d9v2l\") pod \"ovn-controller-ovs-chrhv\" (UID: \"d69915e4-0df8-4d83-b096-962eadc1883f\") " pod="openstack/ovn-controller-ovs-chrhv" Mar 20 08:43:27 crc kubenswrapper[4903]: I0320 08:43:27.071525 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-chrhv" Mar 20 08:43:28 crc kubenswrapper[4903]: I0320 08:43:28.310850 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 20 08:43:28 crc kubenswrapper[4903]: I0320 08:43:28.312705 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 20 08:43:28 crc kubenswrapper[4903]: I0320 08:43:28.317883 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Mar 20 08:43:28 crc kubenswrapper[4903]: I0320 08:43:28.319085 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-zvmq9" Mar 20 08:43:28 crc kubenswrapper[4903]: I0320 08:43:28.321134 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Mar 20 08:43:28 crc kubenswrapper[4903]: I0320 08:43:28.321444 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 20 08:43:28 crc kubenswrapper[4903]: I0320 08:43:28.324988 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Mar 20 08:43:28 crc kubenswrapper[4903]: I0320 08:43:28.432155 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"ovsdbserver-nb-0\" (UID: \"548096cf-0b33-4f2f-b8be-7d1ac859cf7c\") " pod="openstack/ovsdbserver-nb-0" Mar 20 08:43:28 crc kubenswrapper[4903]: I0320 08:43:28.432240 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/548096cf-0b33-4f2f-b8be-7d1ac859cf7c-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"548096cf-0b33-4f2f-b8be-7d1ac859cf7c\") " pod="openstack/ovsdbserver-nb-0" Mar 20 08:43:28 crc kubenswrapper[4903]: I0320 08:43:28.432269 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/548096cf-0b33-4f2f-b8be-7d1ac859cf7c-config\") pod \"ovsdbserver-nb-0\" (UID: \"548096cf-0b33-4f2f-b8be-7d1ac859cf7c\") " pod="openstack/ovsdbserver-nb-0" Mar 20 08:43:28 crc kubenswrapper[4903]: I0320 08:43:28.432301 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/548096cf-0b33-4f2f-b8be-7d1ac859cf7c-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"548096cf-0b33-4f2f-b8be-7d1ac859cf7c\") " pod="openstack/ovsdbserver-nb-0" Mar 20 08:43:28 crc kubenswrapper[4903]: I0320 08:43:28.432331 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/548096cf-0b33-4f2f-b8be-7d1ac859cf7c-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"548096cf-0b33-4f2f-b8be-7d1ac859cf7c\") " pod="openstack/ovsdbserver-nb-0" Mar 20 08:43:28 crc kubenswrapper[4903]: I0320 08:43:28.432357 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/548096cf-0b33-4f2f-b8be-7d1ac859cf7c-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"548096cf-0b33-4f2f-b8be-7d1ac859cf7c\") " pod="openstack/ovsdbserver-nb-0" Mar 20 08:43:28 crc kubenswrapper[4903]: I0320 08:43:28.432389 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xddh9\" (UniqueName: \"kubernetes.io/projected/548096cf-0b33-4f2f-b8be-7d1ac859cf7c-kube-api-access-xddh9\") pod \"ovsdbserver-nb-0\" (UID: 
\"548096cf-0b33-4f2f-b8be-7d1ac859cf7c\") " pod="openstack/ovsdbserver-nb-0" Mar 20 08:43:28 crc kubenswrapper[4903]: I0320 08:43:28.432414 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/548096cf-0b33-4f2f-b8be-7d1ac859cf7c-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"548096cf-0b33-4f2f-b8be-7d1ac859cf7c\") " pod="openstack/ovsdbserver-nb-0" Mar 20 08:43:28 crc kubenswrapper[4903]: I0320 08:43:28.536875 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/548096cf-0b33-4f2f-b8be-7d1ac859cf7c-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"548096cf-0b33-4f2f-b8be-7d1ac859cf7c\") " pod="openstack/ovsdbserver-nb-0" Mar 20 08:43:28 crc kubenswrapper[4903]: I0320 08:43:28.536941 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xddh9\" (UniqueName: \"kubernetes.io/projected/548096cf-0b33-4f2f-b8be-7d1ac859cf7c-kube-api-access-xddh9\") pod \"ovsdbserver-nb-0\" (UID: \"548096cf-0b33-4f2f-b8be-7d1ac859cf7c\") " pod="openstack/ovsdbserver-nb-0" Mar 20 08:43:28 crc kubenswrapper[4903]: I0320 08:43:28.536964 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/548096cf-0b33-4f2f-b8be-7d1ac859cf7c-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"548096cf-0b33-4f2f-b8be-7d1ac859cf7c\") " pod="openstack/ovsdbserver-nb-0" Mar 20 08:43:28 crc kubenswrapper[4903]: I0320 08:43:28.536998 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"ovsdbserver-nb-0\" (UID: \"548096cf-0b33-4f2f-b8be-7d1ac859cf7c\") " pod="openstack/ovsdbserver-nb-0" Mar 20 08:43:28 crc kubenswrapper[4903]: I0320 08:43:28.537058 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/548096cf-0b33-4f2f-b8be-7d1ac859cf7c-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"548096cf-0b33-4f2f-b8be-7d1ac859cf7c\") " pod="openstack/ovsdbserver-nb-0" Mar 20 08:43:28 crc kubenswrapper[4903]: I0320 08:43:28.537080 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/548096cf-0b33-4f2f-b8be-7d1ac859cf7c-config\") pod \"ovsdbserver-nb-0\" (UID: \"548096cf-0b33-4f2f-b8be-7d1ac859cf7c\") " pod="openstack/ovsdbserver-nb-0" Mar 20 08:43:28 crc kubenswrapper[4903]: I0320 08:43:28.537110 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/548096cf-0b33-4f2f-b8be-7d1ac859cf7c-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"548096cf-0b33-4f2f-b8be-7d1ac859cf7c\") " pod="openstack/ovsdbserver-nb-0" Mar 20 08:43:28 crc kubenswrapper[4903]: I0320 08:43:28.537137 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/548096cf-0b33-4f2f-b8be-7d1ac859cf7c-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"548096cf-0b33-4f2f-b8be-7d1ac859cf7c\") " pod="openstack/ovsdbserver-nb-0" Mar 20 08:43:28 crc kubenswrapper[4903]: I0320 08:43:28.537869 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: 
\"kubernetes.io/empty-dir/548096cf-0b33-4f2f-b8be-7d1ac859cf7c-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"548096cf-0b33-4f2f-b8be-7d1ac859cf7c\") " pod="openstack/ovsdbserver-nb-0" Mar 20 08:43:28 crc kubenswrapper[4903]: I0320 08:43:28.538277 4903 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"ovsdbserver-nb-0\" (UID: \"548096cf-0b33-4f2f-b8be-7d1ac859cf7c\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/ovsdbserver-nb-0" Mar 20 08:43:28 crc kubenswrapper[4903]: I0320 08:43:28.546933 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/548096cf-0b33-4f2f-b8be-7d1ac859cf7c-config\") pod \"ovsdbserver-nb-0\" (UID: \"548096cf-0b33-4f2f-b8be-7d1ac859cf7c\") " pod="openstack/ovsdbserver-nb-0" Mar 20 08:43:28 crc kubenswrapper[4903]: I0320 08:43:28.548382 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/548096cf-0b33-4f2f-b8be-7d1ac859cf7c-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"548096cf-0b33-4f2f-b8be-7d1ac859cf7c\") " pod="openstack/ovsdbserver-nb-0" Mar 20 08:43:28 crc kubenswrapper[4903]: I0320 08:43:28.550685 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/548096cf-0b33-4f2f-b8be-7d1ac859cf7c-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"548096cf-0b33-4f2f-b8be-7d1ac859cf7c\") " pod="openstack/ovsdbserver-nb-0" Mar 20 08:43:28 crc kubenswrapper[4903]: I0320 08:43:28.553575 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/548096cf-0b33-4f2f-b8be-7d1ac859cf7c-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"548096cf-0b33-4f2f-b8be-7d1ac859cf7c\") " pod="openstack/ovsdbserver-nb-0" Mar 20 08:43:28 crc kubenswrapper[4903]: I0320 08:43:28.561069 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/548096cf-0b33-4f2f-b8be-7d1ac859cf7c-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"548096cf-0b33-4f2f-b8be-7d1ac859cf7c\") " pod="openstack/ovsdbserver-nb-0" Mar 20 08:43:28 crc kubenswrapper[4903]: I0320 08:43:28.564495 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"ovsdbserver-nb-0\" (UID: \"548096cf-0b33-4f2f-b8be-7d1ac859cf7c\") " pod="openstack/ovsdbserver-nb-0" Mar 20 08:43:28 crc kubenswrapper[4903]: I0320 08:43:28.570406 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xddh9\" (UniqueName: \"kubernetes.io/projected/548096cf-0b33-4f2f-b8be-7d1ac859cf7c-kube-api-access-xddh9\") pod \"ovsdbserver-nb-0\" (UID: \"548096cf-0b33-4f2f-b8be-7d1ac859cf7c\") " pod="openstack/ovsdbserver-nb-0" Mar 20 08:43:28 crc kubenswrapper[4903]: I0320 08:43:28.684836 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 20 08:43:32 crc kubenswrapper[4903]: E0320 08:43:32.954768 4903 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-mariadb:current-podified" Mar 20 08:43:32 crc kubenswrapper[4903]: E0320 08:43:32.955289 4903 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:mysql-bootstrap,Image:quay.io/podified-antelope-centos9/openstack-mariadb:current-podified,Command:[bash /var/lib/operator-scripts/mysql_bootstrap.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:True,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:mysql-db,ReadOnly:false,MountPath:/var/lib/mysql,SubPath:mysql,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-default,ReadOnly:true,MountPath:/var/lib/config-data/default,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-generated,ReadOnly:false,MountPath:/var/lib/config-data/generated,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:operator-scripts,ReadOnly:true,MountPath:/var/lib/operator-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kolla-config,ReadOnly:true,MountPath:/var/lib/kolla/config_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ptgpx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openstack-galera-0_openstack(96a68183-d440-4f89-887d-d2441d00c8e4): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 08:43:32 crc kubenswrapper[4903]: E0320 08:43:32.956483 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/openstack-galera-0" podUID="96a68183-d440-4f89-887d-d2441d00c8e4" Mar 20 08:43:33 crc kubenswrapper[4903]: E0320 08:43:33.101864 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-mariadb:current-podified\\\"\"" pod="openstack/openstack-galera-0" podUID="96a68183-d440-4f89-887d-d2441d00c8e4" Mar 20 08:43:36 crc kubenswrapper[4903]: E0320 08:43:36.884404 4903 log.go:32] "PullImage from image service failed" err="rpc error: 
code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified" Mar 20 08:43:36 crc kubenswrapper[4903]: E0320 08:43:36.885411 4903 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:setup-container,Image:quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified,Command:[sh -c cp /tmp/erlang-cookie-secret/.erlang.cookie /var/lib/rabbitmq/.erlang.cookie && chmod 600 /var/lib/rabbitmq/.erlang.cookie ; cp /tmp/rabbitmq-plugins/enabled_plugins /operator/enabled_plugins ; echo '[default]' > /var/lib/rabbitmq/.rabbitmqadmin.conf && sed -e 's/default_user/username/' -e 's/default_pass/password/' /tmp/default_user.conf >> /var/lib/rabbitmq/.rabbitmqadmin.conf && chmod 600 /var/lib/rabbitmq/.rabbitmqadmin.conf ; sleep 30],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:plugins-conf,ReadOnly:false,MountPath:/tmp/rabbitmq-plugins/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-erlang-cookie,ReadOnly:false,MountPath:/var/lib/rabbitmq/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:erlang-cookie-secret,ReadOnly:false,MountPath:/tmp/erlang-cookie-secret/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-plugins,ReadOnly:false,MountPath:/operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:persistence,ReadOnly:false,MountPath:/var/lib/rabbitmq/mnesia/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-confd,ReadOnly:false,MountPath:/tmp/default_user.conf,SubPath:default_user.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-djs78,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cell1-server-0_openstack(df937948-08c4-447c-9450-07221ce76552): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 08:43:36 crc kubenswrapper[4903]: E0320 08:43:36.886644 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/rabbitmq-cell1-server-0" podUID="df937948-08c4-447c-9450-07221ce76552" Mar 20 08:43:36 crc kubenswrapper[4903]: E0320 08:43:36.892198 4903 log.go:32] "PullImage from image service failed" err="rpc error: 
code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified" Mar 20 08:43:36 crc kubenswrapper[4903]: E0320 08:43:36.892767 4903 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:setup-container,Image:quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified,Command:[sh -c cp /tmp/erlang-cookie-secret/.erlang.cookie /var/lib/rabbitmq/.erlang.cookie && chmod 600 /var/lib/rabbitmq/.erlang.cookie ; cp /tmp/rabbitmq-plugins/enabled_plugins /operator/enabled_plugins ; echo '[default]' > /var/lib/rabbitmq/.rabbitmqadmin.conf && sed -e 's/default_user/username/' -e 's/default_pass/password/' /tmp/default_user.conf >> /var/lib/rabbitmq/.rabbitmqadmin.conf && chmod 600 /var/lib/rabbitmq/.rabbitmqadmin.conf ; sleep 30],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:plugins-conf,ReadOnly:false,MountPath:/tmp/rabbitmq-plugins/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-erlang-cookie,ReadOnly:false,MountPath:/var/lib/rabbitmq/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:erlang-cookie-secret,ReadOnly:false,MountPath:/tmp/erlang-cookie-secret/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-plugins,ReadOnly:false,MountPath:/operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:persistence,ReadOnly:false,MountPath:/var/lib/rabbitmq/mnesia/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-confd,ReadOnly:false,MountPath:/tmp/default_user.conf,SubPath:default_user.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-c54mp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-server-0_openstack(888a3fd9-01f8-47b3-b1bb-f2b8b6b96509): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 08:43:36 crc kubenswrapper[4903]: E0320 08:43:36.894951 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/rabbitmq-server-0" podUID="888a3fd9-01f8-47b3-b1bb-f2b8b6b96509" Mar 20 08:43:37 crc kubenswrapper[4903]: E0320 08:43:37.142024 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"setup-container\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified\\\"\"" pod="openstack/rabbitmq-cell1-server-0" podUID="df937948-08c4-447c-9450-07221ce76552" Mar 20 08:43:37 crc kubenswrapper[4903]: E0320 08:43:37.142272 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified\\\"\"" pod="openstack/rabbitmq-server-0" podUID="888a3fd9-01f8-47b3-b1bb-f2b8b6b96509" Mar 20 08:43:42 crc kubenswrapper[4903]: I0320 08:43:42.280254 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 20 08:43:42 crc kubenswrapper[4903]: W0320 08:43:42.814828 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc68a4ea3_7086_430c_8a78_a4c06ed7280c.slice/crio-66b814afd37a94e3be1c157d99f158390e80fa86e45a769f589683a1498e8170 WatchSource:0}: Error finding container 66b814afd37a94e3be1c157d99f158390e80fa86e45a769f589683a1498e8170: Status 404 returned error can't find the container with id 66b814afd37a94e3be1c157d99f158390e80fa86e45a769f589683a1498e8170 Mar 20 08:43:42 crc kubenswrapper[4903]: E0320 08:43:42.838401 4903 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Mar 20 08:43:42 crc kubenswrapper[4903]: E0320 08:43:42.838656 4903 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n659h4h664hbh658h587h67ch89h587h8fh679hc6hf9h55fh644h5d5h698h68dh5cdh5ffh669h54ch9h689hb8hd4h5bfhd8h5d7h5fh665h574q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-tzv9j,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-57d769cc4f-s7lkc_openstack(4b2ae24d-7a7f-450d-b9f0-29773070bfba): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 08:43:42 crc kubenswrapper[4903]: E0320 08:43:42.840211 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-57d769cc4f-s7lkc" podUID="4b2ae24d-7a7f-450d-b9f0-29773070bfba" Mar 20 08:43:42 crc kubenswrapper[4903]: E0320 08:43:42.854559 4903 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Mar 20 08:43:42 crc kubenswrapper[4903]: E0320 08:43:42.854770 4903 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n68chd6h679hbfh55fhc6h5ffh5d8h94h56ch589hb4hc5h57bh677hcdh655h8dh667h675h654h66ch567h8fh659h5b4h675h566h55bh54h67dh6dq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-m4vm7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-666b6646f7-7wjsx_openstack(57f5e312-d8c8-420d-8655-8ace1519bdda): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 08:43:42 crc kubenswrapper[4903]: E0320 08:43:42.856213 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-666b6646f7-7wjsx" podUID="57f5e312-d8c8-420d-8655-8ace1519bdda" Mar 20 08:43:42 crc kubenswrapper[4903]: E0320 08:43:42.869932 4903 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Mar 20 08:43:42 crc kubenswrapper[4903]: E0320 08:43:42.870140 4903 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4cwch,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-59q7v_openstack(2064f5b2-4d54-41b2-a04d-bc35b586cb3c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 08:43:42 crc kubenswrapper[4903]: E0320 08:43:42.871520 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-59q7v" podUID="2064f5b2-4d54-41b2-a04d-bc35b586cb3c" Mar 20 08:43:42 crc kubenswrapper[4903]: E0320 08:43:42.898109 4903 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Mar 20 08:43:42 crc kubenswrapper[4903]: E0320 08:43:42.898258 4903 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-v2p77,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-jntgm_openstack(31d47b29-48d4-4d99-83cd-91e4a383b108): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 08:43:42 crc kubenswrapper[4903]: E0320 08:43:42.901482 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-675f4bcbfc-jntgm" podUID="31d47b29-48d4-4d99-83cd-91e4a383b108" Mar 20 08:43:43 crc kubenswrapper[4903]: E0320 08:43:43.203271 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-57d769cc4f-s7lkc" podUID="4b2ae24d-7a7f-450d-b9f0-29773070bfba" Mar 20 08:43:43 crc kubenswrapper[4903]: E0320 08:43:43.204024 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-666b6646f7-7wjsx" podUID="57f5e312-d8c8-420d-8655-8ace1519bdda" Mar 20 08:43:43 crc kubenswrapper[4903]: I0320 08:43:43.200475 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"c68a4ea3-7086-430c-8a78-a4c06ed7280c","Type":"ContainerStarted","Data":"66b814afd37a94e3be1c157d99f158390e80fa86e45a769f589683a1498e8170"} Mar 20 08:43:43 crc kubenswrapper[4903]: I0320 08:43:43.337962 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 20 08:43:43 crc kubenswrapper[4903]: W0320 08:43:43.352352 4903 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6e4027bc_3929_4b8b_9538_ab67f779558c.slice/crio-fa3b33570168c7b6cfde01bb87b4f92251ca9fed546cc635ba6b71bd9474b4f9 WatchSource:0}: Error finding container fa3b33570168c7b6cfde01bb87b4f92251ca9fed546cc635ba6b71bd9474b4f9: Status 404 returned error can't find the container with id fa3b33570168c7b6cfde01bb87b4f92251ca9fed546cc635ba6b71bd9474b4f9 Mar 20 08:43:43 crc kubenswrapper[4903]: I0320 08:43:43.416971 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Mar 20 08:43:43 crc kubenswrapper[4903]: W0320 08:43:43.506508 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod34de9984_0547_4ba1_ae7d_5f8cc9196c26.slice/crio-2c4393d7e307ba72b53c3157d0aaef61008e16bbb978c185a6b2752dec453422 WatchSource:0}: Error finding container 2c4393d7e307ba72b53c3157d0aaef61008e16bbb978c185a6b2752dec453422: Status 404 returned error can't find the container with id 2c4393d7e307ba72b53c3157d0aaef61008e16bbb978c185a6b2752dec453422 Mar 20 08:43:43 crc kubenswrapper[4903]: I0320 08:43:43.574779 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 20 08:43:43 crc kubenswrapper[4903]: I0320 08:43:43.627133 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-wdtrn"] Mar 20 08:43:43 crc kubenswrapper[4903]: I0320 08:43:43.838266 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-59q7v" Mar 20 08:43:43 crc kubenswrapper[4903]: I0320 08:43:43.842731 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2064f5b2-4d54-41b2-a04d-bc35b586cb3c-dns-svc\") pod \"2064f5b2-4d54-41b2-a04d-bc35b586cb3c\" (UID: \"2064f5b2-4d54-41b2-a04d-bc35b586cb3c\") " Mar 20 08:43:43 crc kubenswrapper[4903]: I0320 08:43:43.842832 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-jntgm" Mar 20 08:43:43 crc kubenswrapper[4903]: I0320 08:43:43.842857 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2064f5b2-4d54-41b2-a04d-bc35b586cb3c-config\") pod \"2064f5b2-4d54-41b2-a04d-bc35b586cb3c\" (UID: \"2064f5b2-4d54-41b2-a04d-bc35b586cb3c\") " Mar 20 08:43:43 crc kubenswrapper[4903]: I0320 08:43:43.843148 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4cwch\" (UniqueName: \"kubernetes.io/projected/2064f5b2-4d54-41b2-a04d-bc35b586cb3c-kube-api-access-4cwch\") pod \"2064f5b2-4d54-41b2-a04d-bc35b586cb3c\" (UID: \"2064f5b2-4d54-41b2-a04d-bc35b586cb3c\") " Mar 20 08:43:43 crc kubenswrapper[4903]: I0320 08:43:43.843461 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2064f5b2-4d54-41b2-a04d-bc35b586cb3c-config" (OuterVolumeSpecName: "config") pod "2064f5b2-4d54-41b2-a04d-bc35b586cb3c" (UID: "2064f5b2-4d54-41b2-a04d-bc35b586cb3c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:43:43 crc kubenswrapper[4903]: I0320 08:43:43.843474 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2064f5b2-4d54-41b2-a04d-bc35b586cb3c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2064f5b2-4d54-41b2-a04d-bc35b586cb3c" (UID: "2064f5b2-4d54-41b2-a04d-bc35b586cb3c"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:43:43 crc kubenswrapper[4903]: I0320 08:43:43.843751 4903 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2064f5b2-4d54-41b2-a04d-bc35b586cb3c-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 08:43:43 crc kubenswrapper[4903]: I0320 08:43:43.843778 4903 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2064f5b2-4d54-41b2-a04d-bc35b586cb3c-config\") on node \"crc\" DevicePath \"\"" Mar 20 08:43:43 crc kubenswrapper[4903]: I0320 08:43:43.853558 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2064f5b2-4d54-41b2-a04d-bc35b586cb3c-kube-api-access-4cwch" (OuterVolumeSpecName: "kube-api-access-4cwch") pod "2064f5b2-4d54-41b2-a04d-bc35b586cb3c" (UID: "2064f5b2-4d54-41b2-a04d-bc35b586cb3c"). InnerVolumeSpecName "kube-api-access-4cwch". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:43:43 crc kubenswrapper[4903]: I0320 08:43:43.944131 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/31d47b29-48d4-4d99-83cd-91e4a383b108-config\") pod \"31d47b29-48d4-4d99-83cd-91e4a383b108\" (UID: \"31d47b29-48d4-4d99-83cd-91e4a383b108\") " Mar 20 08:43:43 crc kubenswrapper[4903]: I0320 08:43:43.944256 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v2p77\" (UniqueName: \"kubernetes.io/projected/31d47b29-48d4-4d99-83cd-91e4a383b108-kube-api-access-v2p77\") pod \"31d47b29-48d4-4d99-83cd-91e4a383b108\" (UID: \"31d47b29-48d4-4d99-83cd-91e4a383b108\") " Mar 20 08:43:43 crc kubenswrapper[4903]: I0320 08:43:43.944578 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4cwch\" (UniqueName: \"kubernetes.io/projected/2064f5b2-4d54-41b2-a04d-bc35b586cb3c-kube-api-access-4cwch\") on node \"crc\" DevicePath \"\"" Mar 20 08:43:43 crc kubenswrapper[4903]: I0320 08:43:43.946158 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d47b29-48d4-4d99-83cd-91e4a383b108-config" (OuterVolumeSpecName: "config") pod "31d47b29-48d4-4d99-83cd-91e4a383b108" (UID: "31d47b29-48d4-4d99-83cd-91e4a383b108"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:43:43 crc kubenswrapper[4903]: I0320 08:43:43.952184 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d47b29-48d4-4d99-83cd-91e4a383b108-kube-api-access-v2p77" (OuterVolumeSpecName: "kube-api-access-v2p77") pod "31d47b29-48d4-4d99-83cd-91e4a383b108" (UID: "31d47b29-48d4-4d99-83cd-91e4a383b108"). InnerVolumeSpecName "kube-api-access-v2p77". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:43:44 crc kubenswrapper[4903]: I0320 08:43:44.046774 4903 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/31d47b29-48d4-4d99-83cd-91e4a383b108-config\") on node \"crc\" DevicePath \"\"" Mar 20 08:43:44 crc kubenswrapper[4903]: I0320 08:43:44.046812 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v2p77\" (UniqueName: \"kubernetes.io/projected/31d47b29-48d4-4d99-83cd-91e4a383b108-kube-api-access-v2p77\") on node \"crc\" DevicePath \"\"" Mar 20 08:43:44 crc kubenswrapper[4903]: I0320 08:43:44.207681 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"34de9984-0547-4ba1-ae7d-5f8cc9196c26","Type":"ContainerStarted","Data":"2c4393d7e307ba72b53c3157d0aaef61008e16bbb978c185a6b2752dec453422"} Mar 20 08:43:44 crc kubenswrapper[4903]: I0320 08:43:44.210910 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"6e4027bc-3929-4b8b-9538-ab67f779558c","Type":"ContainerStarted","Data":"fa3b33570168c7b6cfde01bb87b4f92251ca9fed546cc635ba6b71bd9474b4f9"} Mar 20 08:43:44 crc kubenswrapper[4903]: I0320 08:43:44.216116 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-59q7v" event={"ID":"2064f5b2-4d54-41b2-a04d-bc35b586cb3c","Type":"ContainerDied","Data":"d9b393449f6f367fd66f79089c7c2eda51750ed53d7c08bf1c64742743e8aa8b"} Mar 20 08:43:44 crc kubenswrapper[4903]: I0320 08:43:44.216259 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-59q7v" Mar 20 08:43:44 crc kubenswrapper[4903]: I0320 08:43:44.218367 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-wdtrn" event={"ID":"7bbbd0a7-f915-4197-bde8-4f96590c454f","Type":"ContainerStarted","Data":"d4340d060ecd66c00269d5a245a982ed785b2362f4a3fc754d476cc61ccbc868"} Mar 20 08:43:44 crc kubenswrapper[4903]: I0320 08:43:44.220061 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-jntgm" event={"ID":"31d47b29-48d4-4d99-83cd-91e4a383b108","Type":"ContainerDied","Data":"8f15fd9869489ff68ee5a3753b44f6ce99f1cbc8b027560e79839150efb20498"} Mar 20 08:43:44 crc kubenswrapper[4903]: I0320 08:43:44.220153 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-jntgm" Mar 20 08:43:44 crc kubenswrapper[4903]: I0320 08:43:44.221792 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"548096cf-0b33-4f2f-b8be-7d1ac859cf7c","Type":"ContainerStarted","Data":"d40cb7084592ee171758abb523548af0c26165735b34deee1818646086e6b1c6"} Mar 20 08:43:44 crc kubenswrapper[4903]: I0320 08:43:44.324550 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-59q7v"] Mar 20 08:43:44 crc kubenswrapper[4903]: I0320 08:43:44.336751 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-59q7v"] Mar 20 08:43:44 crc kubenswrapper[4903]: I0320 08:43:44.352589 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-jntgm"] Mar 20 08:43:44 crc kubenswrapper[4903]: I0320 08:43:44.361931 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-jntgm"] Mar 20 08:43:44 crc kubenswrapper[4903]: I0320 08:43:44.469916 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-chrhv"] Mar 20 08:43:44 crc kubenswrapper[4903]: I0320 08:43:44.570897 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 20 08:43:44 crc kubenswrapper[4903]: W0320 08:43:44.726356 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8639b665_721c_4dda_afe9_6e84f6f8a574.slice/crio-505d0191451c5ecb3a6db6fcddf61679951a2efd3fa88075c72fddeb5d91d8dd WatchSource:0}: Error finding container 505d0191451c5ecb3a6db6fcddf61679951a2efd3fa88075c72fddeb5d91d8dd: Status 404 returned error can't find the container with id 505d0191451c5ecb3a6db6fcddf61679951a2efd3fa88075c72fddeb5d91d8dd Mar 20 08:43:44 crc kubenswrapper[4903]: W0320 08:43:44.732570 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd69915e4_0df8_4d83_b096_962eadc1883f.slice/crio-296a07fd3728917cdd1ba15c6f90eb1426775d773d30351e992214c75ce92029 WatchSource:0}: Error finding container 296a07fd3728917cdd1ba15c6f90eb1426775d773d30351e992214c75ce92029: Status 404 returned error can't find the container with id 296a07fd3728917cdd1ba15c6f90eb1426775d773d30351e992214c75ce92029 Mar 20 08:43:45 crc kubenswrapper[4903]: I0320 08:43:45.242426 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-chrhv" event={"ID":"d69915e4-0df8-4d83-b096-962eadc1883f","Type":"ContainerStarted","Data":"296a07fd3728917cdd1ba15c6f90eb1426775d773d30351e992214c75ce92029"} Mar 20 08:43:45 crc kubenswrapper[4903]: I0320 08:43:45.244326 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"8639b665-721c-4dda-afe9-6e84f6f8a574","Type":"ContainerStarted","Data":"505d0191451c5ecb3a6db6fcddf61679951a2efd3fa88075c72fddeb5d91d8dd"} Mar 20 08:43:45 crc kubenswrapper[4903]: I0320 08:43:45.504910 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2064f5b2-4d54-41b2-a04d-bc35b586cb3c" path="/var/lib/kubelet/pods/2064f5b2-4d54-41b2-a04d-bc35b586cb3c/volumes" Mar 20 08:43:45 crc kubenswrapper[4903]: I0320 08:43:45.506421 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d47b29-48d4-4d99-83cd-91e4a383b108" path="/var/lib/kubelet/pods/31d47b29-48d4-4d99-83cd-91e4a383b108/volumes" Mar 20 08:43:48 crc 
kubenswrapper[4903]: I0320 08:43:48.279305 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"6e4027bc-3929-4b8b-9538-ab67f779558c","Type":"ContainerStarted","Data":"35a21656eef5a0292c24fe38368da2669464d638035c6919ad8e6d647c9b4b2f"} Mar 20 08:43:49 crc kubenswrapper[4903]: I0320 08:43:49.289778 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"548096cf-0b33-4f2f-b8be-7d1ac859cf7c","Type":"ContainerStarted","Data":"f3da15fbb37c4bc9e79ae4139cd2ae68204d101f6f4da2f8431701cc743f6de0"} Mar 20 08:43:49 crc kubenswrapper[4903]: I0320 08:43:49.291613 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"34de9984-0547-4ba1-ae7d-5f8cc9196c26","Type":"ContainerStarted","Data":"b0e941c6eb837ac5cb4a6c3fdea1d2d35578b2c2d7f9749cd5b2c8e0e7710ff8"} Mar 20 08:43:49 crc kubenswrapper[4903]: I0320 08:43:49.291787 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Mar 20 08:43:49 crc kubenswrapper[4903]: I0320 08:43:49.293305 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"c68a4ea3-7086-430c-8a78-a4c06ed7280c","Type":"ContainerStarted","Data":"0a9e3dfbd5e32190c5719e3a6f48141efd8d572d7580de2fe86b186aa11f1165"} Mar 20 08:43:49 crc kubenswrapper[4903]: I0320 08:43:49.293444 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Mar 20 08:43:49 crc kubenswrapper[4903]: I0320 08:43:49.296290 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-wdtrn" event={"ID":"7bbbd0a7-f915-4197-bde8-4f96590c454f","Type":"ContainerStarted","Data":"2c58468caf97e64984d8d14d363823a209ce66a3d49d8180701d12dee64c3d5f"} Mar 20 08:43:49 crc kubenswrapper[4903]: I0320 08:43:49.296447 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-wdtrn" Mar 20 08:43:49 crc kubenswrapper[4903]: I0320 08:43:49.298311 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-chrhv" event={"ID":"d69915e4-0df8-4d83-b096-962eadc1883f","Type":"ContainerStarted","Data":"ef6aa1fdb1f8f9b74915a49b26f2800fbc354505df15acabc86a9628fcc4c35f"} Mar 20 08:43:49 crc kubenswrapper[4903]: I0320 08:43:49.300659 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"8639b665-721c-4dda-afe9-6e84f6f8a574","Type":"ContainerStarted","Data":"81809a955a2f71afa7a77cb91e2146e2f4c354fdfdadbb9e1e6afb608b8a1b6a"} Mar 20 08:43:49 crc kubenswrapper[4903]: I0320 08:43:49.303430 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"96a68183-d440-4f89-887d-d2441d00c8e4","Type":"ContainerStarted","Data":"e871ff9680c97fedf425745e453d486d14a0134e064de88d4aabcbf21bd800fd"} Mar 20 08:43:49 crc kubenswrapper[4903]: I0320 08:43:49.319699 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=25.225404226 podStartE2EDuration="30.319673623s" podCreationTimestamp="2026-03-20 08:43:19 +0000 UTC" firstStartedPulling="2026-03-20 08:43:43.526231123 +0000 UTC m=+1248.743131438" lastFinishedPulling="2026-03-20 08:43:48.62050052 +0000 UTC m=+1253.837400835" observedRunningTime="2026-03-20 08:43:49.307053747 +0000 UTC m=+1254.523954072" watchObservedRunningTime="2026-03-20 08:43:49.319673623 +0000 UTC m=+1254.536573938" Mar 20 08:43:49 crc 
kubenswrapper[4903]: I0320 08:43:49.338639 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-wdtrn" podStartSLOduration=18.361989096 podStartE2EDuration="23.338615811s" podCreationTimestamp="2026-03-20 08:43:26 +0000 UTC" firstStartedPulling="2026-03-20 08:43:43.736266045 +0000 UTC m=+1248.953166360" lastFinishedPulling="2026-03-20 08:43:48.71289272 +0000 UTC m=+1253.929793075" observedRunningTime="2026-03-20 08:43:49.334357728 +0000 UTC m=+1254.551258043" watchObservedRunningTime="2026-03-20 08:43:49.338615811 +0000 UTC m=+1254.555516126" Mar 20 08:43:49 crc kubenswrapper[4903]: I0320 08:43:49.360140 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=22.491242534 podStartE2EDuration="28.360118813s" podCreationTimestamp="2026-03-20 08:43:21 +0000 UTC" firstStartedPulling="2026-03-20 08:43:42.824723244 +0000 UTC m=+1248.041623559" lastFinishedPulling="2026-03-20 08:43:48.693599483 +0000 UTC m=+1253.910499838" observedRunningTime="2026-03-20 08:43:49.352500078 +0000 UTC m=+1254.569400393" watchObservedRunningTime="2026-03-20 08:43:49.360118813 +0000 UTC m=+1254.577019138" Mar 20 08:43:50 crc kubenswrapper[4903]: I0320 08:43:50.315274 4903 generic.go:334] "Generic (PLEG): container finished" podID="d69915e4-0df8-4d83-b096-962eadc1883f" containerID="ef6aa1fdb1f8f9b74915a49b26f2800fbc354505df15acabc86a9628fcc4c35f" exitCode=0 Mar 20 08:43:50 crc kubenswrapper[4903]: I0320 08:43:50.315356 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-chrhv" event={"ID":"d69915e4-0df8-4d83-b096-962eadc1883f","Type":"ContainerDied","Data":"ef6aa1fdb1f8f9b74915a49b26f2800fbc354505df15acabc86a9628fcc4c35f"} Mar 20 08:43:51 crc kubenswrapper[4903]: I0320 08:43:51.342746 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-chrhv" event={"ID":"d69915e4-0df8-4d83-b096-962eadc1883f","Type":"ContainerStarted","Data":"ec70b5d9df40c275990e6b10b611cf7497dcf53a4d72ca11805f6bf11c5bc677"} Mar 20 08:43:52 crc kubenswrapper[4903]: I0320 08:43:52.354779 4903 generic.go:334] "Generic (PLEG): container finished" podID="6e4027bc-3929-4b8b-9538-ab67f779558c" containerID="35a21656eef5a0292c24fe38368da2669464d638035c6919ad8e6d647c9b4b2f" exitCode=0 Mar 20 08:43:52 crc kubenswrapper[4903]: I0320 08:43:52.354912 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"6e4027bc-3929-4b8b-9538-ab67f779558c","Type":"ContainerDied","Data":"35a21656eef5a0292c24fe38368da2669464d638035c6919ad8e6d647c9b4b2f"} Mar 20 08:43:52 crc kubenswrapper[4903]: I0320 08:43:52.358710 4903 generic.go:334] "Generic (PLEG): container finished" podID="96a68183-d440-4f89-887d-d2441d00c8e4" containerID="e871ff9680c97fedf425745e453d486d14a0134e064de88d4aabcbf21bd800fd" exitCode=0 Mar 20 08:43:52 crc kubenswrapper[4903]: I0320 08:43:52.358764 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"96a68183-d440-4f89-887d-d2441d00c8e4","Type":"ContainerDied","Data":"e871ff9680c97fedf425745e453d486d14a0134e064de88d4aabcbf21bd800fd"} Mar 20 08:43:53 crc kubenswrapper[4903]: I0320 08:43:53.369329 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"548096cf-0b33-4f2f-b8be-7d1ac859cf7c","Type":"ContainerStarted","Data":"a4feda7340d51a69a0ade15a8a02097cea235438a9973df73914b9990363b816"} Mar 20 08:43:53 crc 
kubenswrapper[4903]: I0320 08:43:53.372412 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"6e4027bc-3929-4b8b-9538-ab67f779558c","Type":"ContainerStarted","Data":"fafa3d8b5242ec379ddf7f9373ebb300af0123e85fab400fdab43665a984905a"} Mar 20 08:43:53 crc kubenswrapper[4903]: I0320 08:43:53.375901 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-chrhv" event={"ID":"d69915e4-0df8-4d83-b096-962eadc1883f","Type":"ContainerStarted","Data":"cbe32b8386815ecd924f4abbb568339471c7efea68305c548d14baa5ac7f2324"} Mar 20 08:43:53 crc kubenswrapper[4903]: I0320 08:43:53.376052 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-chrhv" Mar 20 08:43:53 crc kubenswrapper[4903]: I0320 08:43:53.376109 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-chrhv" Mar 20 08:43:53 crc kubenswrapper[4903]: I0320 08:43:53.377708 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"8639b665-721c-4dda-afe9-6e84f6f8a574","Type":"ContainerStarted","Data":"2c5aaed35251d97c58eb1ac9b7cc60388845eb3364a0b5ed8ae63369e67595d8"} Mar 20 08:43:53 crc kubenswrapper[4903]: I0320 08:43:53.379677 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"96a68183-d440-4f89-887d-d2441d00c8e4","Type":"ContainerStarted","Data":"b1d064dc3009f3f7fdd7eea64f3901b6af862ba55d21798e211d4e04f25facd9"} Mar 20 08:43:53 crc kubenswrapper[4903]: I0320 08:43:53.399243 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=17.3073044 podStartE2EDuration="26.399220646s" podCreationTimestamp="2026-03-20 08:43:27 +0000 UTC" firstStartedPulling="2026-03-20 08:43:43.736672205 +0000 UTC m=+1248.953572520" lastFinishedPulling="2026-03-20 08:43:52.828588461 +0000 UTC m=+1258.045488766" observedRunningTime="2026-03-20 08:43:53.396673035 +0000 UTC m=+1258.613573370" watchObservedRunningTime="2026-03-20 08:43:53.399220646 +0000 UTC m=+1258.616120971" Mar 20 08:43:53 crc kubenswrapper[4903]: I0320 08:43:53.427085 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=21.370038668 podStartE2EDuration="29.427063041s" podCreationTimestamp="2026-03-20 08:43:24 +0000 UTC" firstStartedPulling="2026-03-20 08:43:44.730709487 +0000 UTC m=+1249.947609802" lastFinishedPulling="2026-03-20 08:43:52.78773386 +0000 UTC m=+1258.004634175" observedRunningTime="2026-03-20 08:43:53.424605132 +0000 UTC m=+1258.641505437" watchObservedRunningTime="2026-03-20 08:43:53.427063041 +0000 UTC m=+1258.643963346" Mar 20 08:43:53 crc kubenswrapper[4903]: I0320 08:43:53.460243 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=-9223371999.394556 podStartE2EDuration="37.460220045s" podCreationTimestamp="2026-03-20 08:43:16 +0000 UTC" firstStartedPulling="2026-03-20 08:43:18.472073866 +0000 UTC m=+1223.688974181" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:43:53.453664386 +0000 UTC m=+1258.670564711" watchObservedRunningTime="2026-03-20 08:43:53.460220045 +0000 UTC m=+1258.677120370" Mar 20 08:43:53 crc kubenswrapper[4903]: I0320 08:43:53.488387 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/openstack-cell1-galera-0" podStartSLOduration=35.254737197 podStartE2EDuration="36.488368478s" podCreationTimestamp="2026-03-20 08:43:17 +0000 UTC" firstStartedPulling="2026-03-20 08:43:43.35990793 +0000 UTC m=+1248.576808245" lastFinishedPulling="2026-03-20 08:43:44.593539211 +0000 UTC m=+1249.810439526" observedRunningTime="2026-03-20 08:43:53.48558071 +0000 UTC m=+1258.702481035" watchObservedRunningTime="2026-03-20 08:43:53.488368478 +0000 UTC m=+1258.705268793" Mar 20 08:43:53 crc kubenswrapper[4903]: I0320 08:43:53.515066 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-chrhv" podStartSLOduration=23.543911299 podStartE2EDuration="27.515026774s" podCreationTimestamp="2026-03-20 08:43:26 +0000 UTC" firstStartedPulling="2026-03-20 08:43:44.741764285 +0000 UTC m=+1249.958664600" lastFinishedPulling="2026-03-20 08:43:48.71287977 +0000 UTC m=+1253.929780075" observedRunningTime="2026-03-20 08:43:53.511854487 +0000 UTC m=+1258.728754832" watchObservedRunningTime="2026-03-20 08:43:53.515026774 +0000 UTC m=+1258.731927079" Mar 20 08:43:53 crc kubenswrapper[4903]: I0320 08:43:53.685945 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Mar 20 08:43:54 crc kubenswrapper[4903]: I0320 08:43:54.603265 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Mar 20 08:43:55 crc kubenswrapper[4903]: I0320 08:43:55.398966 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"888a3fd9-01f8-47b3-b1bb-f2b8b6b96509","Type":"ContainerStarted","Data":"79119e778548ae72a178bf5caebc6d4cbad9e6b178a28bc331be14a3707c0b31"} Mar 20 08:43:55 crc kubenswrapper[4903]: I0320 08:43:55.402308 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"df937948-08c4-447c-9450-07221ce76552","Type":"ContainerStarted","Data":"fe57b0018bbffb7366eaf34a9f7b2d185d56311f1f577d783cba8ee7a58367b9"} Mar 20 08:43:55 crc kubenswrapper[4903]: I0320 08:43:55.685122 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Mar 20 08:43:55 crc kubenswrapper[4903]: I0320 08:43:55.718821 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Mar 20 08:43:56 crc kubenswrapper[4903]: I0320 08:43:56.044945 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Mar 20 08:43:56 crc kubenswrapper[4903]: I0320 08:43:56.045045 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Mar 20 08:43:56 crc kubenswrapper[4903]: I0320 08:43:56.117680 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Mar 20 08:43:56 crc kubenswrapper[4903]: I0320 08:43:56.410691 4903 generic.go:334] "Generic (PLEG): container finished" podID="4b2ae24d-7a7f-450d-b9f0-29773070bfba" containerID="84fa0f44348f75975a81384033b6a645364aaae3c425fdc53caea28c8b3bef2e" exitCode=0 Mar 20 08:43:56 crc kubenswrapper[4903]: I0320 08:43:56.410755 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-s7lkc" event={"ID":"4b2ae24d-7a7f-450d-b9f0-29773070bfba","Type":"ContainerDied","Data":"84fa0f44348f75975a81384033b6a645364aaae3c425fdc53caea28c8b3bef2e"} Mar 20 08:43:56 crc kubenswrapper[4903]: I0320 08:43:56.413146 4903 generic.go:334] 
"Generic (PLEG): container finished" podID="57f5e312-d8c8-420d-8655-8ace1519bdda" containerID="5162b7b488fca429f42fa4bdaef26465012b9df7e4f3d77b2ea3bfd3abba483b" exitCode=0 Mar 20 08:43:56 crc kubenswrapper[4903]: I0320 08:43:56.413232 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-7wjsx" event={"ID":"57f5e312-d8c8-420d-8655-8ace1519bdda","Type":"ContainerDied","Data":"5162b7b488fca429f42fa4bdaef26465012b9df7e4f3d77b2ea3bfd3abba483b"} Mar 20 08:43:56 crc kubenswrapper[4903]: I0320 08:43:56.488659 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Mar 20 08:43:56 crc kubenswrapper[4903]: I0320 08:43:56.538733 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Mar 20 08:43:56 crc kubenswrapper[4903]: I0320 08:43:56.789479 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-s7lkc"] Mar 20 08:43:56 crc kubenswrapper[4903]: I0320 08:43:56.816158 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-rjxxk"] Mar 20 08:43:56 crc kubenswrapper[4903]: I0320 08:43:56.817711 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-rjxxk" Mar 20 08:43:56 crc kubenswrapper[4903]: I0320 08:43:56.820785 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Mar 20 08:43:56 crc kubenswrapper[4903]: I0320 08:43:56.839493 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-rjxxk"] Mar 20 08:43:56 crc kubenswrapper[4903]: I0320 08:43:56.899774 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-ch9dc"] Mar 20 08:43:56 crc kubenswrapper[4903]: I0320 08:43:56.901000 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-ch9dc" Mar 20 08:43:56 crc kubenswrapper[4903]: I0320 08:43:56.902708 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9e171236-e1e4-41f8-bf89-d162cab4d02b-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-rjxxk\" (UID: \"9e171236-e1e4-41f8-bf89-d162cab4d02b\") " pod="openstack/dnsmasq-dns-7fd796d7df-rjxxk" Mar 20 08:43:56 crc kubenswrapper[4903]: I0320 08:43:56.902768 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9e171236-e1e4-41f8-bf89-d162cab4d02b-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-rjxxk\" (UID: \"9e171236-e1e4-41f8-bf89-d162cab4d02b\") " pod="openstack/dnsmasq-dns-7fd796d7df-rjxxk" Mar 20 08:43:56 crc kubenswrapper[4903]: I0320 08:43:56.902794 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9e171236-e1e4-41f8-bf89-d162cab4d02b-config\") pod \"dnsmasq-dns-7fd796d7df-rjxxk\" (UID: \"9e171236-e1e4-41f8-bf89-d162cab4d02b\") " pod="openstack/dnsmasq-dns-7fd796d7df-rjxxk" Mar 20 08:43:56 crc kubenswrapper[4903]: I0320 08:43:56.902839 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4jcqb\" (UniqueName: \"kubernetes.io/projected/9e171236-e1e4-41f8-bf89-d162cab4d02b-kube-api-access-4jcqb\") pod \"dnsmasq-dns-7fd796d7df-rjxxk\" (UID: \"9e171236-e1e4-41f8-bf89-d162cab4d02b\") " pod="openstack/dnsmasq-dns-7fd796d7df-rjxxk" Mar 20 08:43:56 crc kubenswrapper[4903]: I0320 08:43:56.903369 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Mar 20 08:43:56 crc kubenswrapper[4903]: I0320 08:43:56.916799 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-ch9dc"] Mar 20 08:43:56 crc kubenswrapper[4903]: I0320 08:43:56.995368 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-7wjsx"] Mar 20 08:43:57 crc kubenswrapper[4903]: I0320 08:43:57.003932 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f390d60f-9967-4869-b09f-3cea4570186e-combined-ca-bundle\") pod \"ovn-controller-metrics-ch9dc\" (UID: \"f390d60f-9967-4869-b09f-3cea4570186e\") " pod="openstack/ovn-controller-metrics-ch9dc" Mar 20 08:43:57 crc kubenswrapper[4903]: I0320 08:43:57.004004 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/f390d60f-9967-4869-b09f-3cea4570186e-ovs-rundir\") pod \"ovn-controller-metrics-ch9dc\" (UID: \"f390d60f-9967-4869-b09f-3cea4570186e\") " pod="openstack/ovn-controller-metrics-ch9dc" Mar 20 08:43:57 crc kubenswrapper[4903]: I0320 08:43:57.004048 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4jcqb\" (UniqueName: \"kubernetes.io/projected/9e171236-e1e4-41f8-bf89-d162cab4d02b-kube-api-access-4jcqb\") pod \"dnsmasq-dns-7fd796d7df-rjxxk\" (UID: \"9e171236-e1e4-41f8-bf89-d162cab4d02b\") " pod="openstack/dnsmasq-dns-7fd796d7df-rjxxk" Mar 20 08:43:57 crc kubenswrapper[4903]: I0320 08:43:57.004423 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/f390d60f-9967-4869-b09f-3cea4570186e-ovn-rundir\") pod \"ovn-controller-metrics-ch9dc\" (UID: \"f390d60f-9967-4869-b09f-3cea4570186e\") " pod="openstack/ovn-controller-metrics-ch9dc" Mar 20 08:43:57 crc kubenswrapper[4903]: I0320 08:43:57.004519 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dvdx8\" (UniqueName: \"kubernetes.io/projected/f390d60f-9967-4869-b09f-3cea4570186e-kube-api-access-dvdx8\") pod \"ovn-controller-metrics-ch9dc\" (UID: \"f390d60f-9967-4869-b09f-3cea4570186e\") " pod="openstack/ovn-controller-metrics-ch9dc" Mar 20 08:43:57 crc kubenswrapper[4903]: I0320 08:43:57.004616 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9e171236-e1e4-41f8-bf89-d162cab4d02b-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-rjxxk\" (UID: \"9e171236-e1e4-41f8-bf89-d162cab4d02b\") " pod="openstack/dnsmasq-dns-7fd796d7df-rjxxk" Mar 20 08:43:57 crc kubenswrapper[4903]: I0320 08:43:57.004654 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9e171236-e1e4-41f8-bf89-d162cab4d02b-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-rjxxk\" (UID: \"9e171236-e1e4-41f8-bf89-d162cab4d02b\") " pod="openstack/dnsmasq-dns-7fd796d7df-rjxxk" Mar 20 08:43:57 crc kubenswrapper[4903]: I0320 08:43:57.004674 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f390d60f-9967-4869-b09f-3cea4570186e-config\") pod \"ovn-controller-metrics-ch9dc\" (UID: \"f390d60f-9967-4869-b09f-3cea4570186e\") " pod="openstack/ovn-controller-metrics-ch9dc" Mar 20 08:43:57 crc kubenswrapper[4903]: I0320 08:43:57.004702 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9e171236-e1e4-41f8-bf89-d162cab4d02b-config\") pod \"dnsmasq-dns-7fd796d7df-rjxxk\" (UID: \"9e171236-e1e4-41f8-bf89-d162cab4d02b\") " pod="openstack/dnsmasq-dns-7fd796d7df-rjxxk" Mar 20 08:43:57 crc kubenswrapper[4903]: I0320 08:43:57.004733 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/f390d60f-9967-4869-b09f-3cea4570186e-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-ch9dc\" (UID: \"f390d60f-9967-4869-b09f-3cea4570186e\") " pod="openstack/ovn-controller-metrics-ch9dc" Mar 20 08:43:57 crc kubenswrapper[4903]: I0320 08:43:57.005555 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9e171236-e1e4-41f8-bf89-d162cab4d02b-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-rjxxk\" (UID: \"9e171236-e1e4-41f8-bf89-d162cab4d02b\") " pod="openstack/dnsmasq-dns-7fd796d7df-rjxxk" Mar 20 08:43:57 crc kubenswrapper[4903]: I0320 08:43:57.011010 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9e171236-e1e4-41f8-bf89-d162cab4d02b-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-rjxxk\" (UID: \"9e171236-e1e4-41f8-bf89-d162cab4d02b\") " pod="openstack/dnsmasq-dns-7fd796d7df-rjxxk" Mar 20 08:43:57 crc kubenswrapper[4903]: I0320 08:43:57.011194 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/9e171236-e1e4-41f8-bf89-d162cab4d02b-config\") pod \"dnsmasq-dns-7fd796d7df-rjxxk\" (UID: \"9e171236-e1e4-41f8-bf89-d162cab4d02b\") " pod="openstack/dnsmasq-dns-7fd796d7df-rjxxk" Mar 20 08:43:57 crc kubenswrapper[4903]: I0320 08:43:57.035375 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-8ckgn"] Mar 20 08:43:57 crc kubenswrapper[4903]: I0320 08:43:57.038773 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4jcqb\" (UniqueName: \"kubernetes.io/projected/9e171236-e1e4-41f8-bf89-d162cab4d02b-kube-api-access-4jcqb\") pod \"dnsmasq-dns-7fd796d7df-rjxxk\" (UID: \"9e171236-e1e4-41f8-bf89-d162cab4d02b\") " pod="openstack/dnsmasq-dns-7fd796d7df-rjxxk" Mar 20 08:43:57 crc kubenswrapper[4903]: I0320 08:43:57.054818 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-8ckgn" Mar 20 08:43:57 crc kubenswrapper[4903]: I0320 08:43:57.058688 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-8ckgn"] Mar 20 08:43:57 crc kubenswrapper[4903]: I0320 08:43:57.062631 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Mar 20 08:43:57 crc kubenswrapper[4903]: I0320 08:43:57.108743 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/f390d60f-9967-4869-b09f-3cea4570186e-ovn-rundir\") pod \"ovn-controller-metrics-ch9dc\" (UID: \"f390d60f-9967-4869-b09f-3cea4570186e\") " pod="openstack/ovn-controller-metrics-ch9dc" Mar 20 08:43:57 crc kubenswrapper[4903]: I0320 08:43:57.108821 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dvdx8\" (UniqueName: \"kubernetes.io/projected/f390d60f-9967-4869-b09f-3cea4570186e-kube-api-access-dvdx8\") pod \"ovn-controller-metrics-ch9dc\" (UID: \"f390d60f-9967-4869-b09f-3cea4570186e\") " pod="openstack/ovn-controller-metrics-ch9dc" Mar 20 08:43:57 crc kubenswrapper[4903]: I0320 08:43:57.108856 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f390d60f-9967-4869-b09f-3cea4570186e-config\") pod \"ovn-controller-metrics-ch9dc\" (UID: \"f390d60f-9967-4869-b09f-3cea4570186e\") " pod="openstack/ovn-controller-metrics-ch9dc" Mar 20 08:43:57 crc kubenswrapper[4903]: I0320 08:43:57.108913 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/f390d60f-9967-4869-b09f-3cea4570186e-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-ch9dc\" (UID: \"f390d60f-9967-4869-b09f-3cea4570186e\") " pod="openstack/ovn-controller-metrics-ch9dc" Mar 20 08:43:57 crc kubenswrapper[4903]: I0320 08:43:57.108939 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f390d60f-9967-4869-b09f-3cea4570186e-combined-ca-bundle\") pod \"ovn-controller-metrics-ch9dc\" (UID: \"f390d60f-9967-4869-b09f-3cea4570186e\") " pod="openstack/ovn-controller-metrics-ch9dc" Mar 20 08:43:57 crc kubenswrapper[4903]: I0320 08:43:57.108987 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/f390d60f-9967-4869-b09f-3cea4570186e-ovs-rundir\") pod \"ovn-controller-metrics-ch9dc\" (UID: 
\"f390d60f-9967-4869-b09f-3cea4570186e\") " pod="openstack/ovn-controller-metrics-ch9dc" Mar 20 08:43:57 crc kubenswrapper[4903]: I0320 08:43:57.109388 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/f390d60f-9967-4869-b09f-3cea4570186e-ovs-rundir\") pod \"ovn-controller-metrics-ch9dc\" (UID: \"f390d60f-9967-4869-b09f-3cea4570186e\") " pod="openstack/ovn-controller-metrics-ch9dc" Mar 20 08:43:57 crc kubenswrapper[4903]: I0320 08:43:57.109466 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/f390d60f-9967-4869-b09f-3cea4570186e-ovn-rundir\") pod \"ovn-controller-metrics-ch9dc\" (UID: \"f390d60f-9967-4869-b09f-3cea4570186e\") " pod="openstack/ovn-controller-metrics-ch9dc" Mar 20 08:43:57 crc kubenswrapper[4903]: I0320 08:43:57.111306 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f390d60f-9967-4869-b09f-3cea4570186e-config\") pod \"ovn-controller-metrics-ch9dc\" (UID: \"f390d60f-9967-4869-b09f-3cea4570186e\") " pod="openstack/ovn-controller-metrics-ch9dc" Mar 20 08:43:57 crc kubenswrapper[4903]: I0320 08:43:57.111413 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Mar 20 08:43:57 crc kubenswrapper[4903]: I0320 08:43:57.112843 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Mar 20 08:43:57 crc kubenswrapper[4903]: I0320 08:43:57.121060 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Mar 20 08:43:57 crc kubenswrapper[4903]: I0320 08:43:57.121319 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Mar 20 08:43:57 crc kubenswrapper[4903]: I0320 08:43:57.121555 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Mar 20 08:43:57 crc kubenswrapper[4903]: I0320 08:43:57.121679 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-jlp8x" Mar 20 08:43:57 crc kubenswrapper[4903]: I0320 08:43:57.134811 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/f390d60f-9967-4869-b09f-3cea4570186e-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-ch9dc\" (UID: \"f390d60f-9967-4869-b09f-3cea4570186e\") " pod="openstack/ovn-controller-metrics-ch9dc" Mar 20 08:43:57 crc kubenswrapper[4903]: I0320 08:43:57.135915 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f390d60f-9967-4869-b09f-3cea4570186e-combined-ca-bundle\") pod \"ovn-controller-metrics-ch9dc\" (UID: \"f390d60f-9967-4869-b09f-3cea4570186e\") " pod="openstack/ovn-controller-metrics-ch9dc" Mar 20 08:43:57 crc kubenswrapper[4903]: I0320 08:43:57.138180 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-rjxxk" Mar 20 08:43:57 crc kubenswrapper[4903]: I0320 08:43:57.140522 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Mar 20 08:43:57 crc kubenswrapper[4903]: I0320 08:43:57.148829 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dvdx8\" (UniqueName: \"kubernetes.io/projected/f390d60f-9967-4869-b09f-3cea4570186e-kube-api-access-dvdx8\") pod \"ovn-controller-metrics-ch9dc\" (UID: \"f390d60f-9967-4869-b09f-3cea4570186e\") " pod="openstack/ovn-controller-metrics-ch9dc" Mar 20 08:43:57 crc kubenswrapper[4903]: I0320 08:43:57.213190 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/06e43a58-7c5f-49ac-a1f4-f2eddfd28c6b-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"06e43a58-7c5f-49ac-a1f4-f2eddfd28c6b\") " pod="openstack/ovn-northd-0" Mar 20 08:43:57 crc kubenswrapper[4903]: I0320 08:43:57.213276 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dm6kz\" (UniqueName: \"kubernetes.io/projected/23a7dc3f-9bad-4898-82c7-203ddf385577-kube-api-access-dm6kz\") pod \"dnsmasq-dns-86db49b7ff-8ckgn\" (UID: \"23a7dc3f-9bad-4898-82c7-203ddf385577\") " pod="openstack/dnsmasq-dns-86db49b7ff-8ckgn" Mar 20 08:43:57 crc kubenswrapper[4903]: I0320 08:43:57.213332 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/06e43a58-7c5f-49ac-a1f4-f2eddfd28c6b-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"06e43a58-7c5f-49ac-a1f4-f2eddfd28c6b\") " pod="openstack/ovn-northd-0" Mar 20 08:43:57 crc kubenswrapper[4903]: I0320 08:43:57.213358 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/23a7dc3f-9bad-4898-82c7-203ddf385577-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-8ckgn\" (UID: \"23a7dc3f-9bad-4898-82c7-203ddf385577\") " pod="openstack/dnsmasq-dns-86db49b7ff-8ckgn" Mar 20 08:43:57 crc kubenswrapper[4903]: I0320 08:43:57.213405 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/23a7dc3f-9bad-4898-82c7-203ddf385577-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-8ckgn\" (UID: \"23a7dc3f-9bad-4898-82c7-203ddf385577\") " pod="openstack/dnsmasq-dns-86db49b7ff-8ckgn" Mar 20 08:43:57 crc kubenswrapper[4903]: I0320 08:43:57.213428 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06e43a58-7c5f-49ac-a1f4-f2eddfd28c6b-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"06e43a58-7c5f-49ac-a1f4-f2eddfd28c6b\") " pod="openstack/ovn-northd-0" Mar 20 08:43:57 crc kubenswrapper[4903]: I0320 08:43:57.213453 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/06e43a58-7c5f-49ac-a1f4-f2eddfd28c6b-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"06e43a58-7c5f-49ac-a1f4-f2eddfd28c6b\") " pod="openstack/ovn-northd-0" Mar 20 08:43:57 crc kubenswrapper[4903]: I0320 08:43:57.213486 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/06e43a58-7c5f-49ac-a1f4-f2eddfd28c6b-scripts\") pod \"ovn-northd-0\" (UID: \"06e43a58-7c5f-49ac-a1f4-f2eddfd28c6b\") " pod="openstack/ovn-northd-0" Mar 20 08:43:57 crc kubenswrapper[4903]: I0320 08:43:57.213511 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/23a7dc3f-9bad-4898-82c7-203ddf385577-config\") pod \"dnsmasq-dns-86db49b7ff-8ckgn\" (UID: \"23a7dc3f-9bad-4898-82c7-203ddf385577\") " pod="openstack/dnsmasq-dns-86db49b7ff-8ckgn" Mar 20 08:43:57 crc kubenswrapper[4903]: I0320 08:43:57.213547 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zg4vt\" (UniqueName: \"kubernetes.io/projected/06e43a58-7c5f-49ac-a1f4-f2eddfd28c6b-kube-api-access-zg4vt\") pod \"ovn-northd-0\" (UID: \"06e43a58-7c5f-49ac-a1f4-f2eddfd28c6b\") " pod="openstack/ovn-northd-0" Mar 20 08:43:57 crc kubenswrapper[4903]: I0320 08:43:57.213578 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/06e43a58-7c5f-49ac-a1f4-f2eddfd28c6b-config\") pod \"ovn-northd-0\" (UID: \"06e43a58-7c5f-49ac-a1f4-f2eddfd28c6b\") " pod="openstack/ovn-northd-0" Mar 20 08:43:57 crc kubenswrapper[4903]: I0320 08:43:57.213609 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/23a7dc3f-9bad-4898-82c7-203ddf385577-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-8ckgn\" (UID: \"23a7dc3f-9bad-4898-82c7-203ddf385577\") " pod="openstack/dnsmasq-dns-86db49b7ff-8ckgn" Mar 20 08:43:57 crc kubenswrapper[4903]: I0320 08:43:57.236603 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-ch9dc" Mar 20 08:43:57 crc kubenswrapper[4903]: I0320 08:43:57.314992 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/23a7dc3f-9bad-4898-82c7-203ddf385577-config\") pod \"dnsmasq-dns-86db49b7ff-8ckgn\" (UID: \"23a7dc3f-9bad-4898-82c7-203ddf385577\") " pod="openstack/dnsmasq-dns-86db49b7ff-8ckgn" Mar 20 08:43:57 crc kubenswrapper[4903]: I0320 08:43:57.315520 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zg4vt\" (UniqueName: \"kubernetes.io/projected/06e43a58-7c5f-49ac-a1f4-f2eddfd28c6b-kube-api-access-zg4vt\") pod \"ovn-northd-0\" (UID: \"06e43a58-7c5f-49ac-a1f4-f2eddfd28c6b\") " pod="openstack/ovn-northd-0" Mar 20 08:43:57 crc kubenswrapper[4903]: I0320 08:43:57.315559 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/06e43a58-7c5f-49ac-a1f4-f2eddfd28c6b-config\") pod \"ovn-northd-0\" (UID: \"06e43a58-7c5f-49ac-a1f4-f2eddfd28c6b\") " pod="openstack/ovn-northd-0" Mar 20 08:43:57 crc kubenswrapper[4903]: I0320 08:43:57.315597 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/23a7dc3f-9bad-4898-82c7-203ddf385577-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-8ckgn\" (UID: \"23a7dc3f-9bad-4898-82c7-203ddf385577\") " pod="openstack/dnsmasq-dns-86db49b7ff-8ckgn" Mar 20 08:43:57 crc kubenswrapper[4903]: I0320 08:43:57.315633 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/06e43a58-7c5f-49ac-a1f4-f2eddfd28c6b-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"06e43a58-7c5f-49ac-a1f4-f2eddfd28c6b\") " pod="openstack/ovn-northd-0" Mar 20 08:43:57 crc kubenswrapper[4903]: I0320 08:43:57.315685 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dm6kz\" (UniqueName: \"kubernetes.io/projected/23a7dc3f-9bad-4898-82c7-203ddf385577-kube-api-access-dm6kz\") pod \"dnsmasq-dns-86db49b7ff-8ckgn\" (UID: \"23a7dc3f-9bad-4898-82c7-203ddf385577\") " pod="openstack/dnsmasq-dns-86db49b7ff-8ckgn" Mar 20 08:43:57 crc kubenswrapper[4903]: I0320 08:43:57.315737 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/06e43a58-7c5f-49ac-a1f4-f2eddfd28c6b-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"06e43a58-7c5f-49ac-a1f4-f2eddfd28c6b\") " pod="openstack/ovn-northd-0" Mar 20 08:43:57 crc kubenswrapper[4903]: I0320 08:43:57.315760 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/23a7dc3f-9bad-4898-82c7-203ddf385577-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-8ckgn\" (UID: \"23a7dc3f-9bad-4898-82c7-203ddf385577\") " pod="openstack/dnsmasq-dns-86db49b7ff-8ckgn" Mar 20 08:43:57 crc kubenswrapper[4903]: I0320 08:43:57.315792 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/23a7dc3f-9bad-4898-82c7-203ddf385577-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-8ckgn\" (UID: \"23a7dc3f-9bad-4898-82c7-203ddf385577\") " pod="openstack/dnsmasq-dns-86db49b7ff-8ckgn" Mar 20 08:43:57 crc kubenswrapper[4903]: I0320 08:43:57.315818 4903 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06e43a58-7c5f-49ac-a1f4-f2eddfd28c6b-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"06e43a58-7c5f-49ac-a1f4-f2eddfd28c6b\") " pod="openstack/ovn-northd-0" Mar 20 08:43:57 crc kubenswrapper[4903]: I0320 08:43:57.315861 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/06e43a58-7c5f-49ac-a1f4-f2eddfd28c6b-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"06e43a58-7c5f-49ac-a1f4-f2eddfd28c6b\") " pod="openstack/ovn-northd-0" Mar 20 08:43:57 crc kubenswrapper[4903]: I0320 08:43:57.315899 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/06e43a58-7c5f-49ac-a1f4-f2eddfd28c6b-scripts\") pod \"ovn-northd-0\" (UID: \"06e43a58-7c5f-49ac-a1f4-f2eddfd28c6b\") " pod="openstack/ovn-northd-0" Mar 20 08:43:57 crc kubenswrapper[4903]: I0320 08:43:57.316023 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/23a7dc3f-9bad-4898-82c7-203ddf385577-config\") pod \"dnsmasq-dns-86db49b7ff-8ckgn\" (UID: \"23a7dc3f-9bad-4898-82c7-203ddf385577\") " pod="openstack/dnsmasq-dns-86db49b7ff-8ckgn" Mar 20 08:43:57 crc kubenswrapper[4903]: I0320 08:43:57.317854 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/23a7dc3f-9bad-4898-82c7-203ddf385577-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-8ckgn\" (UID: \"23a7dc3f-9bad-4898-82c7-203ddf385577\") " pod="openstack/dnsmasq-dns-86db49b7ff-8ckgn" Mar 20 08:43:57 crc kubenswrapper[4903]: I0320 08:43:57.318183 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/23a7dc3f-9bad-4898-82c7-203ddf385577-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-8ckgn\" (UID: \"23a7dc3f-9bad-4898-82c7-203ddf385577\") " pod="openstack/dnsmasq-dns-86db49b7ff-8ckgn" Mar 20 08:43:57 crc kubenswrapper[4903]: I0320 08:43:57.320683 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/23a7dc3f-9bad-4898-82c7-203ddf385577-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-8ckgn\" (UID: \"23a7dc3f-9bad-4898-82c7-203ddf385577\") " pod="openstack/dnsmasq-dns-86db49b7ff-8ckgn" Mar 20 08:43:57 crc kubenswrapper[4903]: I0320 08:43:57.320811 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/06e43a58-7c5f-49ac-a1f4-f2eddfd28c6b-config\") pod \"ovn-northd-0\" (UID: \"06e43a58-7c5f-49ac-a1f4-f2eddfd28c6b\") " pod="openstack/ovn-northd-0" Mar 20 08:43:57 crc kubenswrapper[4903]: I0320 08:43:57.321171 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/06e43a58-7c5f-49ac-a1f4-f2eddfd28c6b-scripts\") pod \"ovn-northd-0\" (UID: \"06e43a58-7c5f-49ac-a1f4-f2eddfd28c6b\") " pod="openstack/ovn-northd-0" Mar 20 08:43:57 crc kubenswrapper[4903]: I0320 08:43:57.324324 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/06e43a58-7c5f-49ac-a1f4-f2eddfd28c6b-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"06e43a58-7c5f-49ac-a1f4-f2eddfd28c6b\") " pod="openstack/ovn-northd-0" Mar 20 08:43:57 crc kubenswrapper[4903]: I0320 08:43:57.330505 4903 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/06e43a58-7c5f-49ac-a1f4-f2eddfd28c6b-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"06e43a58-7c5f-49ac-a1f4-f2eddfd28c6b\") " pod="openstack/ovn-northd-0" Mar 20 08:43:57 crc kubenswrapper[4903]: I0320 08:43:57.332683 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06e43a58-7c5f-49ac-a1f4-f2eddfd28c6b-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"06e43a58-7c5f-49ac-a1f4-f2eddfd28c6b\") " pod="openstack/ovn-northd-0" Mar 20 08:43:57 crc kubenswrapper[4903]: I0320 08:43:57.344381 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zg4vt\" (UniqueName: \"kubernetes.io/projected/06e43a58-7c5f-49ac-a1f4-f2eddfd28c6b-kube-api-access-zg4vt\") pod \"ovn-northd-0\" (UID: \"06e43a58-7c5f-49ac-a1f4-f2eddfd28c6b\") " pod="openstack/ovn-northd-0" Mar 20 08:43:57 crc kubenswrapper[4903]: I0320 08:43:57.345271 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dm6kz\" (UniqueName: \"kubernetes.io/projected/23a7dc3f-9bad-4898-82c7-203ddf385577-kube-api-access-dm6kz\") pod \"dnsmasq-dns-86db49b7ff-8ckgn\" (UID: \"23a7dc3f-9bad-4898-82c7-203ddf385577\") " pod="openstack/dnsmasq-dns-86db49b7ff-8ckgn" Mar 20 08:43:57 crc kubenswrapper[4903]: I0320 08:43:57.347440 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/06e43a58-7c5f-49ac-a1f4-f2eddfd28c6b-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"06e43a58-7c5f-49ac-a1f4-f2eddfd28c6b\") " pod="openstack/ovn-northd-0" Mar 20 08:43:57 crc kubenswrapper[4903]: I0320 08:43:57.396693 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-8ckgn" Mar 20 08:43:57 crc kubenswrapper[4903]: I0320 08:43:57.445996 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-666b6646f7-7wjsx" podUID="57f5e312-d8c8-420d-8655-8ace1519bdda" containerName="dnsmasq-dns" containerID="cri-o://5396c93a5f4b2eab3e7f1e45e88eb26bba78fa715bbf2e3a67f739a3494f915c" gracePeriod=10 Mar 20 08:43:57 crc kubenswrapper[4903]: I0320 08:43:57.448740 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-7wjsx" event={"ID":"57f5e312-d8c8-420d-8655-8ace1519bdda","Type":"ContainerStarted","Data":"5396c93a5f4b2eab3e7f1e45e88eb26bba78fa715bbf2e3a67f739a3494f915c"} Mar 20 08:43:57 crc kubenswrapper[4903]: I0320 08:43:57.448843 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-666b6646f7-7wjsx" Mar 20 08:43:57 crc kubenswrapper[4903]: I0320 08:43:57.464657 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57d769cc4f-s7lkc" podUID="4b2ae24d-7a7f-450d-b9f0-29773070bfba" containerName="dnsmasq-dns" containerID="cri-o://23bedc16d61bdb994fa9a1e870a34f337f28c71f62f9776a75b86f2b40e6068d" gracePeriod=10 Mar 20 08:43:57 crc kubenswrapper[4903]: I0320 08:43:57.466543 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-s7lkc" event={"ID":"4b2ae24d-7a7f-450d-b9f0-29773070bfba","Type":"ContainerStarted","Data":"23bedc16d61bdb994fa9a1e870a34f337f28c71f62f9776a75b86f2b40e6068d"} Mar 20 08:43:57 crc kubenswrapper[4903]: I0320 08:43:57.470167 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57d769cc4f-s7lkc" Mar 20 08:43:57 crc kubenswrapper[4903]: I0320 08:43:57.492872 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57d769cc4f-s7lkc" podStartSLOduration=3.585277818 podStartE2EDuration="43.492849802s" podCreationTimestamp="2026-03-20 08:43:14 +0000 UTC" firstStartedPulling="2026-03-20 08:43:16.065303554 +0000 UTC m=+1221.282203869" lastFinishedPulling="2026-03-20 08:43:55.972875508 +0000 UTC m=+1261.189775853" observedRunningTime="2026-03-20 08:43:57.486957679 +0000 UTC m=+1262.703857994" watchObservedRunningTime="2026-03-20 08:43:57.492849802 +0000 UTC m=+1262.709750117" Mar 20 08:43:57 crc kubenswrapper[4903]: I0320 08:43:57.493398 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-666b6646f7-7wjsx" podStartSLOduration=2.262292713 podStartE2EDuration="43.493393135s" podCreationTimestamp="2026-03-20 08:43:14 +0000 UTC" firstStartedPulling="2026-03-20 08:43:14.741423827 +0000 UTC m=+1219.958324152" lastFinishedPulling="2026-03-20 08:43:55.972524259 +0000 UTC m=+1261.189424574" observedRunningTime="2026-03-20 08:43:57.470315165 +0000 UTC m=+1262.687215480" watchObservedRunningTime="2026-03-20 08:43:57.493393135 +0000 UTC m=+1262.710293450" Mar 20 08:43:57 crc kubenswrapper[4903]: I0320 08:43:57.523918 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Mar 20 08:43:57 crc kubenswrapper[4903]: I0320 08:43:57.684960 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-rjxxk"] Mar 20 08:43:57 crc kubenswrapper[4903]: I0320 08:43:57.817836 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-ch9dc"] Mar 20 08:43:57 crc kubenswrapper[4903]: I0320 08:43:57.865723 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Mar 20 08:43:57 crc kubenswrapper[4903]: I0320 08:43:57.866224 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Mar 20 08:43:57 crc kubenswrapper[4903]: I0320 08:43:57.974529 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-8ckgn"] Mar 20 08:43:58 crc kubenswrapper[4903]: I0320 08:43:58.116741 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Mar 20 08:43:58 crc kubenswrapper[4903]: W0320 08:43:58.126678 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod06e43a58_7c5f_49ac_a1f4_f2eddfd28c6b.slice/crio-482479ce04d5f833f81d72dc75f610bb8c7a4314fd274b583ad244727ea260f6 WatchSource:0}: Error finding container 482479ce04d5f833f81d72dc75f610bb8c7a4314fd274b583ad244727ea260f6: Status 404 returned error can't find the container with id 482479ce04d5f833f81d72dc75f610bb8c7a4314fd274b583ad244727ea260f6 Mar 20 08:43:58 crc kubenswrapper[4903]: I0320 08:43:58.184869 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-7wjsx" Mar 20 08:43:58 crc kubenswrapper[4903]: I0320 08:43:58.192064 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-s7lkc" Mar 20 08:43:58 crc kubenswrapper[4903]: I0320 08:43:58.239738 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4b2ae24d-7a7f-450d-b9f0-29773070bfba-dns-svc\") pod \"4b2ae24d-7a7f-450d-b9f0-29773070bfba\" (UID: \"4b2ae24d-7a7f-450d-b9f0-29773070bfba\") " Mar 20 08:43:58 crc kubenswrapper[4903]: I0320 08:43:58.240837 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m4vm7\" (UniqueName: \"kubernetes.io/projected/57f5e312-d8c8-420d-8655-8ace1519bdda-kube-api-access-m4vm7\") pod \"57f5e312-d8c8-420d-8655-8ace1519bdda\" (UID: \"57f5e312-d8c8-420d-8655-8ace1519bdda\") " Mar 20 08:43:58 crc kubenswrapper[4903]: I0320 08:43:58.241382 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tzv9j\" (UniqueName: \"kubernetes.io/projected/4b2ae24d-7a7f-450d-b9f0-29773070bfba-kube-api-access-tzv9j\") pod \"4b2ae24d-7a7f-450d-b9f0-29773070bfba\" (UID: \"4b2ae24d-7a7f-450d-b9f0-29773070bfba\") " Mar 20 08:43:58 crc kubenswrapper[4903]: I0320 08:43:58.241494 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4b2ae24d-7a7f-450d-b9f0-29773070bfba-config\") pod \"4b2ae24d-7a7f-450d-b9f0-29773070bfba\" (UID: \"4b2ae24d-7a7f-450d-b9f0-29773070bfba\") " Mar 20 08:43:58 crc kubenswrapper[4903]: I0320 08:43:58.241592 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/57f5e312-d8c8-420d-8655-8ace1519bdda-config\") pod \"57f5e312-d8c8-420d-8655-8ace1519bdda\" (UID: \"57f5e312-d8c8-420d-8655-8ace1519bdda\") " Mar 20 08:43:58 crc kubenswrapper[4903]: I0320 08:43:58.241760 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/57f5e312-d8c8-420d-8655-8ace1519bdda-dns-svc\") pod \"57f5e312-d8c8-420d-8655-8ace1519bdda\" (UID: \"57f5e312-d8c8-420d-8655-8ace1519bdda\") " Mar 20 08:43:58 crc kubenswrapper[4903]: I0320 08:43:58.256339 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57f5e312-d8c8-420d-8655-8ace1519bdda-kube-api-access-m4vm7" (OuterVolumeSpecName: "kube-api-access-m4vm7") pod "57f5e312-d8c8-420d-8655-8ace1519bdda" (UID: "57f5e312-d8c8-420d-8655-8ace1519bdda"). InnerVolumeSpecName "kube-api-access-m4vm7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:43:58 crc kubenswrapper[4903]: I0320 08:43:58.261308 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b2ae24d-7a7f-450d-b9f0-29773070bfba-kube-api-access-tzv9j" (OuterVolumeSpecName: "kube-api-access-tzv9j") pod "4b2ae24d-7a7f-450d-b9f0-29773070bfba" (UID: "4b2ae24d-7a7f-450d-b9f0-29773070bfba"). InnerVolumeSpecName "kube-api-access-tzv9j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:43:58 crc kubenswrapper[4903]: I0320 08:43:58.311552 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/57f5e312-d8c8-420d-8655-8ace1519bdda-config" (OuterVolumeSpecName: "config") pod "57f5e312-d8c8-420d-8655-8ace1519bdda" (UID: "57f5e312-d8c8-420d-8655-8ace1519bdda"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:43:58 crc kubenswrapper[4903]: I0320 08:43:58.313852 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4b2ae24d-7a7f-450d-b9f0-29773070bfba-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "4b2ae24d-7a7f-450d-b9f0-29773070bfba" (UID: "4b2ae24d-7a7f-450d-b9f0-29773070bfba"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:43:58 crc kubenswrapper[4903]: I0320 08:43:58.315276 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/57f5e312-d8c8-420d-8655-8ace1519bdda-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "57f5e312-d8c8-420d-8655-8ace1519bdda" (UID: "57f5e312-d8c8-420d-8655-8ace1519bdda"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:43:58 crc kubenswrapper[4903]: I0320 08:43:58.318296 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4b2ae24d-7a7f-450d-b9f0-29773070bfba-config" (OuterVolumeSpecName: "config") pod "4b2ae24d-7a7f-450d-b9f0-29773070bfba" (UID: "4b2ae24d-7a7f-450d-b9f0-29773070bfba"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:43:58 crc kubenswrapper[4903]: I0320 08:43:58.343203 4903 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4b2ae24d-7a7f-450d-b9f0-29773070bfba-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 08:43:58 crc kubenswrapper[4903]: I0320 08:43:58.343430 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m4vm7\" (UniqueName: \"kubernetes.io/projected/57f5e312-d8c8-420d-8655-8ace1519bdda-kube-api-access-m4vm7\") on node \"crc\" DevicePath \"\"" Mar 20 08:43:58 crc kubenswrapper[4903]: I0320 08:43:58.343498 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tzv9j\" (UniqueName: \"kubernetes.io/projected/4b2ae24d-7a7f-450d-b9f0-29773070bfba-kube-api-access-tzv9j\") on node \"crc\" DevicePath \"\"" Mar 20 08:43:58 crc kubenswrapper[4903]: I0320 08:43:58.346191 4903 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4b2ae24d-7a7f-450d-b9f0-29773070bfba-config\") on node \"crc\" DevicePath \"\"" Mar 20 08:43:58 crc kubenswrapper[4903]: I0320 08:43:58.346227 4903 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/57f5e312-d8c8-420d-8655-8ace1519bdda-config\") on node \"crc\" DevicePath \"\"" Mar 20 08:43:58 crc kubenswrapper[4903]: I0320 08:43:58.346237 4903 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/57f5e312-d8c8-420d-8655-8ace1519bdda-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 08:43:58 crc kubenswrapper[4903]: I0320 08:43:58.469418 4903 generic.go:334] "Generic (PLEG): container finished" podID="4b2ae24d-7a7f-450d-b9f0-29773070bfba" containerID="23bedc16d61bdb994fa9a1e870a34f337f28c71f62f9776a75b86f2b40e6068d" exitCode=0 Mar 20 08:43:58 crc kubenswrapper[4903]: I0320 08:43:58.469498 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-s7lkc" Mar 20 08:43:58 crc kubenswrapper[4903]: I0320 08:43:58.469506 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-s7lkc" event={"ID":"4b2ae24d-7a7f-450d-b9f0-29773070bfba","Type":"ContainerDied","Data":"23bedc16d61bdb994fa9a1e870a34f337f28c71f62f9776a75b86f2b40e6068d"} Mar 20 08:43:58 crc kubenswrapper[4903]: I0320 08:43:58.470208 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-s7lkc" event={"ID":"4b2ae24d-7a7f-450d-b9f0-29773070bfba","Type":"ContainerDied","Data":"4fa5272493ac50700db60e02bd4f9eeff67f5018dded459ea2823554e363b304"} Mar 20 08:43:58 crc kubenswrapper[4903]: I0320 08:43:58.470253 4903 scope.go:117] "RemoveContainer" containerID="23bedc16d61bdb994fa9a1e870a34f337f28c71f62f9776a75b86f2b40e6068d" Mar 20 08:43:58 crc kubenswrapper[4903]: I0320 08:43:58.474597 4903 generic.go:334] "Generic (PLEG): container finished" podID="9e171236-e1e4-41f8-bf89-d162cab4d02b" containerID="1c7859a11f841171a5152e58a2350792815cc733a25210225acfc3d6af166932" exitCode=0 Mar 20 08:43:58 crc kubenswrapper[4903]: I0320 08:43:58.474704 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-rjxxk" event={"ID":"9e171236-e1e4-41f8-bf89-d162cab4d02b","Type":"ContainerDied","Data":"1c7859a11f841171a5152e58a2350792815cc733a25210225acfc3d6af166932"} Mar 20 08:43:58 crc kubenswrapper[4903]: I0320 08:43:58.474740 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-rjxxk" event={"ID":"9e171236-e1e4-41f8-bf89-d162cab4d02b","Type":"ContainerStarted","Data":"bdff9c190cc14ae77a67d8cf3999e051d749ceb99270dade70e0ae12715f410e"} Mar 20 08:43:58 crc kubenswrapper[4903]: I0320 08:43:58.482276 4903 generic.go:334] "Generic (PLEG): container finished" podID="57f5e312-d8c8-420d-8655-8ace1519bdda" containerID="5396c93a5f4b2eab3e7f1e45e88eb26bba78fa715bbf2e3a67f739a3494f915c" exitCode=0 Mar 20 08:43:58 crc kubenswrapper[4903]: I0320 08:43:58.482529 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-7wjsx" event={"ID":"57f5e312-d8c8-420d-8655-8ace1519bdda","Type":"ContainerDied","Data":"5396c93a5f4b2eab3e7f1e45e88eb26bba78fa715bbf2e3a67f739a3494f915c"} Mar 20 08:43:58 crc kubenswrapper[4903]: I0320 08:43:58.482583 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-7wjsx" event={"ID":"57f5e312-d8c8-420d-8655-8ace1519bdda","Type":"ContainerDied","Data":"ccd168348dbdaf720e354fb128134a658f4e7161d666ec8684ae580a49d3876c"} Mar 20 08:43:58 crc kubenswrapper[4903]: I0320 08:43:58.482668 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-7wjsx" Mar 20 08:43:58 crc kubenswrapper[4903]: I0320 08:43:58.484983 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-ch9dc" event={"ID":"f390d60f-9967-4869-b09f-3cea4570186e","Type":"ContainerStarted","Data":"16ea7c452104939a759a6fd64f9392015afdf41df82ecdb14f0a860eb6f425c4"} Mar 20 08:43:58 crc kubenswrapper[4903]: I0320 08:43:58.485050 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-ch9dc" event={"ID":"f390d60f-9967-4869-b09f-3cea4570186e","Type":"ContainerStarted","Data":"5fbdbd73d47771816116bdfa56d33c2a46a6b2e86c66518ae3fbc9ff5e5493d2"} Mar 20 08:43:58 crc kubenswrapper[4903]: I0320 08:43:58.490276 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"06e43a58-7c5f-49ac-a1f4-f2eddfd28c6b","Type":"ContainerStarted","Data":"482479ce04d5f833f81d72dc75f610bb8c7a4314fd274b583ad244727ea260f6"} Mar 20 08:43:58 crc kubenswrapper[4903]: I0320 08:43:58.492245 4903 generic.go:334] "Generic (PLEG): container finished" podID="23a7dc3f-9bad-4898-82c7-203ddf385577" containerID="072f58a54007231ea14f47acf4fdefbf061cf954c7be68256801c41077a3671a" exitCode=0 Mar 20 08:43:58 crc kubenswrapper[4903]: I0320 08:43:58.493458 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-8ckgn" event={"ID":"23a7dc3f-9bad-4898-82c7-203ddf385577","Type":"ContainerDied","Data":"072f58a54007231ea14f47acf4fdefbf061cf954c7be68256801c41077a3671a"} Mar 20 08:43:58 crc kubenswrapper[4903]: I0320 08:43:58.493486 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-8ckgn" event={"ID":"23a7dc3f-9bad-4898-82c7-203ddf385577","Type":"ContainerStarted","Data":"15d075991e4d13689954070fd2191a6eb2bac25bebb42fd4c64cadc550e6ce1c"} Mar 20 08:43:58 crc kubenswrapper[4903]: I0320 08:43:58.504419 4903 scope.go:117] "RemoveContainer" containerID="84fa0f44348f75975a81384033b6a645364aaae3c425fdc53caea28c8b3bef2e" Mar 20 08:43:58 crc kubenswrapper[4903]: I0320 08:43:58.540237 4903 scope.go:117] "RemoveContainer" containerID="23bedc16d61bdb994fa9a1e870a34f337f28c71f62f9776a75b86f2b40e6068d" Mar 20 08:43:58 crc kubenswrapper[4903]: E0320 08:43:58.541191 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"23bedc16d61bdb994fa9a1e870a34f337f28c71f62f9776a75b86f2b40e6068d\": container with ID starting with 23bedc16d61bdb994fa9a1e870a34f337f28c71f62f9776a75b86f2b40e6068d not found: ID does not exist" containerID="23bedc16d61bdb994fa9a1e870a34f337f28c71f62f9776a75b86f2b40e6068d" Mar 20 08:43:58 crc kubenswrapper[4903]: I0320 08:43:58.541254 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"23bedc16d61bdb994fa9a1e870a34f337f28c71f62f9776a75b86f2b40e6068d"} err="failed to get container status \"23bedc16d61bdb994fa9a1e870a34f337f28c71f62f9776a75b86f2b40e6068d\": rpc error: code = NotFound desc = could not find container \"23bedc16d61bdb994fa9a1e870a34f337f28c71f62f9776a75b86f2b40e6068d\": container with ID starting with 23bedc16d61bdb994fa9a1e870a34f337f28c71f62f9776a75b86f2b40e6068d not found: ID does not exist" Mar 20 08:43:58 crc kubenswrapper[4903]: I0320 08:43:58.541286 4903 scope.go:117] "RemoveContainer" containerID="84fa0f44348f75975a81384033b6a645364aaae3c425fdc53caea28c8b3bef2e" Mar 20 08:43:58 crc kubenswrapper[4903]: E0320 08:43:58.542151 4903 
log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"84fa0f44348f75975a81384033b6a645364aaae3c425fdc53caea28c8b3bef2e\": container with ID starting with 84fa0f44348f75975a81384033b6a645364aaae3c425fdc53caea28c8b3bef2e not found: ID does not exist" containerID="84fa0f44348f75975a81384033b6a645364aaae3c425fdc53caea28c8b3bef2e" Mar 20 08:43:58 crc kubenswrapper[4903]: I0320 08:43:58.542211 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84fa0f44348f75975a81384033b6a645364aaae3c425fdc53caea28c8b3bef2e"} err="failed to get container status \"84fa0f44348f75975a81384033b6a645364aaae3c425fdc53caea28c8b3bef2e\": rpc error: code = NotFound desc = could not find container \"84fa0f44348f75975a81384033b6a645364aaae3c425fdc53caea28c8b3bef2e\": container with ID starting with 84fa0f44348f75975a81384033b6a645364aaae3c425fdc53caea28c8b3bef2e not found: ID does not exist" Mar 20 08:43:58 crc kubenswrapper[4903]: I0320 08:43:58.542243 4903 scope.go:117] "RemoveContainer" containerID="5396c93a5f4b2eab3e7f1e45e88eb26bba78fa715bbf2e3a67f739a3494f915c" Mar 20 08:43:58 crc kubenswrapper[4903]: I0320 08:43:58.558024 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-ch9dc" podStartSLOduration=2.557992038 podStartE2EDuration="2.557992038s" podCreationTimestamp="2026-03-20 08:43:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:43:58.550007074 +0000 UTC m=+1263.766907389" watchObservedRunningTime="2026-03-20 08:43:58.557992038 +0000 UTC m=+1263.774892363" Mar 20 08:43:58 crc kubenswrapper[4903]: I0320 08:43:58.587017 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-s7lkc"] Mar 20 08:43:58 crc kubenswrapper[4903]: I0320 08:43:58.593609 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-s7lkc"] Mar 20 08:43:58 crc kubenswrapper[4903]: I0320 08:43:58.609163 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-7wjsx"] Mar 20 08:43:58 crc kubenswrapper[4903]: I0320 08:43:58.614302 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-7wjsx"] Mar 20 08:43:58 crc kubenswrapper[4903]: I0320 08:43:58.624857 4903 scope.go:117] "RemoveContainer" containerID="5162b7b488fca429f42fa4bdaef26465012b9df7e4f3d77b2ea3bfd3abba483b" Mar 20 08:43:58 crc kubenswrapper[4903]: I0320 08:43:58.670300 4903 scope.go:117] "RemoveContainer" containerID="5396c93a5f4b2eab3e7f1e45e88eb26bba78fa715bbf2e3a67f739a3494f915c" Mar 20 08:43:58 crc kubenswrapper[4903]: E0320 08:43:58.674147 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5396c93a5f4b2eab3e7f1e45e88eb26bba78fa715bbf2e3a67f739a3494f915c\": container with ID starting with 5396c93a5f4b2eab3e7f1e45e88eb26bba78fa715bbf2e3a67f739a3494f915c not found: ID does not exist" containerID="5396c93a5f4b2eab3e7f1e45e88eb26bba78fa715bbf2e3a67f739a3494f915c" Mar 20 08:43:58 crc kubenswrapper[4903]: I0320 08:43:58.674181 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5396c93a5f4b2eab3e7f1e45e88eb26bba78fa715bbf2e3a67f739a3494f915c"} err="failed to get container status \"5396c93a5f4b2eab3e7f1e45e88eb26bba78fa715bbf2e3a67f739a3494f915c\": 
rpc error: code = NotFound desc = could not find container \"5396c93a5f4b2eab3e7f1e45e88eb26bba78fa715bbf2e3a67f739a3494f915c\": container with ID starting with 5396c93a5f4b2eab3e7f1e45e88eb26bba78fa715bbf2e3a67f739a3494f915c not found: ID does not exist" Mar 20 08:43:58 crc kubenswrapper[4903]: I0320 08:43:58.674204 4903 scope.go:117] "RemoveContainer" containerID="5162b7b488fca429f42fa4bdaef26465012b9df7e4f3d77b2ea3bfd3abba483b" Mar 20 08:43:58 crc kubenswrapper[4903]: E0320 08:43:58.675956 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5162b7b488fca429f42fa4bdaef26465012b9df7e4f3d77b2ea3bfd3abba483b\": container with ID starting with 5162b7b488fca429f42fa4bdaef26465012b9df7e4f3d77b2ea3bfd3abba483b not found: ID does not exist" containerID="5162b7b488fca429f42fa4bdaef26465012b9df7e4f3d77b2ea3bfd3abba483b" Mar 20 08:43:58 crc kubenswrapper[4903]: I0320 08:43:58.676018 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5162b7b488fca429f42fa4bdaef26465012b9df7e4f3d77b2ea3bfd3abba483b"} err="failed to get container status \"5162b7b488fca429f42fa4bdaef26465012b9df7e4f3d77b2ea3bfd3abba483b\": rpc error: code = NotFound desc = could not find container \"5162b7b488fca429f42fa4bdaef26465012b9df7e4f3d77b2ea3bfd3abba483b\": container with ID starting with 5162b7b488fca429f42fa4bdaef26465012b9df7e4f3d77b2ea3bfd3abba483b not found: ID does not exist" Mar 20 08:43:59 crc kubenswrapper[4903]: I0320 08:43:59.341875 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Mar 20 08:43:59 crc kubenswrapper[4903]: I0320 08:43:59.342305 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Mar 20 08:43:59 crc kubenswrapper[4903]: I0320 08:43:59.443333 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Mar 20 08:43:59 crc kubenswrapper[4903]: I0320 08:43:59.516710 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4b2ae24d-7a7f-450d-b9f0-29773070bfba" path="/var/lib/kubelet/pods/4b2ae24d-7a7f-450d-b9f0-29773070bfba/volumes" Mar 20 08:43:59 crc kubenswrapper[4903]: I0320 08:43:59.517572 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57f5e312-d8c8-420d-8655-8ace1519bdda" path="/var/lib/kubelet/pods/57f5e312-d8c8-420d-8655-8ace1519bdda/volumes" Mar 20 08:43:59 crc kubenswrapper[4903]: I0320 08:43:59.521491 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-8ckgn" event={"ID":"23a7dc3f-9bad-4898-82c7-203ddf385577","Type":"ContainerStarted","Data":"754433d8761ac2e850ef6af2154918200f4ea08267f1a9c7fa6c9a0b4a9d2964"} Mar 20 08:43:59 crc kubenswrapper[4903]: I0320 08:43:59.523208 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-86db49b7ff-8ckgn" Mar 20 08:43:59 crc kubenswrapper[4903]: I0320 08:43:59.532018 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-rjxxk" event={"ID":"9e171236-e1e4-41f8-bf89-d162cab4d02b","Type":"ContainerStarted","Data":"0bbfa031f2f30749f12e2322fef6843f9e23c528e4d70527fcb872d46225042f"} Mar 20 08:43:59 crc kubenswrapper[4903]: I0320 08:43:59.556069 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-86db49b7ff-8ckgn" podStartSLOduration=2.556000486 
podStartE2EDuration="2.556000486s" podCreationTimestamp="2026-03-20 08:43:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:43:59.545812488 +0000 UTC m=+1264.762712803" watchObservedRunningTime="2026-03-20 08:43:59.556000486 +0000 UTC m=+1264.772900801" Mar 20 08:43:59 crc kubenswrapper[4903]: I0320 08:43:59.583758 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7fd796d7df-rjxxk" podStartSLOduration=3.583739569 podStartE2EDuration="3.583739569s" podCreationTimestamp="2026-03-20 08:43:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:43:59.573577331 +0000 UTC m=+1264.790477646" watchObservedRunningTime="2026-03-20 08:43:59.583739569 +0000 UTC m=+1264.800639874" Mar 20 08:43:59 crc kubenswrapper[4903]: I0320 08:43:59.651672 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Mar 20 08:44:00 crc kubenswrapper[4903]: I0320 08:44:00.151512 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566604-q559d"] Mar 20 08:44:00 crc kubenswrapper[4903]: E0320 08:44:00.151863 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b2ae24d-7a7f-450d-b9f0-29773070bfba" containerName="init" Mar 20 08:44:00 crc kubenswrapper[4903]: I0320 08:44:00.151876 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b2ae24d-7a7f-450d-b9f0-29773070bfba" containerName="init" Mar 20 08:44:00 crc kubenswrapper[4903]: E0320 08:44:00.151904 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57f5e312-d8c8-420d-8655-8ace1519bdda" containerName="init" Mar 20 08:44:00 crc kubenswrapper[4903]: I0320 08:44:00.151910 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="57f5e312-d8c8-420d-8655-8ace1519bdda" containerName="init" Mar 20 08:44:00 crc kubenswrapper[4903]: E0320 08:44:00.151936 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57f5e312-d8c8-420d-8655-8ace1519bdda" containerName="dnsmasq-dns" Mar 20 08:44:00 crc kubenswrapper[4903]: I0320 08:44:00.151945 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="57f5e312-d8c8-420d-8655-8ace1519bdda" containerName="dnsmasq-dns" Mar 20 08:44:00 crc kubenswrapper[4903]: E0320 08:44:00.151960 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b2ae24d-7a7f-450d-b9f0-29773070bfba" containerName="dnsmasq-dns" Mar 20 08:44:00 crc kubenswrapper[4903]: I0320 08:44:00.151966 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b2ae24d-7a7f-450d-b9f0-29773070bfba" containerName="dnsmasq-dns" Mar 20 08:44:00 crc kubenswrapper[4903]: I0320 08:44:00.152154 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b2ae24d-7a7f-450d-b9f0-29773070bfba" containerName="dnsmasq-dns" Mar 20 08:44:00 crc kubenswrapper[4903]: I0320 08:44:00.152178 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="57f5e312-d8c8-420d-8655-8ace1519bdda" containerName="dnsmasq-dns" Mar 20 08:44:00 crc kubenswrapper[4903]: I0320 08:44:00.152687 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566604-q559d" Mar 20 08:44:00 crc kubenswrapper[4903]: I0320 08:44:00.154832 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-smc2q" Mar 20 08:44:00 crc kubenswrapper[4903]: I0320 08:44:00.155789 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 08:44:00 crc kubenswrapper[4903]: I0320 08:44:00.156621 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 08:44:00 crc kubenswrapper[4903]: I0320 08:44:00.161406 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566604-q559d"] Mar 20 08:44:00 crc kubenswrapper[4903]: I0320 08:44:00.190569 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sb4qg\" (UniqueName: \"kubernetes.io/projected/1dacbc55-264e-4175-a03b-f1f4d135ef5e-kube-api-access-sb4qg\") pod \"auto-csr-approver-29566604-q559d\" (UID: \"1dacbc55-264e-4175-a03b-f1f4d135ef5e\") " pod="openshift-infra/auto-csr-approver-29566604-q559d" Mar 20 08:44:00 crc kubenswrapper[4903]: I0320 08:44:00.292290 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sb4qg\" (UniqueName: \"kubernetes.io/projected/1dacbc55-264e-4175-a03b-f1f4d135ef5e-kube-api-access-sb4qg\") pod \"auto-csr-approver-29566604-q559d\" (UID: \"1dacbc55-264e-4175-a03b-f1f4d135ef5e\") " pod="openshift-infra/auto-csr-approver-29566604-q559d" Mar 20 08:44:00 crc kubenswrapper[4903]: I0320 08:44:00.315121 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sb4qg\" (UniqueName: \"kubernetes.io/projected/1dacbc55-264e-4175-a03b-f1f4d135ef5e-kube-api-access-sb4qg\") pod \"auto-csr-approver-29566604-q559d\" (UID: \"1dacbc55-264e-4175-a03b-f1f4d135ef5e\") " pod="openshift-infra/auto-csr-approver-29566604-q559d" Mar 20 08:44:00 crc kubenswrapper[4903]: I0320 08:44:00.472998 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566604-q559d" Mar 20 08:44:00 crc kubenswrapper[4903]: I0320 08:44:00.542624 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7fd796d7df-rjxxk" Mar 20 08:44:00 crc kubenswrapper[4903]: I0320 08:44:00.721337 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566604-q559d"] Mar 20 08:44:01 crc kubenswrapper[4903]: I0320 08:44:01.552569 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566604-q559d" event={"ID":"1dacbc55-264e-4175-a03b-f1f4d135ef5e","Type":"ContainerStarted","Data":"be53bb48cf2d9b77b502cd40543eb429f91d2117fa51dea123ceb1cf80d97ab7"} Mar 20 08:44:01 crc kubenswrapper[4903]: I0320 08:44:01.962651 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Mar 20 08:44:02 crc kubenswrapper[4903]: I0320 08:44:02.219843 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-rjxxk"] Mar 20 08:44:02 crc kubenswrapper[4903]: I0320 08:44:02.305424 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-698758b865-gqltz"] Mar 20 08:44:02 crc kubenswrapper[4903]: I0320 08:44:02.306613 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-gqltz" Mar 20 08:44:02 crc kubenswrapper[4903]: I0320 08:44:02.349196 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-gqltz"] Mar 20 08:44:02 crc kubenswrapper[4903]: I0320 08:44:02.433291 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6de2d5fb-9f92-4a35-8264-48353a33895a-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-gqltz\" (UID: \"6de2d5fb-9f92-4a35-8264-48353a33895a\") " pod="openstack/dnsmasq-dns-698758b865-gqltz" Mar 20 08:44:02 crc kubenswrapper[4903]: I0320 08:44:02.433364 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6f4q8\" (UniqueName: \"kubernetes.io/projected/6de2d5fb-9f92-4a35-8264-48353a33895a-kube-api-access-6f4q8\") pod \"dnsmasq-dns-698758b865-gqltz\" (UID: \"6de2d5fb-9f92-4a35-8264-48353a33895a\") " pod="openstack/dnsmasq-dns-698758b865-gqltz" Mar 20 08:44:02 crc kubenswrapper[4903]: I0320 08:44:02.433397 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6de2d5fb-9f92-4a35-8264-48353a33895a-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-gqltz\" (UID: \"6de2d5fb-9f92-4a35-8264-48353a33895a\") " pod="openstack/dnsmasq-dns-698758b865-gqltz" Mar 20 08:44:02 crc kubenswrapper[4903]: I0320 08:44:02.433430 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6de2d5fb-9f92-4a35-8264-48353a33895a-dns-svc\") pod \"dnsmasq-dns-698758b865-gqltz\" (UID: \"6de2d5fb-9f92-4a35-8264-48353a33895a\") " pod="openstack/dnsmasq-dns-698758b865-gqltz" Mar 20 08:44:02 crc kubenswrapper[4903]: I0320 08:44:02.433447 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6de2d5fb-9f92-4a35-8264-48353a33895a-config\") pod \"dnsmasq-dns-698758b865-gqltz\" (UID: \"6de2d5fb-9f92-4a35-8264-48353a33895a\") " pod="openstack/dnsmasq-dns-698758b865-gqltz" Mar 20 08:44:02 crc kubenswrapper[4903]: I0320 08:44:02.534893 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6de2d5fb-9f92-4a35-8264-48353a33895a-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-gqltz\" (UID: \"6de2d5fb-9f92-4a35-8264-48353a33895a\") " pod="openstack/dnsmasq-dns-698758b865-gqltz" Mar 20 08:44:02 crc kubenswrapper[4903]: I0320 08:44:02.534964 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6de2d5fb-9f92-4a35-8264-48353a33895a-dns-svc\") pod \"dnsmasq-dns-698758b865-gqltz\" (UID: \"6de2d5fb-9f92-4a35-8264-48353a33895a\") " pod="openstack/dnsmasq-dns-698758b865-gqltz" Mar 20 08:44:02 crc kubenswrapper[4903]: I0320 08:44:02.534995 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6de2d5fb-9f92-4a35-8264-48353a33895a-config\") pod \"dnsmasq-dns-698758b865-gqltz\" (UID: \"6de2d5fb-9f92-4a35-8264-48353a33895a\") " pod="openstack/dnsmasq-dns-698758b865-gqltz" Mar 20 08:44:02 crc kubenswrapper[4903]: I0320 08:44:02.535100 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6de2d5fb-9f92-4a35-8264-48353a33895a-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-gqltz\" (UID: \"6de2d5fb-9f92-4a35-8264-48353a33895a\") " pod="openstack/dnsmasq-dns-698758b865-gqltz" Mar 20 08:44:02 crc kubenswrapper[4903]: I0320 08:44:02.535185 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6f4q8\" (UniqueName: \"kubernetes.io/projected/6de2d5fb-9f92-4a35-8264-48353a33895a-kube-api-access-6f4q8\") pod \"dnsmasq-dns-698758b865-gqltz\" (UID: \"6de2d5fb-9f92-4a35-8264-48353a33895a\") " pod="openstack/dnsmasq-dns-698758b865-gqltz" Mar 20 08:44:02 crc kubenswrapper[4903]: I0320 08:44:02.537320 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6de2d5fb-9f92-4a35-8264-48353a33895a-config\") pod \"dnsmasq-dns-698758b865-gqltz\" (UID: \"6de2d5fb-9f92-4a35-8264-48353a33895a\") " pod="openstack/dnsmasq-dns-698758b865-gqltz" Mar 20 08:44:02 crc kubenswrapper[4903]: I0320 08:44:02.537374 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6de2d5fb-9f92-4a35-8264-48353a33895a-dns-svc\") pod \"dnsmasq-dns-698758b865-gqltz\" (UID: \"6de2d5fb-9f92-4a35-8264-48353a33895a\") " pod="openstack/dnsmasq-dns-698758b865-gqltz" Mar 20 08:44:02 crc kubenswrapper[4903]: I0320 08:44:02.537451 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6de2d5fb-9f92-4a35-8264-48353a33895a-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-gqltz\" (UID: \"6de2d5fb-9f92-4a35-8264-48353a33895a\") " pod="openstack/dnsmasq-dns-698758b865-gqltz" Mar 20 08:44:02 crc kubenswrapper[4903]: I0320 08:44:02.537563 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6de2d5fb-9f92-4a35-8264-48353a33895a-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-gqltz\" (UID: \"6de2d5fb-9f92-4a35-8264-48353a33895a\") " pod="openstack/dnsmasq-dns-698758b865-gqltz" Mar 20 08:44:02 crc kubenswrapper[4903]: I0320 08:44:02.555259 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6f4q8\" (UniqueName: \"kubernetes.io/projected/6de2d5fb-9f92-4a35-8264-48353a33895a-kube-api-access-6f4q8\") pod \"dnsmasq-dns-698758b865-gqltz\" (UID: \"6de2d5fb-9f92-4a35-8264-48353a33895a\") " pod="openstack/dnsmasq-dns-698758b865-gqltz" Mar 20 08:44:02 crc kubenswrapper[4903]: I0320 08:44:02.561114 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7fd796d7df-rjxxk" podUID="9e171236-e1e4-41f8-bf89-d162cab4d02b" containerName="dnsmasq-dns" containerID="cri-o://0bbfa031f2f30749f12e2322fef6843f9e23c528e4d70527fcb872d46225042f" gracePeriod=10 Mar 20 08:44:02 crc kubenswrapper[4903]: I0320 08:44:02.625267 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-gqltz" Mar 20 08:44:03 crc kubenswrapper[4903]: I0320 08:44:03.110446 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-gqltz"] Mar 20 08:44:03 crc kubenswrapper[4903]: W0320 08:44:03.114960 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6de2d5fb_9f92_4a35_8264_48353a33895a.slice/crio-0c99ff6c0a12334d6ed71d2c74055d0ef8ff18f223b4e69bfcb7136ff560aeaa WatchSource:0}: Error finding container 0c99ff6c0a12334d6ed71d2c74055d0ef8ff18f223b4e69bfcb7136ff560aeaa: Status 404 returned error can't find the container with id 0c99ff6c0a12334d6ed71d2c74055d0ef8ff18f223b4e69bfcb7136ff560aeaa Mar 20 08:44:03 crc kubenswrapper[4903]: I0320 08:44:03.390961 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Mar 20 08:44:03 crc kubenswrapper[4903]: I0320 08:44:03.399771 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Mar 20 08:44:03 crc kubenswrapper[4903]: I0320 08:44:03.403135 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-rl9z4" Mar 20 08:44:03 crc kubenswrapper[4903]: I0320 08:44:03.403383 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Mar 20 08:44:03 crc kubenswrapper[4903]: I0320 08:44:03.403547 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Mar 20 08:44:03 crc kubenswrapper[4903]: I0320 08:44:03.404207 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Mar 20 08:44:03 crc kubenswrapper[4903]: I0320 08:44:03.420822 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Mar 20 08:44:03 crc kubenswrapper[4903]: I0320 08:44:03.557614 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/ccedd84e-d0d0-40b8-812c-3a57b41aee98-cache\") pod \"swift-storage-0\" (UID: \"ccedd84e-d0d0-40b8-812c-3a57b41aee98\") " pod="openstack/swift-storage-0" Mar 20 08:44:03 crc kubenswrapper[4903]: I0320 08:44:03.557718 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/ccedd84e-d0d0-40b8-812c-3a57b41aee98-lock\") pod \"swift-storage-0\" (UID: \"ccedd84e-d0d0-40b8-812c-3a57b41aee98\") " pod="openstack/swift-storage-0" Mar 20 08:44:03 crc kubenswrapper[4903]: I0320 08:44:03.557834 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"swift-storage-0\" (UID: \"ccedd84e-d0d0-40b8-812c-3a57b41aee98\") " pod="openstack/swift-storage-0" Mar 20 08:44:03 crc kubenswrapper[4903]: I0320 08:44:03.558215 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tvm8x\" (UniqueName: \"kubernetes.io/projected/ccedd84e-d0d0-40b8-812c-3a57b41aee98-kube-api-access-tvm8x\") pod \"swift-storage-0\" (UID: \"ccedd84e-d0d0-40b8-812c-3a57b41aee98\") " pod="openstack/swift-storage-0" Mar 20 08:44:03 crc kubenswrapper[4903]: I0320 08:44:03.558306 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"etc-swift\" (UniqueName: \"kubernetes.io/projected/ccedd84e-d0d0-40b8-812c-3a57b41aee98-etc-swift\") pod \"swift-storage-0\" (UID: \"ccedd84e-d0d0-40b8-812c-3a57b41aee98\") " pod="openstack/swift-storage-0" Mar 20 08:44:03 crc kubenswrapper[4903]: I0320 08:44:03.559142 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ccedd84e-d0d0-40b8-812c-3a57b41aee98-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"ccedd84e-d0d0-40b8-812c-3a57b41aee98\") " pod="openstack/swift-storage-0" Mar 20 08:44:03 crc kubenswrapper[4903]: I0320 08:44:03.570303 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-gqltz" event={"ID":"6de2d5fb-9f92-4a35-8264-48353a33895a","Type":"ContainerStarted","Data":"0c99ff6c0a12334d6ed71d2c74055d0ef8ff18f223b4e69bfcb7136ff560aeaa"} Mar 20 08:44:03 crc kubenswrapper[4903]: I0320 08:44:03.661304 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ccedd84e-d0d0-40b8-812c-3a57b41aee98-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"ccedd84e-d0d0-40b8-812c-3a57b41aee98\") " pod="openstack/swift-storage-0" Mar 20 08:44:03 crc kubenswrapper[4903]: I0320 08:44:03.661505 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/ccedd84e-d0d0-40b8-812c-3a57b41aee98-cache\") pod \"swift-storage-0\" (UID: \"ccedd84e-d0d0-40b8-812c-3a57b41aee98\") " pod="openstack/swift-storage-0" Mar 20 08:44:03 crc kubenswrapper[4903]: I0320 08:44:03.661570 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/ccedd84e-d0d0-40b8-812c-3a57b41aee98-lock\") pod \"swift-storage-0\" (UID: \"ccedd84e-d0d0-40b8-812c-3a57b41aee98\") " pod="openstack/swift-storage-0" Mar 20 08:44:03 crc kubenswrapper[4903]: I0320 08:44:03.661655 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"swift-storage-0\" (UID: \"ccedd84e-d0d0-40b8-812c-3a57b41aee98\") " pod="openstack/swift-storage-0" Mar 20 08:44:03 crc kubenswrapper[4903]: I0320 08:44:03.661691 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tvm8x\" (UniqueName: \"kubernetes.io/projected/ccedd84e-d0d0-40b8-812c-3a57b41aee98-kube-api-access-tvm8x\") pod \"swift-storage-0\" (UID: \"ccedd84e-d0d0-40b8-812c-3a57b41aee98\") " pod="openstack/swift-storage-0" Mar 20 08:44:03 crc kubenswrapper[4903]: I0320 08:44:03.661739 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/ccedd84e-d0d0-40b8-812c-3a57b41aee98-etc-swift\") pod \"swift-storage-0\" (UID: \"ccedd84e-d0d0-40b8-812c-3a57b41aee98\") " pod="openstack/swift-storage-0" Mar 20 08:44:03 crc kubenswrapper[4903]: E0320 08:44:03.662072 4903 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 20 08:44:03 crc kubenswrapper[4903]: E0320 08:44:03.662105 4903 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 20 08:44:03 crc kubenswrapper[4903]: E0320 08:44:03.662201 4903 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/ccedd84e-d0d0-40b8-812c-3a57b41aee98-etc-swift podName:ccedd84e-d0d0-40b8-812c-3a57b41aee98 nodeName:}" failed. No retries permitted until 2026-03-20 08:44:04.162164495 +0000 UTC m=+1269.379064850 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/ccedd84e-d0d0-40b8-812c-3a57b41aee98-etc-swift") pod "swift-storage-0" (UID: "ccedd84e-d0d0-40b8-812c-3a57b41aee98") : configmap "swift-ring-files" not found Mar 20 08:44:03 crc kubenswrapper[4903]: I0320 08:44:03.662324 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/ccedd84e-d0d0-40b8-812c-3a57b41aee98-cache\") pod \"swift-storage-0\" (UID: \"ccedd84e-d0d0-40b8-812c-3a57b41aee98\") " pod="openstack/swift-storage-0" Mar 20 08:44:03 crc kubenswrapper[4903]: I0320 08:44:03.662350 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/ccedd84e-d0d0-40b8-812c-3a57b41aee98-lock\") pod \"swift-storage-0\" (UID: \"ccedd84e-d0d0-40b8-812c-3a57b41aee98\") " pod="openstack/swift-storage-0" Mar 20 08:44:03 crc kubenswrapper[4903]: I0320 08:44:03.662363 4903 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"swift-storage-0\" (UID: \"ccedd84e-d0d0-40b8-812c-3a57b41aee98\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/swift-storage-0" Mar 20 08:44:03 crc kubenswrapper[4903]: I0320 08:44:03.670966 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ccedd84e-d0d0-40b8-812c-3a57b41aee98-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"ccedd84e-d0d0-40b8-812c-3a57b41aee98\") " pod="openstack/swift-storage-0" Mar 20 08:44:03 crc kubenswrapper[4903]: I0320 08:44:03.686275 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tvm8x\" (UniqueName: \"kubernetes.io/projected/ccedd84e-d0d0-40b8-812c-3a57b41aee98-kube-api-access-tvm8x\") pod \"swift-storage-0\" (UID: \"ccedd84e-d0d0-40b8-812c-3a57b41aee98\") " pod="openstack/swift-storage-0" Mar 20 08:44:03 crc kubenswrapper[4903]: I0320 08:44:03.711248 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"swift-storage-0\" (UID: \"ccedd84e-d0d0-40b8-812c-3a57b41aee98\") " pod="openstack/swift-storage-0" Mar 20 08:44:04 crc kubenswrapper[4903]: I0320 08:44:04.170754 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/ccedd84e-d0d0-40b8-812c-3a57b41aee98-etc-swift\") pod \"swift-storage-0\" (UID: \"ccedd84e-d0d0-40b8-812c-3a57b41aee98\") " pod="openstack/swift-storage-0" Mar 20 08:44:04 crc kubenswrapper[4903]: E0320 08:44:04.170944 4903 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 20 08:44:04 crc kubenswrapper[4903]: E0320 08:44:04.171238 4903 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 20 08:44:04 crc kubenswrapper[4903]: E0320 08:44:04.171320 4903 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/ccedd84e-d0d0-40b8-812c-3a57b41aee98-etc-swift podName:ccedd84e-d0d0-40b8-812c-3a57b41aee98 nodeName:}" failed. No retries permitted until 2026-03-20 08:44:05.17129397 +0000 UTC m=+1270.388194295 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/ccedd84e-d0d0-40b8-812c-3a57b41aee98-etc-swift") pod "swift-storage-0" (UID: "ccedd84e-d0d0-40b8-812c-3a57b41aee98") : configmap "swift-ring-files" not found Mar 20 08:44:05 crc kubenswrapper[4903]: I0320 08:44:05.111505 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Mar 20 08:44:05 crc kubenswrapper[4903]: I0320 08:44:05.188492 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/ccedd84e-d0d0-40b8-812c-3a57b41aee98-etc-swift\") pod \"swift-storage-0\" (UID: \"ccedd84e-d0d0-40b8-812c-3a57b41aee98\") " pod="openstack/swift-storage-0" Mar 20 08:44:05 crc kubenswrapper[4903]: E0320 08:44:05.188844 4903 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 20 08:44:05 crc kubenswrapper[4903]: E0320 08:44:05.188888 4903 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 20 08:44:05 crc kubenswrapper[4903]: E0320 08:44:05.188980 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ccedd84e-d0d0-40b8-812c-3a57b41aee98-etc-swift podName:ccedd84e-d0d0-40b8-812c-3a57b41aee98 nodeName:}" failed. No retries permitted until 2026-03-20 08:44:07.188944745 +0000 UTC m=+1272.405845100 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/ccedd84e-d0d0-40b8-812c-3a57b41aee98-etc-swift") pod "swift-storage-0" (UID: "ccedd84e-d0d0-40b8-812c-3a57b41aee98") : configmap "swift-ring-files" not found Mar 20 08:44:05 crc kubenswrapper[4903]: I0320 08:44:05.226804 4903 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/openstack-galera-0" podUID="96a68183-d440-4f89-887d-d2441d00c8e4" containerName="galera" probeResult="failure" output=< Mar 20 08:44:05 crc kubenswrapper[4903]: wsrep_local_state_comment (Joined) differs from Synced Mar 20 08:44:05 crc kubenswrapper[4903]: > Mar 20 08:44:05 crc kubenswrapper[4903]: I0320 08:44:05.602848 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"06e43a58-7c5f-49ac-a1f4-f2eddfd28c6b","Type":"ContainerStarted","Data":"5704e6c6ea3db74005a8e1d1aeb869f6812338abc7af8b7e741fc45d5338477c"} Mar 20 08:44:05 crc kubenswrapper[4903]: I0320 08:44:05.609911 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-gqltz" event={"ID":"6de2d5fb-9f92-4a35-8264-48353a33895a","Type":"ContainerStarted","Data":"ee22835e173d77a00221bda8da6f1d27eacb21f1253f3657c3ecf9a6739e4cb1"} Mar 20 08:44:05 crc kubenswrapper[4903]: I0320 08:44:05.628364 4903 generic.go:334] "Generic (PLEG): container finished" podID="9e171236-e1e4-41f8-bf89-d162cab4d02b" containerID="0bbfa031f2f30749f12e2322fef6843f9e23c528e4d70527fcb872d46225042f" exitCode=0 Mar 20 08:44:05 crc kubenswrapper[4903]: I0320 08:44:05.628411 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-rjxxk" event={"ID":"9e171236-e1e4-41f8-bf89-d162cab4d02b","Type":"ContainerDied","Data":"0bbfa031f2f30749f12e2322fef6843f9e23c528e4d70527fcb872d46225042f"} Mar 20 08:44:06 crc kubenswrapper[4903]: I0320 08:44:06.641189 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"06e43a58-7c5f-49ac-a1f4-f2eddfd28c6b","Type":"ContainerStarted","Data":"7125bc754a8c0e626fcd2fe281119d9040b278b420956ad513958b106967fd43"} Mar 20 08:44:06 crc kubenswrapper[4903]: I0320 08:44:06.642219 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Mar 20 08:44:06 crc kubenswrapper[4903]: I0320 08:44:06.645944 4903 generic.go:334] "Generic (PLEG): container finished" podID="1dacbc55-264e-4175-a03b-f1f4d135ef5e" containerID="3f80ef0529df9f4da7f79cb6a6b5e235388ced9a4a663185ccd21fd5f820b009" exitCode=0 Mar 20 08:44:06 crc kubenswrapper[4903]: I0320 08:44:06.646060 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566604-q559d" event={"ID":"1dacbc55-264e-4175-a03b-f1f4d135ef5e","Type":"ContainerDied","Data":"3f80ef0529df9f4da7f79cb6a6b5e235388ced9a4a663185ccd21fd5f820b009"} Mar 20 08:44:06 crc kubenswrapper[4903]: I0320 08:44:06.647883 4903 generic.go:334] "Generic (PLEG): container finished" podID="6de2d5fb-9f92-4a35-8264-48353a33895a" containerID="ee22835e173d77a00221bda8da6f1d27eacb21f1253f3657c3ecf9a6739e4cb1" exitCode=0 Mar 20 08:44:06 crc kubenswrapper[4903]: I0320 08:44:06.647925 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-gqltz" event={"ID":"6de2d5fb-9f92-4a35-8264-48353a33895a","Type":"ContainerDied","Data":"ee22835e173d77a00221bda8da6f1d27eacb21f1253f3657c3ecf9a6739e4cb1"} Mar 20 08:44:06 crc kubenswrapper[4903]: I0320 08:44:06.669906 4903 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=8.406779876 podStartE2EDuration="9.669872722s" podCreationTimestamp="2026-03-20 08:43:57 +0000 UTC" firstStartedPulling="2026-03-20 08:43:58.129462647 +0000 UTC m=+1263.346362962" lastFinishedPulling="2026-03-20 08:43:59.392555453 +0000 UTC m=+1264.609455808" observedRunningTime="2026-03-20 08:44:06.669275357 +0000 UTC m=+1271.886175682" watchObservedRunningTime="2026-03-20 08:44:06.669872722 +0000 UTC m=+1271.886773037" Mar 20 08:44:06 crc kubenswrapper[4903]: I0320 08:44:06.857445 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-rjxxk" Mar 20 08:44:07 crc kubenswrapper[4903]: I0320 08:44:07.040927 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4jcqb\" (UniqueName: \"kubernetes.io/projected/9e171236-e1e4-41f8-bf89-d162cab4d02b-kube-api-access-4jcqb\") pod \"9e171236-e1e4-41f8-bf89-d162cab4d02b\" (UID: \"9e171236-e1e4-41f8-bf89-d162cab4d02b\") " Mar 20 08:44:07 crc kubenswrapper[4903]: I0320 08:44:07.041488 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9e171236-e1e4-41f8-bf89-d162cab4d02b-config\") pod \"9e171236-e1e4-41f8-bf89-d162cab4d02b\" (UID: \"9e171236-e1e4-41f8-bf89-d162cab4d02b\") " Mar 20 08:44:07 crc kubenswrapper[4903]: I0320 08:44:07.041561 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9e171236-e1e4-41f8-bf89-d162cab4d02b-ovsdbserver-nb\") pod \"9e171236-e1e4-41f8-bf89-d162cab4d02b\" (UID: \"9e171236-e1e4-41f8-bf89-d162cab4d02b\") " Mar 20 08:44:07 crc kubenswrapper[4903]: I0320 08:44:07.041840 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9e171236-e1e4-41f8-bf89-d162cab4d02b-dns-svc\") pod \"9e171236-e1e4-41f8-bf89-d162cab4d02b\" (UID: \"9e171236-e1e4-41f8-bf89-d162cab4d02b\") " Mar 20 08:44:07 crc kubenswrapper[4903]: I0320 08:44:07.053392 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e171236-e1e4-41f8-bf89-d162cab4d02b-kube-api-access-4jcqb" (OuterVolumeSpecName: "kube-api-access-4jcqb") pod "9e171236-e1e4-41f8-bf89-d162cab4d02b" (UID: "9e171236-e1e4-41f8-bf89-d162cab4d02b"). InnerVolumeSpecName "kube-api-access-4jcqb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:44:07 crc kubenswrapper[4903]: I0320 08:44:07.095560 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9e171236-e1e4-41f8-bf89-d162cab4d02b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9e171236-e1e4-41f8-bf89-d162cab4d02b" (UID: "9e171236-e1e4-41f8-bf89-d162cab4d02b"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:44:07 crc kubenswrapper[4903]: I0320 08:44:07.096376 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9e171236-e1e4-41f8-bf89-d162cab4d02b-config" (OuterVolumeSpecName: "config") pod "9e171236-e1e4-41f8-bf89-d162cab4d02b" (UID: "9e171236-e1e4-41f8-bf89-d162cab4d02b"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:44:07 crc kubenswrapper[4903]: I0320 08:44:07.115027 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9e171236-e1e4-41f8-bf89-d162cab4d02b-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "9e171236-e1e4-41f8-bf89-d162cab4d02b" (UID: "9e171236-e1e4-41f8-bf89-d162cab4d02b"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:44:07 crc kubenswrapper[4903]: I0320 08:44:07.143924 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4jcqb\" (UniqueName: \"kubernetes.io/projected/9e171236-e1e4-41f8-bf89-d162cab4d02b-kube-api-access-4jcqb\") on node \"crc\" DevicePath \"\"" Mar 20 08:44:07 crc kubenswrapper[4903]: I0320 08:44:07.144751 4903 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9e171236-e1e4-41f8-bf89-d162cab4d02b-config\") on node \"crc\" DevicePath \"\"" Mar 20 08:44:07 crc kubenswrapper[4903]: I0320 08:44:07.144861 4903 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9e171236-e1e4-41f8-bf89-d162cab4d02b-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 20 08:44:07 crc kubenswrapper[4903]: I0320 08:44:07.144922 4903 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9e171236-e1e4-41f8-bf89-d162cab4d02b-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 08:44:07 crc kubenswrapper[4903]: I0320 08:44:07.166215 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-25k6h"] Mar 20 08:44:07 crc kubenswrapper[4903]: E0320 08:44:07.166632 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e171236-e1e4-41f8-bf89-d162cab4d02b" containerName="init" Mar 20 08:44:07 crc kubenswrapper[4903]: I0320 08:44:07.166658 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e171236-e1e4-41f8-bf89-d162cab4d02b" containerName="init" Mar 20 08:44:07 crc kubenswrapper[4903]: E0320 08:44:07.166703 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e171236-e1e4-41f8-bf89-d162cab4d02b" containerName="dnsmasq-dns" Mar 20 08:44:07 crc kubenswrapper[4903]: I0320 08:44:07.166712 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e171236-e1e4-41f8-bf89-d162cab4d02b" containerName="dnsmasq-dns" Mar 20 08:44:07 crc kubenswrapper[4903]: I0320 08:44:07.166972 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="9e171236-e1e4-41f8-bf89-d162cab4d02b" containerName="dnsmasq-dns" Mar 20 08:44:07 crc kubenswrapper[4903]: I0320 08:44:07.167676 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-25k6h" Mar 20 08:44:07 crc kubenswrapper[4903]: I0320 08:44:07.171874 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Mar 20 08:44:07 crc kubenswrapper[4903]: I0320 08:44:07.171981 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Mar 20 08:44:07 crc kubenswrapper[4903]: I0320 08:44:07.172003 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Mar 20 08:44:07 crc kubenswrapper[4903]: I0320 08:44:07.186553 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-25k6h"] Mar 20 08:44:07 crc kubenswrapper[4903]: I0320 08:44:07.246738 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/1642398c-5346-421f-86c2-baec0001304e-ring-data-devices\") pod \"swift-ring-rebalance-25k6h\" (UID: \"1642398c-5346-421f-86c2-baec0001304e\") " pod="openstack/swift-ring-rebalance-25k6h" Mar 20 08:44:07 crc kubenswrapper[4903]: I0320 08:44:07.246794 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/1642398c-5346-421f-86c2-baec0001304e-etc-swift\") pod \"swift-ring-rebalance-25k6h\" (UID: \"1642398c-5346-421f-86c2-baec0001304e\") " pod="openstack/swift-ring-rebalance-25k6h" Mar 20 08:44:07 crc kubenswrapper[4903]: I0320 08:44:07.246818 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1642398c-5346-421f-86c2-baec0001304e-scripts\") pod \"swift-ring-rebalance-25k6h\" (UID: \"1642398c-5346-421f-86c2-baec0001304e\") " pod="openstack/swift-ring-rebalance-25k6h" Mar 20 08:44:07 crc kubenswrapper[4903]: I0320 08:44:07.246877 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/1642398c-5346-421f-86c2-baec0001304e-dispersionconf\") pod \"swift-ring-rebalance-25k6h\" (UID: \"1642398c-5346-421f-86c2-baec0001304e\") " pod="openstack/swift-ring-rebalance-25k6h" Mar 20 08:44:07 crc kubenswrapper[4903]: I0320 08:44:07.246912 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/1642398c-5346-421f-86c2-baec0001304e-swiftconf\") pod \"swift-ring-rebalance-25k6h\" (UID: \"1642398c-5346-421f-86c2-baec0001304e\") " pod="openstack/swift-ring-rebalance-25k6h" Mar 20 08:44:07 crc kubenswrapper[4903]: I0320 08:44:07.247013 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/ccedd84e-d0d0-40b8-812c-3a57b41aee98-etc-swift\") pod \"swift-storage-0\" (UID: \"ccedd84e-d0d0-40b8-812c-3a57b41aee98\") " pod="openstack/swift-storage-0" Mar 20 08:44:07 crc kubenswrapper[4903]: I0320 08:44:07.247067 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1642398c-5346-421f-86c2-baec0001304e-combined-ca-bundle\") pod \"swift-ring-rebalance-25k6h\" (UID: \"1642398c-5346-421f-86c2-baec0001304e\") " pod="openstack/swift-ring-rebalance-25k6h" Mar 20 08:44:07 crc kubenswrapper[4903]: I0320 08:44:07.247118 4903 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nlqw5\" (UniqueName: \"kubernetes.io/projected/1642398c-5346-421f-86c2-baec0001304e-kube-api-access-nlqw5\") pod \"swift-ring-rebalance-25k6h\" (UID: \"1642398c-5346-421f-86c2-baec0001304e\") " pod="openstack/swift-ring-rebalance-25k6h" Mar 20 08:44:07 crc kubenswrapper[4903]: E0320 08:44:07.247272 4903 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 20 08:44:07 crc kubenswrapper[4903]: E0320 08:44:07.247293 4903 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 20 08:44:07 crc kubenswrapper[4903]: E0320 08:44:07.247346 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ccedd84e-d0d0-40b8-812c-3a57b41aee98-etc-swift podName:ccedd84e-d0d0-40b8-812c-3a57b41aee98 nodeName:}" failed. No retries permitted until 2026-03-20 08:44:11.247324483 +0000 UTC m=+1276.464224798 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/ccedd84e-d0d0-40b8-812c-3a57b41aee98-etc-swift") pod "swift-storage-0" (UID: "ccedd84e-d0d0-40b8-812c-3a57b41aee98") : configmap "swift-ring-files" not found Mar 20 08:44:07 crc kubenswrapper[4903]: I0320 08:44:07.350825 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1642398c-5346-421f-86c2-baec0001304e-combined-ca-bundle\") pod \"swift-ring-rebalance-25k6h\" (UID: \"1642398c-5346-421f-86c2-baec0001304e\") " pod="openstack/swift-ring-rebalance-25k6h" Mar 20 08:44:07 crc kubenswrapper[4903]: I0320 08:44:07.350934 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nlqw5\" (UniqueName: \"kubernetes.io/projected/1642398c-5346-421f-86c2-baec0001304e-kube-api-access-nlqw5\") pod \"swift-ring-rebalance-25k6h\" (UID: \"1642398c-5346-421f-86c2-baec0001304e\") " pod="openstack/swift-ring-rebalance-25k6h" Mar 20 08:44:07 crc kubenswrapper[4903]: I0320 08:44:07.350995 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/1642398c-5346-421f-86c2-baec0001304e-etc-swift\") pod \"swift-ring-rebalance-25k6h\" (UID: \"1642398c-5346-421f-86c2-baec0001304e\") " pod="openstack/swift-ring-rebalance-25k6h" Mar 20 08:44:07 crc kubenswrapper[4903]: I0320 08:44:07.351025 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/1642398c-5346-421f-86c2-baec0001304e-ring-data-devices\") pod \"swift-ring-rebalance-25k6h\" (UID: \"1642398c-5346-421f-86c2-baec0001304e\") " pod="openstack/swift-ring-rebalance-25k6h" Mar 20 08:44:07 crc kubenswrapper[4903]: I0320 08:44:07.351123 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1642398c-5346-421f-86c2-baec0001304e-scripts\") pod \"swift-ring-rebalance-25k6h\" (UID: \"1642398c-5346-421f-86c2-baec0001304e\") " pod="openstack/swift-ring-rebalance-25k6h" Mar 20 08:44:07 crc kubenswrapper[4903]: I0320 08:44:07.351303 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/1642398c-5346-421f-86c2-baec0001304e-dispersionconf\") pod 
\"swift-ring-rebalance-25k6h\" (UID: \"1642398c-5346-421f-86c2-baec0001304e\") " pod="openstack/swift-ring-rebalance-25k6h" Mar 20 08:44:07 crc kubenswrapper[4903]: I0320 08:44:07.351840 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/1642398c-5346-421f-86c2-baec0001304e-swiftconf\") pod \"swift-ring-rebalance-25k6h\" (UID: \"1642398c-5346-421f-86c2-baec0001304e\") " pod="openstack/swift-ring-rebalance-25k6h" Mar 20 08:44:07 crc kubenswrapper[4903]: I0320 08:44:07.351984 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/1642398c-5346-421f-86c2-baec0001304e-etc-swift\") pod \"swift-ring-rebalance-25k6h\" (UID: \"1642398c-5346-421f-86c2-baec0001304e\") " pod="openstack/swift-ring-rebalance-25k6h" Mar 20 08:44:07 crc kubenswrapper[4903]: I0320 08:44:07.352555 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1642398c-5346-421f-86c2-baec0001304e-scripts\") pod \"swift-ring-rebalance-25k6h\" (UID: \"1642398c-5346-421f-86c2-baec0001304e\") " pod="openstack/swift-ring-rebalance-25k6h" Mar 20 08:44:07 crc kubenswrapper[4903]: I0320 08:44:07.352583 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/1642398c-5346-421f-86c2-baec0001304e-ring-data-devices\") pod \"swift-ring-rebalance-25k6h\" (UID: \"1642398c-5346-421f-86c2-baec0001304e\") " pod="openstack/swift-ring-rebalance-25k6h" Mar 20 08:44:07 crc kubenswrapper[4903]: I0320 08:44:07.356697 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/1642398c-5346-421f-86c2-baec0001304e-dispersionconf\") pod \"swift-ring-rebalance-25k6h\" (UID: \"1642398c-5346-421f-86c2-baec0001304e\") " pod="openstack/swift-ring-rebalance-25k6h" Mar 20 08:44:07 crc kubenswrapper[4903]: I0320 08:44:07.357509 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1642398c-5346-421f-86c2-baec0001304e-combined-ca-bundle\") pod \"swift-ring-rebalance-25k6h\" (UID: \"1642398c-5346-421f-86c2-baec0001304e\") " pod="openstack/swift-ring-rebalance-25k6h" Mar 20 08:44:07 crc kubenswrapper[4903]: I0320 08:44:07.359274 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/1642398c-5346-421f-86c2-baec0001304e-swiftconf\") pod \"swift-ring-rebalance-25k6h\" (UID: \"1642398c-5346-421f-86c2-baec0001304e\") " pod="openstack/swift-ring-rebalance-25k6h" Mar 20 08:44:07 crc kubenswrapper[4903]: I0320 08:44:07.377148 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nlqw5\" (UniqueName: \"kubernetes.io/projected/1642398c-5346-421f-86c2-baec0001304e-kube-api-access-nlqw5\") pod \"swift-ring-rebalance-25k6h\" (UID: \"1642398c-5346-421f-86c2-baec0001304e\") " pod="openstack/swift-ring-rebalance-25k6h" Mar 20 08:44:07 crc kubenswrapper[4903]: I0320 08:44:07.399184 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-86db49b7ff-8ckgn" Mar 20 08:44:07 crc kubenswrapper[4903]: I0320 08:44:07.501894 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-25k6h" Mar 20 08:44:07 crc kubenswrapper[4903]: I0320 08:44:07.661126 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-gqltz" event={"ID":"6de2d5fb-9f92-4a35-8264-48353a33895a","Type":"ContainerStarted","Data":"61c80e20e2f5f5e7a08057fdd53cd07ee7d988b7e195a4b93be668970fa183e1"} Mar 20 08:44:07 crc kubenswrapper[4903]: I0320 08:44:07.662111 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-698758b865-gqltz" Mar 20 08:44:07 crc kubenswrapper[4903]: I0320 08:44:07.664876 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-rjxxk" Mar 20 08:44:07 crc kubenswrapper[4903]: I0320 08:44:07.665586 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-rjxxk" event={"ID":"9e171236-e1e4-41f8-bf89-d162cab4d02b","Type":"ContainerDied","Data":"bdff9c190cc14ae77a67d8cf3999e051d749ceb99270dade70e0ae12715f410e"} Mar 20 08:44:07 crc kubenswrapper[4903]: I0320 08:44:07.665625 4903 scope.go:117] "RemoveContainer" containerID="0bbfa031f2f30749f12e2322fef6843f9e23c528e4d70527fcb872d46225042f" Mar 20 08:44:07 crc kubenswrapper[4903]: I0320 08:44:07.686355 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-698758b865-gqltz" podStartSLOduration=5.686339318 podStartE2EDuration="5.686339318s" podCreationTimestamp="2026-03-20 08:44:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:44:07.683670203 +0000 UTC m=+1272.900570538" watchObservedRunningTime="2026-03-20 08:44:07.686339318 +0000 UTC m=+1272.903239633" Mar 20 08:44:07 crc kubenswrapper[4903]: I0320 08:44:07.756279 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-rjxxk"] Mar 20 08:44:07 crc kubenswrapper[4903]: I0320 08:44:07.777617 4903 scope.go:117] "RemoveContainer" containerID="1c7859a11f841171a5152e58a2350792815cc733a25210225acfc3d6af166932" Mar 20 08:44:07 crc kubenswrapper[4903]: I0320 08:44:07.780997 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-rjxxk"] Mar 20 08:44:07 crc kubenswrapper[4903]: I0320 08:44:07.965154 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Mar 20 08:44:07 crc kubenswrapper[4903]: I0320 08:44:07.983381 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-25k6h"] Mar 20 08:44:08 crc kubenswrapper[4903]: W0320 08:44:08.002047 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1642398c_5346_421f_86c2_baec0001304e.slice/crio-ff71b3c6c64e1195b9ccc14c1d44fb626cf59538ad3fe49ad52cb508f5e726cd WatchSource:0}: Error finding container ff71b3c6c64e1195b9ccc14c1d44fb626cf59538ad3fe49ad52cb508f5e726cd: Status 404 returned error can't find the container with id ff71b3c6c64e1195b9ccc14c1d44fb626cf59538ad3fe49ad52cb508f5e726cd Mar 20 08:44:08 crc kubenswrapper[4903]: I0320 08:44:08.077256 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-r7v7q"] Mar 20 08:44:08 crc kubenswrapper[4903]: I0320 08:44:08.081854 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-r7v7q" Mar 20 08:44:08 crc kubenswrapper[4903]: I0320 08:44:08.084641 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Mar 20 08:44:08 crc kubenswrapper[4903]: I0320 08:44:08.088842 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-r7v7q"] Mar 20 08:44:08 crc kubenswrapper[4903]: I0320 08:44:08.102048 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566604-q559d" Mar 20 08:44:08 crc kubenswrapper[4903]: I0320 08:44:08.269677 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb4qg\" (UniqueName: \"kubernetes.io/projected/1dacbc55-264e-4175-a03b-f1f4d135ef5e-kube-api-access-sb4qg\") pod \"1dacbc55-264e-4175-a03b-f1f4d135ef5e\" (UID: \"1dacbc55-264e-4175-a03b-f1f4d135ef5e\") " Mar 20 08:44:08 crc kubenswrapper[4903]: I0320 08:44:08.269998 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lv5wx\" (UniqueName: \"kubernetes.io/projected/c2137443-2f65-4c2b-87d2-8886f55a7ce7-kube-api-access-lv5wx\") pod \"root-account-create-update-r7v7q\" (UID: \"c2137443-2f65-4c2b-87d2-8886f55a7ce7\") " pod="openstack/root-account-create-update-r7v7q" Mar 20 08:44:08 crc kubenswrapper[4903]: I0320 08:44:08.270072 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c2137443-2f65-4c2b-87d2-8886f55a7ce7-operator-scripts\") pod \"root-account-create-update-r7v7q\" (UID: \"c2137443-2f65-4c2b-87d2-8886f55a7ce7\") " pod="openstack/root-account-create-update-r7v7q" Mar 20 08:44:08 crc kubenswrapper[4903]: I0320 08:44:08.283448 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1dacbc55-264e-4175-a03b-f1f4d135ef5e-kube-api-access-sb4qg" (OuterVolumeSpecName: "kube-api-access-sb4qg") pod "1dacbc55-264e-4175-a03b-f1f4d135ef5e" (UID: "1dacbc55-264e-4175-a03b-f1f4d135ef5e"). InnerVolumeSpecName "kube-api-access-sb4qg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:44:08 crc kubenswrapper[4903]: I0320 08:44:08.371844 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lv5wx\" (UniqueName: \"kubernetes.io/projected/c2137443-2f65-4c2b-87d2-8886f55a7ce7-kube-api-access-lv5wx\") pod \"root-account-create-update-r7v7q\" (UID: \"c2137443-2f65-4c2b-87d2-8886f55a7ce7\") " pod="openstack/root-account-create-update-r7v7q" Mar 20 08:44:08 crc kubenswrapper[4903]: I0320 08:44:08.371963 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c2137443-2f65-4c2b-87d2-8886f55a7ce7-operator-scripts\") pod \"root-account-create-update-r7v7q\" (UID: \"c2137443-2f65-4c2b-87d2-8886f55a7ce7\") " pod="openstack/root-account-create-update-r7v7q" Mar 20 08:44:08 crc kubenswrapper[4903]: I0320 08:44:08.372074 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb4qg\" (UniqueName: \"kubernetes.io/projected/1dacbc55-264e-4175-a03b-f1f4d135ef5e-kube-api-access-sb4qg\") on node \"crc\" DevicePath \"\"" Mar 20 08:44:08 crc kubenswrapper[4903]: I0320 08:44:08.372707 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c2137443-2f65-4c2b-87d2-8886f55a7ce7-operator-scripts\") pod \"root-account-create-update-r7v7q\" (UID: \"c2137443-2f65-4c2b-87d2-8886f55a7ce7\") " pod="openstack/root-account-create-update-r7v7q" Mar 20 08:44:08 crc kubenswrapper[4903]: I0320 08:44:08.392623 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lv5wx\" (UniqueName: \"kubernetes.io/projected/c2137443-2f65-4c2b-87d2-8886f55a7ce7-kube-api-access-lv5wx\") pod \"root-account-create-update-r7v7q\" (UID: \"c2137443-2f65-4c2b-87d2-8886f55a7ce7\") " pod="openstack/root-account-create-update-r7v7q" Mar 20 08:44:08 crc kubenswrapper[4903]: I0320 08:44:08.413806 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-r7v7q" Mar 20 08:44:08 crc kubenswrapper[4903]: I0320 08:44:08.685295 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-25k6h" event={"ID":"1642398c-5346-421f-86c2-baec0001304e","Type":"ContainerStarted","Data":"ff71b3c6c64e1195b9ccc14c1d44fb626cf59538ad3fe49ad52cb508f5e726cd"} Mar 20 08:44:08 crc kubenswrapper[4903]: I0320 08:44:08.690502 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566604-q559d" Mar 20 08:44:08 crc kubenswrapper[4903]: I0320 08:44:08.691012 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566604-q559d" event={"ID":"1dacbc55-264e-4175-a03b-f1f4d135ef5e","Type":"ContainerDied","Data":"be53bb48cf2d9b77b502cd40543eb429f91d2117fa51dea123ceb1cf80d97ab7"} Mar 20 08:44:08 crc kubenswrapper[4903]: I0320 08:44:08.691092 4903 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="be53bb48cf2d9b77b502cd40543eb429f91d2117fa51dea123ceb1cf80d97ab7" Mar 20 08:44:08 crc kubenswrapper[4903]: I0320 08:44:08.905673 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-r7v7q"] Mar 20 08:44:08 crc kubenswrapper[4903]: W0320 08:44:08.916910 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc2137443_2f65_4c2b_87d2_8886f55a7ce7.slice/crio-5d3d9cbeeb91125089f826bb081845164c06ba92ff6939e6a15fc813bd863401 WatchSource:0}: Error finding container 5d3d9cbeeb91125089f826bb081845164c06ba92ff6939e6a15fc813bd863401: Status 404 returned error can't find the container with id 5d3d9cbeeb91125089f826bb081845164c06ba92ff6939e6a15fc813bd863401 Mar 20 08:44:09 crc kubenswrapper[4903]: I0320 08:44:09.179052 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566598-fjtx9"] Mar 20 08:44:09 crc kubenswrapper[4903]: I0320 08:44:09.190212 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566598-fjtx9"] Mar 20 08:44:09 crc kubenswrapper[4903]: I0320 08:44:09.505874 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="04aba9e4-217f-489a-9c45-dde90c447aa6" path="/var/lib/kubelet/pods/04aba9e4-217f-489a-9c45-dde90c447aa6/volumes" Mar 20 08:44:09 crc kubenswrapper[4903]: I0320 08:44:09.507253 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9e171236-e1e4-41f8-bf89-d162cab4d02b" path="/var/lib/kubelet/pods/9e171236-e1e4-41f8-bf89-d162cab4d02b/volumes" Mar 20 08:44:09 crc kubenswrapper[4903]: I0320 08:44:09.707887 4903 generic.go:334] "Generic (PLEG): container finished" podID="c2137443-2f65-4c2b-87d2-8886f55a7ce7" containerID="6bedc484fc1ddfc3818fe8a5b573155e069bccef17ec805a7bdd654f8220e9b8" exitCode=0 Mar 20 08:44:09 crc kubenswrapper[4903]: I0320 08:44:09.707960 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-r7v7q" event={"ID":"c2137443-2f65-4c2b-87d2-8886f55a7ce7","Type":"ContainerDied","Data":"6bedc484fc1ddfc3818fe8a5b573155e069bccef17ec805a7bdd654f8220e9b8"} Mar 20 08:44:09 crc kubenswrapper[4903]: I0320 08:44:09.708382 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-r7v7q" event={"ID":"c2137443-2f65-4c2b-87d2-8886f55a7ce7","Type":"ContainerStarted","Data":"5d3d9cbeeb91125089f826bb081845164c06ba92ff6939e6a15fc813bd863401"} Mar 20 08:44:10 crc kubenswrapper[4903]: I0320 08:44:10.570847 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-m4hwk"] Mar 20 08:44:10 crc kubenswrapper[4903]: E0320 08:44:10.571798 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1dacbc55-264e-4175-a03b-f1f4d135ef5e" containerName="oc" Mar 20 08:44:10 crc kubenswrapper[4903]: I0320 08:44:10.571820 4903 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="1dacbc55-264e-4175-a03b-f1f4d135ef5e" containerName="oc" Mar 20 08:44:10 crc kubenswrapper[4903]: I0320 08:44:10.572180 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="1dacbc55-264e-4175-a03b-f1f4d135ef5e" containerName="oc" Mar 20 08:44:10 crc kubenswrapper[4903]: I0320 08:44:10.573184 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-m4hwk" Mar 20 08:44:10 crc kubenswrapper[4903]: I0320 08:44:10.592361 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-m4hwk"] Mar 20 08:44:10 crc kubenswrapper[4903]: I0320 08:44:10.642991 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-7068-account-create-update-65l92"] Mar 20 08:44:10 crc kubenswrapper[4903]: I0320 08:44:10.644465 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-7068-account-create-update-65l92" Mar 20 08:44:10 crc kubenswrapper[4903]: I0320 08:44:10.647816 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Mar 20 08:44:10 crc kubenswrapper[4903]: I0320 08:44:10.648905 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-7068-account-create-update-65l92"] Mar 20 08:44:10 crc kubenswrapper[4903]: I0320 08:44:10.729843 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-px2s5\" (UniqueName: \"kubernetes.io/projected/01f22901-e7ea-44d3-bc58-7efffaa493ad-kube-api-access-px2s5\") pod \"keystone-7068-account-create-update-65l92\" (UID: \"01f22901-e7ea-44d3-bc58-7efffaa493ad\") " pod="openstack/keystone-7068-account-create-update-65l92" Mar 20 08:44:10 crc kubenswrapper[4903]: I0320 08:44:10.729908 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/01f22901-e7ea-44d3-bc58-7efffaa493ad-operator-scripts\") pod \"keystone-7068-account-create-update-65l92\" (UID: \"01f22901-e7ea-44d3-bc58-7efffaa493ad\") " pod="openstack/keystone-7068-account-create-update-65l92" Mar 20 08:44:10 crc kubenswrapper[4903]: I0320 08:44:10.729967 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jcrvd\" (UniqueName: \"kubernetes.io/projected/54015026-e605-4781-a02e-57b47c61284e-kube-api-access-jcrvd\") pod \"keystone-db-create-m4hwk\" (UID: \"54015026-e605-4781-a02e-57b47c61284e\") " pod="openstack/keystone-db-create-m4hwk" Mar 20 08:44:10 crc kubenswrapper[4903]: I0320 08:44:10.730025 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/54015026-e605-4781-a02e-57b47c61284e-operator-scripts\") pod \"keystone-db-create-m4hwk\" (UID: \"54015026-e605-4781-a02e-57b47c61284e\") " pod="openstack/keystone-db-create-m4hwk" Mar 20 08:44:10 crc kubenswrapper[4903]: I0320 08:44:10.747680 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-hwngf"] Mar 20 08:44:10 crc kubenswrapper[4903]: I0320 08:44:10.749538 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-hwngf" Mar 20 08:44:10 crc kubenswrapper[4903]: I0320 08:44:10.758899 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-hwngf"] Mar 20 08:44:10 crc kubenswrapper[4903]: I0320 08:44:10.832118 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sx955\" (UniqueName: \"kubernetes.io/projected/bf0235fb-bbd5-4f7e-8c25-8230f6e48f2f-kube-api-access-sx955\") pod \"placement-db-create-hwngf\" (UID: \"bf0235fb-bbd5-4f7e-8c25-8230f6e48f2f\") " pod="openstack/placement-db-create-hwngf" Mar 20 08:44:10 crc kubenswrapper[4903]: I0320 08:44:10.832179 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bf0235fb-bbd5-4f7e-8c25-8230f6e48f2f-operator-scripts\") pod \"placement-db-create-hwngf\" (UID: \"bf0235fb-bbd5-4f7e-8c25-8230f6e48f2f\") " pod="openstack/placement-db-create-hwngf" Mar 20 08:44:10 crc kubenswrapper[4903]: I0320 08:44:10.832210 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jcrvd\" (UniqueName: \"kubernetes.io/projected/54015026-e605-4781-a02e-57b47c61284e-kube-api-access-jcrvd\") pod \"keystone-db-create-m4hwk\" (UID: \"54015026-e605-4781-a02e-57b47c61284e\") " pod="openstack/keystone-db-create-m4hwk" Mar 20 08:44:10 crc kubenswrapper[4903]: I0320 08:44:10.832351 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/54015026-e605-4781-a02e-57b47c61284e-operator-scripts\") pod \"keystone-db-create-m4hwk\" (UID: \"54015026-e605-4781-a02e-57b47c61284e\") " pod="openstack/keystone-db-create-m4hwk" Mar 20 08:44:10 crc kubenswrapper[4903]: I0320 08:44:10.832416 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-px2s5\" (UniqueName: \"kubernetes.io/projected/01f22901-e7ea-44d3-bc58-7efffaa493ad-kube-api-access-px2s5\") pod \"keystone-7068-account-create-update-65l92\" (UID: \"01f22901-e7ea-44d3-bc58-7efffaa493ad\") " pod="openstack/keystone-7068-account-create-update-65l92" Mar 20 08:44:10 crc kubenswrapper[4903]: I0320 08:44:10.832447 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/01f22901-e7ea-44d3-bc58-7efffaa493ad-operator-scripts\") pod \"keystone-7068-account-create-update-65l92\" (UID: \"01f22901-e7ea-44d3-bc58-7efffaa493ad\") " pod="openstack/keystone-7068-account-create-update-65l92" Mar 20 08:44:10 crc kubenswrapper[4903]: I0320 08:44:10.833481 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/01f22901-e7ea-44d3-bc58-7efffaa493ad-operator-scripts\") pod \"keystone-7068-account-create-update-65l92\" (UID: \"01f22901-e7ea-44d3-bc58-7efffaa493ad\") " pod="openstack/keystone-7068-account-create-update-65l92" Mar 20 08:44:10 crc kubenswrapper[4903]: I0320 08:44:10.833487 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/54015026-e605-4781-a02e-57b47c61284e-operator-scripts\") pod \"keystone-db-create-m4hwk\" (UID: \"54015026-e605-4781-a02e-57b47c61284e\") " pod="openstack/keystone-db-create-m4hwk" Mar 20 08:44:10 crc kubenswrapper[4903]: I0320 08:44:10.860017 4903 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-px2s5\" (UniqueName: \"kubernetes.io/projected/01f22901-e7ea-44d3-bc58-7efffaa493ad-kube-api-access-px2s5\") pod \"keystone-7068-account-create-update-65l92\" (UID: \"01f22901-e7ea-44d3-bc58-7efffaa493ad\") " pod="openstack/keystone-7068-account-create-update-65l92" Mar 20 08:44:10 crc kubenswrapper[4903]: I0320 08:44:10.862437 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-08ee-account-create-update-s27bz"] Mar 20 08:44:10 crc kubenswrapper[4903]: I0320 08:44:10.864133 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jcrvd\" (UniqueName: \"kubernetes.io/projected/54015026-e605-4781-a02e-57b47c61284e-kube-api-access-jcrvd\") pod \"keystone-db-create-m4hwk\" (UID: \"54015026-e605-4781-a02e-57b47c61284e\") " pod="openstack/keystone-db-create-m4hwk" Mar 20 08:44:10 crc kubenswrapper[4903]: I0320 08:44:10.864156 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-08ee-account-create-update-s27bz" Mar 20 08:44:10 crc kubenswrapper[4903]: I0320 08:44:10.866934 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Mar 20 08:44:10 crc kubenswrapper[4903]: I0320 08:44:10.870566 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-08ee-account-create-update-s27bz"] Mar 20 08:44:10 crc kubenswrapper[4903]: I0320 08:44:10.900555 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-m4hwk" Mar 20 08:44:10 crc kubenswrapper[4903]: I0320 08:44:10.934274 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sx955\" (UniqueName: \"kubernetes.io/projected/bf0235fb-bbd5-4f7e-8c25-8230f6e48f2f-kube-api-access-sx955\") pod \"placement-db-create-hwngf\" (UID: \"bf0235fb-bbd5-4f7e-8c25-8230f6e48f2f\") " pod="openstack/placement-db-create-hwngf" Mar 20 08:44:10 crc kubenswrapper[4903]: I0320 08:44:10.934401 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bf0235fb-bbd5-4f7e-8c25-8230f6e48f2f-operator-scripts\") pod \"placement-db-create-hwngf\" (UID: \"bf0235fb-bbd5-4f7e-8c25-8230f6e48f2f\") " pod="openstack/placement-db-create-hwngf" Mar 20 08:44:10 crc kubenswrapper[4903]: I0320 08:44:10.934470 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e03f215f-5dde-4e62-82c4-9ab840f6223a-operator-scripts\") pod \"placement-08ee-account-create-update-s27bz\" (UID: \"e03f215f-5dde-4e62-82c4-9ab840f6223a\") " pod="openstack/placement-08ee-account-create-update-s27bz" Mar 20 08:44:10 crc kubenswrapper[4903]: I0320 08:44:10.934729 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v7vjs\" (UniqueName: \"kubernetes.io/projected/e03f215f-5dde-4e62-82c4-9ab840f6223a-kube-api-access-v7vjs\") pod \"placement-08ee-account-create-update-s27bz\" (UID: \"e03f215f-5dde-4e62-82c4-9ab840f6223a\") " pod="openstack/placement-08ee-account-create-update-s27bz" Mar 20 08:44:10 crc kubenswrapper[4903]: I0320 08:44:10.935513 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/bf0235fb-bbd5-4f7e-8c25-8230f6e48f2f-operator-scripts\") pod \"placement-db-create-hwngf\" (UID: \"bf0235fb-bbd5-4f7e-8c25-8230f6e48f2f\") " pod="openstack/placement-db-create-hwngf" Mar 20 08:44:10 crc kubenswrapper[4903]: I0320 08:44:10.954649 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sx955\" (UniqueName: \"kubernetes.io/projected/bf0235fb-bbd5-4f7e-8c25-8230f6e48f2f-kube-api-access-sx955\") pod \"placement-db-create-hwngf\" (UID: \"bf0235fb-bbd5-4f7e-8c25-8230f6e48f2f\") " pod="openstack/placement-db-create-hwngf" Mar 20 08:44:10 crc kubenswrapper[4903]: I0320 08:44:10.980512 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-7068-account-create-update-65l92" Mar 20 08:44:11 crc kubenswrapper[4903]: I0320 08:44:11.044943 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e03f215f-5dde-4e62-82c4-9ab840f6223a-operator-scripts\") pod \"placement-08ee-account-create-update-s27bz\" (UID: \"e03f215f-5dde-4e62-82c4-9ab840f6223a\") " pod="openstack/placement-08ee-account-create-update-s27bz" Mar 20 08:44:11 crc kubenswrapper[4903]: I0320 08:44:11.045086 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v7vjs\" (UniqueName: \"kubernetes.io/projected/e03f215f-5dde-4e62-82c4-9ab840f6223a-kube-api-access-v7vjs\") pod \"placement-08ee-account-create-update-s27bz\" (UID: \"e03f215f-5dde-4e62-82c4-9ab840f6223a\") " pod="openstack/placement-08ee-account-create-update-s27bz" Mar 20 08:44:11 crc kubenswrapper[4903]: I0320 08:44:11.046648 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e03f215f-5dde-4e62-82c4-9ab840f6223a-operator-scripts\") pod \"placement-08ee-account-create-update-s27bz\" (UID: \"e03f215f-5dde-4e62-82c4-9ab840f6223a\") " pod="openstack/placement-08ee-account-create-update-s27bz" Mar 20 08:44:11 crc kubenswrapper[4903]: I0320 08:44:11.078771 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-hwngf" Mar 20 08:44:11 crc kubenswrapper[4903]: I0320 08:44:11.100181 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v7vjs\" (UniqueName: \"kubernetes.io/projected/e03f215f-5dde-4e62-82c4-9ab840f6223a-kube-api-access-v7vjs\") pod \"placement-08ee-account-create-update-s27bz\" (UID: \"e03f215f-5dde-4e62-82c4-9ab840f6223a\") " pod="openstack/placement-08ee-account-create-update-s27bz" Mar 20 08:44:11 crc kubenswrapper[4903]: I0320 08:44:11.248823 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-08ee-account-create-update-s27bz" Mar 20 08:44:11 crc kubenswrapper[4903]: I0320 08:44:11.250403 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/ccedd84e-d0d0-40b8-812c-3a57b41aee98-etc-swift\") pod \"swift-storage-0\" (UID: \"ccedd84e-d0d0-40b8-812c-3a57b41aee98\") " pod="openstack/swift-storage-0" Mar 20 08:44:11 crc kubenswrapper[4903]: E0320 08:44:11.250582 4903 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 20 08:44:11 crc kubenswrapper[4903]: E0320 08:44:11.250604 4903 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 20 08:44:11 crc kubenswrapper[4903]: E0320 08:44:11.250662 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ccedd84e-d0d0-40b8-812c-3a57b41aee98-etc-swift podName:ccedd84e-d0d0-40b8-812c-3a57b41aee98 nodeName:}" failed. No retries permitted until 2026-03-20 08:44:19.250636098 +0000 UTC m=+1284.467536413 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/ccedd84e-d0d0-40b8-812c-3a57b41aee98-etc-swift") pod "swift-storage-0" (UID: "ccedd84e-d0d0-40b8-812c-3a57b41aee98") : configmap "swift-ring-files" not found Mar 20 08:44:11 crc kubenswrapper[4903]: I0320 08:44:11.986007 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-r7v7q" Mar 20 08:44:12 crc kubenswrapper[4903]: I0320 08:44:12.064954 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c2137443-2f65-4c2b-87d2-8886f55a7ce7-operator-scripts\") pod \"c2137443-2f65-4c2b-87d2-8886f55a7ce7\" (UID: \"c2137443-2f65-4c2b-87d2-8886f55a7ce7\") " Mar 20 08:44:12 crc kubenswrapper[4903]: I0320 08:44:12.065642 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lv5wx\" (UniqueName: \"kubernetes.io/projected/c2137443-2f65-4c2b-87d2-8886f55a7ce7-kube-api-access-lv5wx\") pod \"c2137443-2f65-4c2b-87d2-8886f55a7ce7\" (UID: \"c2137443-2f65-4c2b-87d2-8886f55a7ce7\") " Mar 20 08:44:12 crc kubenswrapper[4903]: I0320 08:44:12.065978 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c2137443-2f65-4c2b-87d2-8886f55a7ce7-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c2137443-2f65-4c2b-87d2-8886f55a7ce7" (UID: "c2137443-2f65-4c2b-87d2-8886f55a7ce7"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:44:12 crc kubenswrapper[4903]: I0320 08:44:12.066384 4903 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c2137443-2f65-4c2b-87d2-8886f55a7ce7-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 08:44:12 crc kubenswrapper[4903]: I0320 08:44:12.070893 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2137443-2f65-4c2b-87d2-8886f55a7ce7-kube-api-access-lv5wx" (OuterVolumeSpecName: "kube-api-access-lv5wx") pod "c2137443-2f65-4c2b-87d2-8886f55a7ce7" (UID: "c2137443-2f65-4c2b-87d2-8886f55a7ce7"). InnerVolumeSpecName "kube-api-access-lv5wx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:44:12 crc kubenswrapper[4903]: I0320 08:44:12.171932 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lv5wx\" (UniqueName: \"kubernetes.io/projected/c2137443-2f65-4c2b-87d2-8886f55a7ce7-kube-api-access-lv5wx\") on node \"crc\" DevicePath \"\"" Mar 20 08:44:12 crc kubenswrapper[4903]: I0320 08:44:12.519415 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-7068-account-create-update-65l92"] Mar 20 08:44:12 crc kubenswrapper[4903]: W0320 08:44:12.537891 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod01f22901_e7ea_44d3_bc58_7efffaa493ad.slice/crio-d753e11e88a2af238c584ed57ebc698bf74a37d6f714bdbc164ef8ad7a0f9d4a WatchSource:0}: Error finding container d753e11e88a2af238c584ed57ebc698bf74a37d6f714bdbc164ef8ad7a0f9d4a: Status 404 returned error can't find the container with id d753e11e88a2af238c584ed57ebc698bf74a37d6f714bdbc164ef8ad7a0f9d4a Mar 20 08:44:12 crc kubenswrapper[4903]: I0320 08:44:12.540804 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-08ee-account-create-update-s27bz"] Mar 20 08:44:12 crc kubenswrapper[4903]: W0320 08:44:12.543362 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode03f215f_5dde_4e62_82c4_9ab840f6223a.slice/crio-5a08145552271baeac5f2379fb9f1e2cda2261b8b0700f8393a2619afb08299e WatchSource:0}: Error finding container 5a08145552271baeac5f2379fb9f1e2cda2261b8b0700f8393a2619afb08299e: Status 404 returned error can't find the container with id 5a08145552271baeac5f2379fb9f1e2cda2261b8b0700f8393a2619afb08299e Mar 20 08:44:12 crc kubenswrapper[4903]: I0320 08:44:12.631069 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-698758b865-gqltz" Mar 20 08:44:12 crc kubenswrapper[4903]: I0320 08:44:12.660508 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-m4hwk"] Mar 20 08:44:12 crc kubenswrapper[4903]: I0320 08:44:12.689279 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-hwngf"] Mar 20 08:44:12 crc kubenswrapper[4903]: W0320 08:44:12.690356 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbf0235fb_bbd5_4f7e_8c25_8230f6e48f2f.slice/crio-ae827904357a766d5fdbeb3c908621e302a5995d621d744b50b3cf40844e125d WatchSource:0}: Error finding container ae827904357a766d5fdbeb3c908621e302a5995d621d744b50b3cf40844e125d: Status 404 returned error can't find the container with id ae827904357a766d5fdbeb3c908621e302a5995d621d744b50b3cf40844e125d Mar 20 08:44:12 crc kubenswrapper[4903]: I0320 08:44:12.700864 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-8ckgn"] Mar 20 08:44:12 crc kubenswrapper[4903]: I0320 08:44:12.701138 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-86db49b7ff-8ckgn" podUID="23a7dc3f-9bad-4898-82c7-203ddf385577" containerName="dnsmasq-dns" containerID="cri-o://754433d8761ac2e850ef6af2154918200f4ea08267f1a9c7fa6c9a0b4a9d2964" gracePeriod=10 Mar 20 08:44:12 crc kubenswrapper[4903]: I0320 08:44:12.802550 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-hwngf" 
event={"ID":"bf0235fb-bbd5-4f7e-8c25-8230f6e48f2f","Type":"ContainerStarted","Data":"ae827904357a766d5fdbeb3c908621e302a5995d621d744b50b3cf40844e125d"} Mar 20 08:44:12 crc kubenswrapper[4903]: I0320 08:44:12.805174 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-r7v7q" event={"ID":"c2137443-2f65-4c2b-87d2-8886f55a7ce7","Type":"ContainerDied","Data":"5d3d9cbeeb91125089f826bb081845164c06ba92ff6939e6a15fc813bd863401"} Mar 20 08:44:12 crc kubenswrapper[4903]: I0320 08:44:12.805217 4903 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5d3d9cbeeb91125089f826bb081845164c06ba92ff6939e6a15fc813bd863401" Mar 20 08:44:12 crc kubenswrapper[4903]: I0320 08:44:12.805235 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-r7v7q" Mar 20 08:44:12 crc kubenswrapper[4903]: I0320 08:44:12.816067 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-25k6h" event={"ID":"1642398c-5346-421f-86c2-baec0001304e","Type":"ContainerStarted","Data":"ae7ef07f6710e3c50df8fdfb347b54fac262a0797dc427c28c37c346ed91d089"} Mar 20 08:44:12 crc kubenswrapper[4903]: I0320 08:44:12.817724 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-08ee-account-create-update-s27bz" event={"ID":"e03f215f-5dde-4e62-82c4-9ab840f6223a","Type":"ContainerStarted","Data":"5a08145552271baeac5f2379fb9f1e2cda2261b8b0700f8393a2619afb08299e"} Mar 20 08:44:12 crc kubenswrapper[4903]: I0320 08:44:12.818762 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-7068-account-create-update-65l92" event={"ID":"01f22901-e7ea-44d3-bc58-7efffaa493ad","Type":"ContainerStarted","Data":"d753e11e88a2af238c584ed57ebc698bf74a37d6f714bdbc164ef8ad7a0f9d4a"} Mar 20 08:44:12 crc kubenswrapper[4903]: I0320 08:44:12.851114 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-25k6h" podStartSLOduration=1.821312206 podStartE2EDuration="5.851085803s" podCreationTimestamp="2026-03-20 08:44:07 +0000 UTC" firstStartedPulling="2026-03-20 08:44:08.007109325 +0000 UTC m=+1273.224009640" lastFinishedPulling="2026-03-20 08:44:12.036882922 +0000 UTC m=+1277.253783237" observedRunningTime="2026-03-20 08:44:12.842483375 +0000 UTC m=+1278.059383700" watchObservedRunningTime="2026-03-20 08:44:12.851085803 +0000 UTC m=+1278.067986118" Mar 20 08:44:13 crc kubenswrapper[4903]: I0320 08:44:13.494081 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-8ckgn" Mar 20 08:44:13 crc kubenswrapper[4903]: I0320 08:44:13.604653 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/23a7dc3f-9bad-4898-82c7-203ddf385577-ovsdbserver-nb\") pod \"23a7dc3f-9bad-4898-82c7-203ddf385577\" (UID: \"23a7dc3f-9bad-4898-82c7-203ddf385577\") " Mar 20 08:44:13 crc kubenswrapper[4903]: I0320 08:44:13.604816 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dm6kz\" (UniqueName: \"kubernetes.io/projected/23a7dc3f-9bad-4898-82c7-203ddf385577-kube-api-access-dm6kz\") pod \"23a7dc3f-9bad-4898-82c7-203ddf385577\" (UID: \"23a7dc3f-9bad-4898-82c7-203ddf385577\") " Mar 20 08:44:13 crc kubenswrapper[4903]: I0320 08:44:13.604859 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/23a7dc3f-9bad-4898-82c7-203ddf385577-config\") pod \"23a7dc3f-9bad-4898-82c7-203ddf385577\" (UID: \"23a7dc3f-9bad-4898-82c7-203ddf385577\") " Mar 20 08:44:13 crc kubenswrapper[4903]: I0320 08:44:13.604968 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/23a7dc3f-9bad-4898-82c7-203ddf385577-ovsdbserver-sb\") pod \"23a7dc3f-9bad-4898-82c7-203ddf385577\" (UID: \"23a7dc3f-9bad-4898-82c7-203ddf385577\") " Mar 20 08:44:13 crc kubenswrapper[4903]: I0320 08:44:13.604993 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/23a7dc3f-9bad-4898-82c7-203ddf385577-dns-svc\") pod \"23a7dc3f-9bad-4898-82c7-203ddf385577\" (UID: \"23a7dc3f-9bad-4898-82c7-203ddf385577\") " Mar 20 08:44:13 crc kubenswrapper[4903]: I0320 08:44:13.628774 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23a7dc3f-9bad-4898-82c7-203ddf385577-kube-api-access-dm6kz" (OuterVolumeSpecName: "kube-api-access-dm6kz") pod "23a7dc3f-9bad-4898-82c7-203ddf385577" (UID: "23a7dc3f-9bad-4898-82c7-203ddf385577"). InnerVolumeSpecName "kube-api-access-dm6kz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:44:13 crc kubenswrapper[4903]: I0320 08:44:13.649662 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/23a7dc3f-9bad-4898-82c7-203ddf385577-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "23a7dc3f-9bad-4898-82c7-203ddf385577" (UID: "23a7dc3f-9bad-4898-82c7-203ddf385577"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:44:13 crc kubenswrapper[4903]: I0320 08:44:13.650622 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/23a7dc3f-9bad-4898-82c7-203ddf385577-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "23a7dc3f-9bad-4898-82c7-203ddf385577" (UID: "23a7dc3f-9bad-4898-82c7-203ddf385577"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:44:13 crc kubenswrapper[4903]: I0320 08:44:13.684461 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/23a7dc3f-9bad-4898-82c7-203ddf385577-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "23a7dc3f-9bad-4898-82c7-203ddf385577" (UID: "23a7dc3f-9bad-4898-82c7-203ddf385577"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:44:13 crc kubenswrapper[4903]: I0320 08:44:13.693987 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/23a7dc3f-9bad-4898-82c7-203ddf385577-config" (OuterVolumeSpecName: "config") pod "23a7dc3f-9bad-4898-82c7-203ddf385577" (UID: "23a7dc3f-9bad-4898-82c7-203ddf385577"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:44:13 crc kubenswrapper[4903]: I0320 08:44:13.706776 4903 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/23a7dc3f-9bad-4898-82c7-203ddf385577-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 20 08:44:13 crc kubenswrapper[4903]: I0320 08:44:13.706812 4903 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/23a7dc3f-9bad-4898-82c7-203ddf385577-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 08:44:13 crc kubenswrapper[4903]: I0320 08:44:13.706823 4903 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/23a7dc3f-9bad-4898-82c7-203ddf385577-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 20 08:44:13 crc kubenswrapper[4903]: I0320 08:44:13.706833 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dm6kz\" (UniqueName: \"kubernetes.io/projected/23a7dc3f-9bad-4898-82c7-203ddf385577-kube-api-access-dm6kz\") on node \"crc\" DevicePath \"\"" Mar 20 08:44:13 crc kubenswrapper[4903]: I0320 08:44:13.706850 4903 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/23a7dc3f-9bad-4898-82c7-203ddf385577-config\") on node \"crc\" DevicePath \"\"" Mar 20 08:44:13 crc kubenswrapper[4903]: I0320 08:44:13.829440 4903 generic.go:334] "Generic (PLEG): container finished" podID="bf0235fb-bbd5-4f7e-8c25-8230f6e48f2f" containerID="daa45dc24f3e4bf07a60054cf5038e0a35924ee589ba20dd9f4b1795657490b2" exitCode=0 Mar 20 08:44:13 crc kubenswrapper[4903]: I0320 08:44:13.829512 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-hwngf" event={"ID":"bf0235fb-bbd5-4f7e-8c25-8230f6e48f2f","Type":"ContainerDied","Data":"daa45dc24f3e4bf07a60054cf5038e0a35924ee589ba20dd9f4b1795657490b2"} Mar 20 08:44:13 crc kubenswrapper[4903]: I0320 08:44:13.832834 4903 generic.go:334] "Generic (PLEG): container finished" podID="23a7dc3f-9bad-4898-82c7-203ddf385577" containerID="754433d8761ac2e850ef6af2154918200f4ea08267f1a9c7fa6c9a0b4a9d2964" exitCode=0 Mar 20 08:44:13 crc kubenswrapper[4903]: I0320 08:44:13.833116 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-8ckgn" event={"ID":"23a7dc3f-9bad-4898-82c7-203ddf385577","Type":"ContainerDied","Data":"754433d8761ac2e850ef6af2154918200f4ea08267f1a9c7fa6c9a0b4a9d2964"} Mar 20 08:44:13 crc kubenswrapper[4903]: I0320 08:44:13.833290 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-8ckgn" event={"ID":"23a7dc3f-9bad-4898-82c7-203ddf385577","Type":"ContainerDied","Data":"15d075991e4d13689954070fd2191a6eb2bac25bebb42fd4c64cadc550e6ce1c"} Mar 20 08:44:13 crc kubenswrapper[4903]: I0320 08:44:13.835389 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-8ckgn" Mar 20 08:44:13 crc kubenswrapper[4903]: I0320 08:44:13.841207 4903 scope.go:117] "RemoveContainer" containerID="754433d8761ac2e850ef6af2154918200f4ea08267f1a9c7fa6c9a0b4a9d2964" Mar 20 08:44:13 crc kubenswrapper[4903]: I0320 08:44:13.843080 4903 generic.go:334] "Generic (PLEG): container finished" podID="54015026-e605-4781-a02e-57b47c61284e" containerID="565e3db7a9d288f5296940d2760d8a4d808442910047f2d8fa967bf716192bb5" exitCode=0 Mar 20 08:44:13 crc kubenswrapper[4903]: I0320 08:44:13.843181 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-m4hwk" event={"ID":"54015026-e605-4781-a02e-57b47c61284e","Type":"ContainerDied","Data":"565e3db7a9d288f5296940d2760d8a4d808442910047f2d8fa967bf716192bb5"} Mar 20 08:44:13 crc kubenswrapper[4903]: I0320 08:44:13.843268 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-m4hwk" event={"ID":"54015026-e605-4781-a02e-57b47c61284e","Type":"ContainerStarted","Data":"51f1dc8b660c583f67ea384a16a813e13289c928ddee9bd93128f476f906e3d0"} Mar 20 08:44:13 crc kubenswrapper[4903]: I0320 08:44:13.845247 4903 generic.go:334] "Generic (PLEG): container finished" podID="e03f215f-5dde-4e62-82c4-9ab840f6223a" containerID="81bf6290e0a3600cb73e207f93b56e418e5fa5670d3a82deb255b3f2895becbc" exitCode=0 Mar 20 08:44:13 crc kubenswrapper[4903]: I0320 08:44:13.845335 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-08ee-account-create-update-s27bz" event={"ID":"e03f215f-5dde-4e62-82c4-9ab840f6223a","Type":"ContainerDied","Data":"81bf6290e0a3600cb73e207f93b56e418e5fa5670d3a82deb255b3f2895becbc"} Mar 20 08:44:13 crc kubenswrapper[4903]: I0320 08:44:13.846883 4903 generic.go:334] "Generic (PLEG): container finished" podID="01f22901-e7ea-44d3-bc58-7efffaa493ad" containerID="390d83024e4e69985e2610318a454c1ce59f0eb179355db078862b76953ea0d2" exitCode=0 Mar 20 08:44:13 crc kubenswrapper[4903]: I0320 08:44:13.847339 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-7068-account-create-update-65l92" event={"ID":"01f22901-e7ea-44d3-bc58-7efffaa493ad","Type":"ContainerDied","Data":"390d83024e4e69985e2610318a454c1ce59f0eb179355db078862b76953ea0d2"} Mar 20 08:44:13 crc kubenswrapper[4903]: I0320 08:44:13.875956 4903 scope.go:117] "RemoveContainer" containerID="072f58a54007231ea14f47acf4fdefbf061cf954c7be68256801c41077a3671a" Mar 20 08:44:13 crc kubenswrapper[4903]: I0320 08:44:13.931508 4903 scope.go:117] "RemoveContainer" containerID="754433d8761ac2e850ef6af2154918200f4ea08267f1a9c7fa6c9a0b4a9d2964" Mar 20 08:44:13 crc kubenswrapper[4903]: E0320 08:44:13.939759 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"754433d8761ac2e850ef6af2154918200f4ea08267f1a9c7fa6c9a0b4a9d2964\": container with ID starting with 754433d8761ac2e850ef6af2154918200f4ea08267f1a9c7fa6c9a0b4a9d2964 not found: ID does not exist" containerID="754433d8761ac2e850ef6af2154918200f4ea08267f1a9c7fa6c9a0b4a9d2964" Mar 20 08:44:13 crc kubenswrapper[4903]: I0320 08:44:13.939829 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"754433d8761ac2e850ef6af2154918200f4ea08267f1a9c7fa6c9a0b4a9d2964"} err="failed to get container status \"754433d8761ac2e850ef6af2154918200f4ea08267f1a9c7fa6c9a0b4a9d2964\": rpc error: code = NotFound desc = could not find container 
\"754433d8761ac2e850ef6af2154918200f4ea08267f1a9c7fa6c9a0b4a9d2964\": container with ID starting with 754433d8761ac2e850ef6af2154918200f4ea08267f1a9c7fa6c9a0b4a9d2964 not found: ID does not exist" Mar 20 08:44:13 crc kubenswrapper[4903]: I0320 08:44:13.939856 4903 scope.go:117] "RemoveContainer" containerID="072f58a54007231ea14f47acf4fdefbf061cf954c7be68256801c41077a3671a" Mar 20 08:44:13 crc kubenswrapper[4903]: E0320 08:44:13.940547 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"072f58a54007231ea14f47acf4fdefbf061cf954c7be68256801c41077a3671a\": container with ID starting with 072f58a54007231ea14f47acf4fdefbf061cf954c7be68256801c41077a3671a not found: ID does not exist" containerID="072f58a54007231ea14f47acf4fdefbf061cf954c7be68256801c41077a3671a" Mar 20 08:44:13 crc kubenswrapper[4903]: I0320 08:44:13.940573 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"072f58a54007231ea14f47acf4fdefbf061cf954c7be68256801c41077a3671a"} err="failed to get container status \"072f58a54007231ea14f47acf4fdefbf061cf954c7be68256801c41077a3671a\": rpc error: code = NotFound desc = could not find container \"072f58a54007231ea14f47acf4fdefbf061cf954c7be68256801c41077a3671a\": container with ID starting with 072f58a54007231ea14f47acf4fdefbf061cf954c7be68256801c41077a3671a not found: ID does not exist" Mar 20 08:44:13 crc kubenswrapper[4903]: I0320 08:44:13.950768 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-8ckgn"] Mar 20 08:44:13 crc kubenswrapper[4903]: I0320 08:44:13.957777 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-8ckgn"] Mar 20 08:44:14 crc kubenswrapper[4903]: I0320 08:44:14.798484 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-9mg76"] Mar 20 08:44:14 crc kubenswrapper[4903]: E0320 08:44:14.799570 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23a7dc3f-9bad-4898-82c7-203ddf385577" containerName="init" Mar 20 08:44:14 crc kubenswrapper[4903]: I0320 08:44:14.799596 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="23a7dc3f-9bad-4898-82c7-203ddf385577" containerName="init" Mar 20 08:44:14 crc kubenswrapper[4903]: E0320 08:44:14.799650 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23a7dc3f-9bad-4898-82c7-203ddf385577" containerName="dnsmasq-dns" Mar 20 08:44:14 crc kubenswrapper[4903]: I0320 08:44:14.799660 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="23a7dc3f-9bad-4898-82c7-203ddf385577" containerName="dnsmasq-dns" Mar 20 08:44:14 crc kubenswrapper[4903]: E0320 08:44:14.799686 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2137443-2f65-4c2b-87d2-8886f55a7ce7" containerName="mariadb-account-create-update" Mar 20 08:44:14 crc kubenswrapper[4903]: I0320 08:44:14.799695 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2137443-2f65-4c2b-87d2-8886f55a7ce7" containerName="mariadb-account-create-update" Mar 20 08:44:14 crc kubenswrapper[4903]: I0320 08:44:14.799925 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="23a7dc3f-9bad-4898-82c7-203ddf385577" containerName="dnsmasq-dns" Mar 20 08:44:14 crc kubenswrapper[4903]: I0320 08:44:14.799950 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2137443-2f65-4c2b-87d2-8886f55a7ce7" containerName="mariadb-account-create-update" Mar 20 08:44:14 crc 
kubenswrapper[4903]: I0320 08:44:14.803628 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-9mg76" Mar 20 08:44:14 crc kubenswrapper[4903]: I0320 08:44:14.813946 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-9mg76"] Mar 20 08:44:14 crc kubenswrapper[4903]: I0320 08:44:14.932739 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kwz75\" (UniqueName: \"kubernetes.io/projected/fe39352c-cfbc-4c65-8ad2-0e51b78bbec1-kube-api-access-kwz75\") pod \"glance-db-create-9mg76\" (UID: \"fe39352c-cfbc-4c65-8ad2-0e51b78bbec1\") " pod="openstack/glance-db-create-9mg76" Mar 20 08:44:14 crc kubenswrapper[4903]: I0320 08:44:14.932840 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fe39352c-cfbc-4c65-8ad2-0e51b78bbec1-operator-scripts\") pod \"glance-db-create-9mg76\" (UID: \"fe39352c-cfbc-4c65-8ad2-0e51b78bbec1\") " pod="openstack/glance-db-create-9mg76" Mar 20 08:44:14 crc kubenswrapper[4903]: I0320 08:44:14.935691 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-d2f7-account-create-update-9w2np"] Mar 20 08:44:14 crc kubenswrapper[4903]: I0320 08:44:14.937865 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-d2f7-account-create-update-9w2np" Mar 20 08:44:14 crc kubenswrapper[4903]: I0320 08:44:14.941796 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Mar 20 08:44:14 crc kubenswrapper[4903]: I0320 08:44:14.945178 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-d2f7-account-create-update-9w2np"] Mar 20 08:44:15 crc kubenswrapper[4903]: I0320 08:44:15.034082 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/68afaaf3-f790-4d52-9f29-49870d1950a5-operator-scripts\") pod \"glance-d2f7-account-create-update-9w2np\" (UID: \"68afaaf3-f790-4d52-9f29-49870d1950a5\") " pod="openstack/glance-d2f7-account-create-update-9w2np" Mar 20 08:44:15 crc kubenswrapper[4903]: I0320 08:44:15.034153 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fe39352c-cfbc-4c65-8ad2-0e51b78bbec1-operator-scripts\") pod \"glance-db-create-9mg76\" (UID: \"fe39352c-cfbc-4c65-8ad2-0e51b78bbec1\") " pod="openstack/glance-db-create-9mg76" Mar 20 08:44:15 crc kubenswrapper[4903]: I0320 08:44:15.034260 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p5g78\" (UniqueName: \"kubernetes.io/projected/68afaaf3-f790-4d52-9f29-49870d1950a5-kube-api-access-p5g78\") pod \"glance-d2f7-account-create-update-9w2np\" (UID: \"68afaaf3-f790-4d52-9f29-49870d1950a5\") " pod="openstack/glance-d2f7-account-create-update-9w2np" Mar 20 08:44:15 crc kubenswrapper[4903]: I0320 08:44:15.034302 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kwz75\" (UniqueName: \"kubernetes.io/projected/fe39352c-cfbc-4c65-8ad2-0e51b78bbec1-kube-api-access-kwz75\") pod \"glance-db-create-9mg76\" (UID: \"fe39352c-cfbc-4c65-8ad2-0e51b78bbec1\") " pod="openstack/glance-db-create-9mg76" Mar 20 08:44:15 crc kubenswrapper[4903]: I0320 08:44:15.035192 4903 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fe39352c-cfbc-4c65-8ad2-0e51b78bbec1-operator-scripts\") pod \"glance-db-create-9mg76\" (UID: \"fe39352c-cfbc-4c65-8ad2-0e51b78bbec1\") " pod="openstack/glance-db-create-9mg76" Mar 20 08:44:15 crc kubenswrapper[4903]: I0320 08:44:15.058112 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kwz75\" (UniqueName: \"kubernetes.io/projected/fe39352c-cfbc-4c65-8ad2-0e51b78bbec1-kube-api-access-kwz75\") pod \"glance-db-create-9mg76\" (UID: \"fe39352c-cfbc-4c65-8ad2-0e51b78bbec1\") " pod="openstack/glance-db-create-9mg76" Mar 20 08:44:15 crc kubenswrapper[4903]: I0320 08:44:15.144686 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-9mg76" Mar 20 08:44:15 crc kubenswrapper[4903]: I0320 08:44:15.147185 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/68afaaf3-f790-4d52-9f29-49870d1950a5-operator-scripts\") pod \"glance-d2f7-account-create-update-9w2np\" (UID: \"68afaaf3-f790-4d52-9f29-49870d1950a5\") " pod="openstack/glance-d2f7-account-create-update-9w2np" Mar 20 08:44:15 crc kubenswrapper[4903]: I0320 08:44:15.147356 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p5g78\" (UniqueName: \"kubernetes.io/projected/68afaaf3-f790-4d52-9f29-49870d1950a5-kube-api-access-p5g78\") pod \"glance-d2f7-account-create-update-9w2np\" (UID: \"68afaaf3-f790-4d52-9f29-49870d1950a5\") " pod="openstack/glance-d2f7-account-create-update-9w2np" Mar 20 08:44:15 crc kubenswrapper[4903]: I0320 08:44:15.148433 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/68afaaf3-f790-4d52-9f29-49870d1950a5-operator-scripts\") pod \"glance-d2f7-account-create-update-9w2np\" (UID: \"68afaaf3-f790-4d52-9f29-49870d1950a5\") " pod="openstack/glance-d2f7-account-create-update-9w2np" Mar 20 08:44:15 crc kubenswrapper[4903]: I0320 08:44:15.169744 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p5g78\" (UniqueName: \"kubernetes.io/projected/68afaaf3-f790-4d52-9f29-49870d1950a5-kube-api-access-p5g78\") pod \"glance-d2f7-account-create-update-9w2np\" (UID: \"68afaaf3-f790-4d52-9f29-49870d1950a5\") " pod="openstack/glance-d2f7-account-create-update-9w2np" Mar 20 08:44:15 crc kubenswrapper[4903]: I0320 08:44:15.260579 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-d2f7-account-create-update-9w2np" Mar 20 08:44:15 crc kubenswrapper[4903]: I0320 08:44:15.342242 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-m4hwk" Mar 20 08:44:15 crc kubenswrapper[4903]: I0320 08:44:15.443152 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-08ee-account-create-update-s27bz" Mar 20 08:44:15 crc kubenswrapper[4903]: I0320 08:44:15.457374 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-hwngf" Mar 20 08:44:15 crc kubenswrapper[4903]: I0320 08:44:15.459511 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/54015026-e605-4781-a02e-57b47c61284e-operator-scripts\") pod \"54015026-e605-4781-a02e-57b47c61284e\" (UID: \"54015026-e605-4781-a02e-57b47c61284e\") " Mar 20 08:44:15 crc kubenswrapper[4903]: I0320 08:44:15.459949 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jcrvd\" (UniqueName: \"kubernetes.io/projected/54015026-e605-4781-a02e-57b47c61284e-kube-api-access-jcrvd\") pod \"54015026-e605-4781-a02e-57b47c61284e\" (UID: \"54015026-e605-4781-a02e-57b47c61284e\") " Mar 20 08:44:15 crc kubenswrapper[4903]: I0320 08:44:15.462581 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/54015026-e605-4781-a02e-57b47c61284e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "54015026-e605-4781-a02e-57b47c61284e" (UID: "54015026-e605-4781-a02e-57b47c61284e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:44:15 crc kubenswrapper[4903]: I0320 08:44:15.470197 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-7068-account-create-update-65l92" Mar 20 08:44:15 crc kubenswrapper[4903]: I0320 08:44:15.472402 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54015026-e605-4781-a02e-57b47c61284e-kube-api-access-jcrvd" (OuterVolumeSpecName: "kube-api-access-jcrvd") pod "54015026-e605-4781-a02e-57b47c61284e" (UID: "54015026-e605-4781-a02e-57b47c61284e"). InnerVolumeSpecName "kube-api-access-jcrvd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:44:15 crc kubenswrapper[4903]: I0320 08:44:15.507202 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="23a7dc3f-9bad-4898-82c7-203ddf385577" path="/var/lib/kubelet/pods/23a7dc3f-9bad-4898-82c7-203ddf385577/volumes" Mar 20 08:44:15 crc kubenswrapper[4903]: I0320 08:44:15.562182 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e03f215f-5dde-4e62-82c4-9ab840f6223a-operator-scripts\") pod \"e03f215f-5dde-4e62-82c4-9ab840f6223a\" (UID: \"e03f215f-5dde-4e62-82c4-9ab840f6223a\") " Mar 20 08:44:15 crc kubenswrapper[4903]: I0320 08:44:15.562263 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bf0235fb-bbd5-4f7e-8c25-8230f6e48f2f-operator-scripts\") pod \"bf0235fb-bbd5-4f7e-8c25-8230f6e48f2f\" (UID: \"bf0235fb-bbd5-4f7e-8c25-8230f6e48f2f\") " Mar 20 08:44:15 crc kubenswrapper[4903]: I0320 08:44:15.562426 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v7vjs\" (UniqueName: \"kubernetes.io/projected/e03f215f-5dde-4e62-82c4-9ab840f6223a-kube-api-access-v7vjs\") pod \"e03f215f-5dde-4e62-82c4-9ab840f6223a\" (UID: \"e03f215f-5dde-4e62-82c4-9ab840f6223a\") " Mar 20 08:44:15 crc kubenswrapper[4903]: I0320 08:44:15.562480 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sx955\" (UniqueName: \"kubernetes.io/projected/bf0235fb-bbd5-4f7e-8c25-8230f6e48f2f-kube-api-access-sx955\") pod \"bf0235fb-bbd5-4f7e-8c25-8230f6e48f2f\" (UID: \"bf0235fb-bbd5-4f7e-8c25-8230f6e48f2f\") " Mar 20 08:44:15 crc kubenswrapper[4903]: I0320 08:44:15.562893 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e03f215f-5dde-4e62-82c4-9ab840f6223a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e03f215f-5dde-4e62-82c4-9ab840f6223a" (UID: "e03f215f-5dde-4e62-82c4-9ab840f6223a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:44:15 crc kubenswrapper[4903]: I0320 08:44:15.563551 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf0235fb-bbd5-4f7e-8c25-8230f6e48f2f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "bf0235fb-bbd5-4f7e-8c25-8230f6e48f2f" (UID: "bf0235fb-bbd5-4f7e-8c25-8230f6e48f2f"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:44:15 crc kubenswrapper[4903]: I0320 08:44:15.564245 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jcrvd\" (UniqueName: \"kubernetes.io/projected/54015026-e605-4781-a02e-57b47c61284e-kube-api-access-jcrvd\") on node \"crc\" DevicePath \"\"" Mar 20 08:44:15 crc kubenswrapper[4903]: I0320 08:44:15.564291 4903 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/54015026-e605-4781-a02e-57b47c61284e-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 08:44:15 crc kubenswrapper[4903]: I0320 08:44:15.564301 4903 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e03f215f-5dde-4e62-82c4-9ab840f6223a-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 08:44:15 crc kubenswrapper[4903]: I0320 08:44:15.564310 4903 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bf0235fb-bbd5-4f7e-8c25-8230f6e48f2f-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 08:44:15 crc kubenswrapper[4903]: I0320 08:44:15.570387 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e03f215f-5dde-4e62-82c4-9ab840f6223a-kube-api-access-v7vjs" (OuterVolumeSpecName: "kube-api-access-v7vjs") pod "e03f215f-5dde-4e62-82c4-9ab840f6223a" (UID: "e03f215f-5dde-4e62-82c4-9ab840f6223a"). InnerVolumeSpecName "kube-api-access-v7vjs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:44:15 crc kubenswrapper[4903]: I0320 08:44:15.570680 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf0235fb-bbd5-4f7e-8c25-8230f6e48f2f-kube-api-access-sx955" (OuterVolumeSpecName: "kube-api-access-sx955") pod "bf0235fb-bbd5-4f7e-8c25-8230f6e48f2f" (UID: "bf0235fb-bbd5-4f7e-8c25-8230f6e48f2f"). InnerVolumeSpecName "kube-api-access-sx955". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:44:15 crc kubenswrapper[4903]: I0320 08:44:15.665526 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-px2s5\" (UniqueName: \"kubernetes.io/projected/01f22901-e7ea-44d3-bc58-7efffaa493ad-kube-api-access-px2s5\") pod \"01f22901-e7ea-44d3-bc58-7efffaa493ad\" (UID: \"01f22901-e7ea-44d3-bc58-7efffaa493ad\") " Mar 20 08:44:15 crc kubenswrapper[4903]: I0320 08:44:15.665809 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/01f22901-e7ea-44d3-bc58-7efffaa493ad-operator-scripts\") pod \"01f22901-e7ea-44d3-bc58-7efffaa493ad\" (UID: \"01f22901-e7ea-44d3-bc58-7efffaa493ad\") " Mar 20 08:44:15 crc kubenswrapper[4903]: I0320 08:44:15.666535 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v7vjs\" (UniqueName: \"kubernetes.io/projected/e03f215f-5dde-4e62-82c4-9ab840f6223a-kube-api-access-v7vjs\") on node \"crc\" DevicePath \"\"" Mar 20 08:44:15 crc kubenswrapper[4903]: I0320 08:44:15.666565 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sx955\" (UniqueName: \"kubernetes.io/projected/bf0235fb-bbd5-4f7e-8c25-8230f6e48f2f-kube-api-access-sx955\") on node \"crc\" DevicePath \"\"" Mar 20 08:44:15 crc kubenswrapper[4903]: I0320 08:44:15.667560 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01f22901-e7ea-44d3-bc58-7efffaa493ad-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "01f22901-e7ea-44d3-bc58-7efffaa493ad" (UID: "01f22901-e7ea-44d3-bc58-7efffaa493ad"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:44:15 crc kubenswrapper[4903]: I0320 08:44:15.668896 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01f22901-e7ea-44d3-bc58-7efffaa493ad-kube-api-access-px2s5" (OuterVolumeSpecName: "kube-api-access-px2s5") pod "01f22901-e7ea-44d3-bc58-7efffaa493ad" (UID: "01f22901-e7ea-44d3-bc58-7efffaa493ad"). InnerVolumeSpecName "kube-api-access-px2s5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:44:15 crc kubenswrapper[4903]: W0320 08:44:15.745199 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfe39352c_cfbc_4c65_8ad2_0e51b78bbec1.slice/crio-98e9ef56d5bb70d040b20ee891af20bf13777d6a918506ba8cc1b2d6bd2753e4 WatchSource:0}: Error finding container 98e9ef56d5bb70d040b20ee891af20bf13777d6a918506ba8cc1b2d6bd2753e4: Status 404 returned error can't find the container with id 98e9ef56d5bb70d040b20ee891af20bf13777d6a918506ba8cc1b2d6bd2753e4 Mar 20 08:44:15 crc kubenswrapper[4903]: I0320 08:44:15.748005 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-9mg76"] Mar 20 08:44:15 crc kubenswrapper[4903]: I0320 08:44:15.771781 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-px2s5\" (UniqueName: \"kubernetes.io/projected/01f22901-e7ea-44d3-bc58-7efffaa493ad-kube-api-access-px2s5\") on node \"crc\" DevicePath \"\"" Mar 20 08:44:15 crc kubenswrapper[4903]: I0320 08:44:15.771819 4903 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/01f22901-e7ea-44d3-bc58-7efffaa493ad-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 08:44:15 crc kubenswrapper[4903]: I0320 08:44:15.837311 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-d2f7-account-create-update-9w2np"] Mar 20 08:44:15 crc kubenswrapper[4903]: I0320 08:44:15.885976 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-7068-account-create-update-65l92" event={"ID":"01f22901-e7ea-44d3-bc58-7efffaa493ad","Type":"ContainerDied","Data":"d753e11e88a2af238c584ed57ebc698bf74a37d6f714bdbc164ef8ad7a0f9d4a"} Mar 20 08:44:15 crc kubenswrapper[4903]: I0320 08:44:15.886006 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-7068-account-create-update-65l92" Mar 20 08:44:15 crc kubenswrapper[4903]: I0320 08:44:15.886038 4903 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d753e11e88a2af238c584ed57ebc698bf74a37d6f714bdbc164ef8ad7a0f9d4a" Mar 20 08:44:15 crc kubenswrapper[4903]: I0320 08:44:15.887833 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-hwngf" event={"ID":"bf0235fb-bbd5-4f7e-8c25-8230f6e48f2f","Type":"ContainerDied","Data":"ae827904357a766d5fdbeb3c908621e302a5995d621d744b50b3cf40844e125d"} Mar 20 08:44:15 crc kubenswrapper[4903]: I0320 08:44:15.887881 4903 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ae827904357a766d5fdbeb3c908621e302a5995d621d744b50b3cf40844e125d" Mar 20 08:44:15 crc kubenswrapper[4903]: I0320 08:44:15.887848 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-hwngf" Mar 20 08:44:15 crc kubenswrapper[4903]: I0320 08:44:15.889722 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-m4hwk" event={"ID":"54015026-e605-4781-a02e-57b47c61284e","Type":"ContainerDied","Data":"51f1dc8b660c583f67ea384a16a813e13289c928ddee9bd93128f476f906e3d0"} Mar 20 08:44:15 crc kubenswrapper[4903]: I0320 08:44:15.889767 4903 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="51f1dc8b660c583f67ea384a16a813e13289c928ddee9bd93128f476f906e3d0" Mar 20 08:44:15 crc kubenswrapper[4903]: I0320 08:44:15.889816 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-m4hwk" Mar 20 08:44:15 crc kubenswrapper[4903]: I0320 08:44:15.890722 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-9mg76" event={"ID":"fe39352c-cfbc-4c65-8ad2-0e51b78bbec1","Type":"ContainerStarted","Data":"98e9ef56d5bb70d040b20ee891af20bf13777d6a918506ba8cc1b2d6bd2753e4"} Mar 20 08:44:15 crc kubenswrapper[4903]: I0320 08:44:15.892012 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-08ee-account-create-update-s27bz" event={"ID":"e03f215f-5dde-4e62-82c4-9ab840f6223a","Type":"ContainerDied","Data":"5a08145552271baeac5f2379fb9f1e2cda2261b8b0700f8393a2619afb08299e"} Mar 20 08:44:15 crc kubenswrapper[4903]: I0320 08:44:15.892051 4903 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5a08145552271baeac5f2379fb9f1e2cda2261b8b0700f8393a2619afb08299e" Mar 20 08:44:15 crc kubenswrapper[4903]: I0320 08:44:15.892093 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-08ee-account-create-update-s27bz" Mar 20 08:44:15 crc kubenswrapper[4903]: W0320 08:44:15.982504 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod68afaaf3_f790_4d52_9f29_49870d1950a5.slice/crio-dbec06c13591ccf307d8038ea0e4c3d55c4dc41cf1179dedad0c91841f5890a7 WatchSource:0}: Error finding container dbec06c13591ccf307d8038ea0e4c3d55c4dc41cf1179dedad0c91841f5890a7: Status 404 returned error can't find the container with id dbec06c13591ccf307d8038ea0e4c3d55c4dc41cf1179dedad0c91841f5890a7 Mar 20 08:44:16 crc kubenswrapper[4903]: I0320 08:44:16.517328 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-r7v7q"] Mar 20 08:44:16 crc kubenswrapper[4903]: I0320 08:44:16.530018 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-r7v7q"] Mar 20 08:44:16 crc kubenswrapper[4903]: I0320 08:44:16.588682 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-ngcnd"] Mar 20 08:44:16 crc kubenswrapper[4903]: E0320 08:44:16.589226 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01f22901-e7ea-44d3-bc58-7efffaa493ad" containerName="mariadb-account-create-update" Mar 20 08:44:16 crc kubenswrapper[4903]: I0320 08:44:16.589246 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="01f22901-e7ea-44d3-bc58-7efffaa493ad" containerName="mariadb-account-create-update" Mar 20 08:44:16 crc kubenswrapper[4903]: E0320 08:44:16.589283 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54015026-e605-4781-a02e-57b47c61284e" containerName="mariadb-database-create" Mar 20 08:44:16 crc 
kubenswrapper[4903]: I0320 08:44:16.589290 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="54015026-e605-4781-a02e-57b47c61284e" containerName="mariadb-database-create" Mar 20 08:44:16 crc kubenswrapper[4903]: E0320 08:44:16.589305 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e03f215f-5dde-4e62-82c4-9ab840f6223a" containerName="mariadb-account-create-update" Mar 20 08:44:16 crc kubenswrapper[4903]: I0320 08:44:16.589313 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="e03f215f-5dde-4e62-82c4-9ab840f6223a" containerName="mariadb-account-create-update" Mar 20 08:44:16 crc kubenswrapper[4903]: E0320 08:44:16.589322 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf0235fb-bbd5-4f7e-8c25-8230f6e48f2f" containerName="mariadb-database-create" Mar 20 08:44:16 crc kubenswrapper[4903]: I0320 08:44:16.589328 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf0235fb-bbd5-4f7e-8c25-8230f6e48f2f" containerName="mariadb-database-create" Mar 20 08:44:16 crc kubenswrapper[4903]: I0320 08:44:16.589517 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="01f22901-e7ea-44d3-bc58-7efffaa493ad" containerName="mariadb-account-create-update" Mar 20 08:44:16 crc kubenswrapper[4903]: I0320 08:44:16.589540 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf0235fb-bbd5-4f7e-8c25-8230f6e48f2f" containerName="mariadb-database-create" Mar 20 08:44:16 crc kubenswrapper[4903]: I0320 08:44:16.589552 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="e03f215f-5dde-4e62-82c4-9ab840f6223a" containerName="mariadb-account-create-update" Mar 20 08:44:16 crc kubenswrapper[4903]: I0320 08:44:16.589565 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="54015026-e605-4781-a02e-57b47c61284e" containerName="mariadb-database-create" Mar 20 08:44:16 crc kubenswrapper[4903]: I0320 08:44:16.590282 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-ngcnd" Mar 20 08:44:16 crc kubenswrapper[4903]: I0320 08:44:16.594183 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Mar 20 08:44:16 crc kubenswrapper[4903]: I0320 08:44:16.604147 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-ngcnd"] Mar 20 08:44:16 crc kubenswrapper[4903]: I0320 08:44:16.690502 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/975f9731-683e-4d5e-be2a-e1f824c38513-operator-scripts\") pod \"root-account-create-update-ngcnd\" (UID: \"975f9731-683e-4d5e-be2a-e1f824c38513\") " pod="openstack/root-account-create-update-ngcnd" Mar 20 08:44:16 crc kubenswrapper[4903]: I0320 08:44:16.690606 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r7jn5\" (UniqueName: \"kubernetes.io/projected/975f9731-683e-4d5e-be2a-e1f824c38513-kube-api-access-r7jn5\") pod \"root-account-create-update-ngcnd\" (UID: \"975f9731-683e-4d5e-be2a-e1f824c38513\") " pod="openstack/root-account-create-update-ngcnd" Mar 20 08:44:16 crc kubenswrapper[4903]: I0320 08:44:16.792728 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/975f9731-683e-4d5e-be2a-e1f824c38513-operator-scripts\") pod \"root-account-create-update-ngcnd\" (UID: \"975f9731-683e-4d5e-be2a-e1f824c38513\") " pod="openstack/root-account-create-update-ngcnd" Mar 20 08:44:16 crc kubenswrapper[4903]: I0320 08:44:16.792962 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r7jn5\" (UniqueName: \"kubernetes.io/projected/975f9731-683e-4d5e-be2a-e1f824c38513-kube-api-access-r7jn5\") pod \"root-account-create-update-ngcnd\" (UID: \"975f9731-683e-4d5e-be2a-e1f824c38513\") " pod="openstack/root-account-create-update-ngcnd" Mar 20 08:44:16 crc kubenswrapper[4903]: I0320 08:44:16.794365 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/975f9731-683e-4d5e-be2a-e1f824c38513-operator-scripts\") pod \"root-account-create-update-ngcnd\" (UID: \"975f9731-683e-4d5e-be2a-e1f824c38513\") " pod="openstack/root-account-create-update-ngcnd" Mar 20 08:44:16 crc kubenswrapper[4903]: I0320 08:44:16.831344 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r7jn5\" (UniqueName: \"kubernetes.io/projected/975f9731-683e-4d5e-be2a-e1f824c38513-kube-api-access-r7jn5\") pod \"root-account-create-update-ngcnd\" (UID: \"975f9731-683e-4d5e-be2a-e1f824c38513\") " pod="openstack/root-account-create-update-ngcnd" Mar 20 08:44:16 crc kubenswrapper[4903]: I0320 08:44:16.905079 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-9mg76" event={"ID":"fe39352c-cfbc-4c65-8ad2-0e51b78bbec1","Type":"ContainerDied","Data":"315928000ce5771f2b1094d24598a1282273c9d543fd01f4d6e5739b56bc3bab"} Mar 20 08:44:16 crc kubenswrapper[4903]: I0320 08:44:16.904796 4903 generic.go:334] "Generic (PLEG): container finished" podID="fe39352c-cfbc-4c65-8ad2-0e51b78bbec1" containerID="315928000ce5771f2b1094d24598a1282273c9d543fd01f4d6e5739b56bc3bab" exitCode=0 Mar 20 08:44:16 crc kubenswrapper[4903]: I0320 08:44:16.908950 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-ngcnd" Mar 20 08:44:16 crc kubenswrapper[4903]: I0320 08:44:16.909201 4903 generic.go:334] "Generic (PLEG): container finished" podID="68afaaf3-f790-4d52-9f29-49870d1950a5" containerID="2be0f124027cc325dc4a812c2195ae34353c0f3552687b7f56a5dd08cd7ffdbd" exitCode=0 Mar 20 08:44:16 crc kubenswrapper[4903]: I0320 08:44:16.909268 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-d2f7-account-create-update-9w2np" event={"ID":"68afaaf3-f790-4d52-9f29-49870d1950a5","Type":"ContainerDied","Data":"2be0f124027cc325dc4a812c2195ae34353c0f3552687b7f56a5dd08cd7ffdbd"} Mar 20 08:44:16 crc kubenswrapper[4903]: I0320 08:44:16.909321 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-d2f7-account-create-update-9w2np" event={"ID":"68afaaf3-f790-4d52-9f29-49870d1950a5","Type":"ContainerStarted","Data":"dbec06c13591ccf307d8038ea0e4c3d55c4dc41cf1179dedad0c91841f5890a7"} Mar 20 08:44:17 crc kubenswrapper[4903]: W0320 08:44:17.467227 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod975f9731_683e_4d5e_be2a_e1f824c38513.slice/crio-e9f8e7f197310439bd84d05ad820e12d560cd02db8e98ba364bba628269bffe9 WatchSource:0}: Error finding container e9f8e7f197310439bd84d05ad820e12d560cd02db8e98ba364bba628269bffe9: Status 404 returned error can't find the container with id e9f8e7f197310439bd84d05ad820e12d560cd02db8e98ba364bba628269bffe9 Mar 20 08:44:17 crc kubenswrapper[4903]: I0320 08:44:17.474095 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-ngcnd"] Mar 20 08:44:17 crc kubenswrapper[4903]: I0320 08:44:17.506691 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c2137443-2f65-4c2b-87d2-8886f55a7ce7" path="/var/lib/kubelet/pods/c2137443-2f65-4c2b-87d2-8886f55a7ce7/volumes" Mar 20 08:44:17 crc kubenswrapper[4903]: I0320 08:44:17.598586 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Mar 20 08:44:17 crc kubenswrapper[4903]: I0320 08:44:17.917927 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-ngcnd" event={"ID":"975f9731-683e-4d5e-be2a-e1f824c38513","Type":"ContainerStarted","Data":"2a90a2925a40aac67b4421843577ba8b3be4d2b390467377735fe14ae6a6c32f"} Mar 20 08:44:17 crc kubenswrapper[4903]: I0320 08:44:17.918011 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-ngcnd" event={"ID":"975f9731-683e-4d5e-be2a-e1f824c38513","Type":"ContainerStarted","Data":"e9f8e7f197310439bd84d05ad820e12d560cd02db8e98ba364bba628269bffe9"} Mar 20 08:44:17 crc kubenswrapper[4903]: I0320 08:44:17.955385 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/root-account-create-update-ngcnd" podStartSLOduration=1.955367384 podStartE2EDuration="1.955367384s" podCreationTimestamp="2026-03-20 08:44:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:44:17.952491164 +0000 UTC m=+1283.169391489" watchObservedRunningTime="2026-03-20 08:44:17.955367384 +0000 UTC m=+1283.172267699" Mar 20 08:44:18 crc kubenswrapper[4903]: I0320 08:44:18.314384 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-9mg76" Mar 20 08:44:18 crc kubenswrapper[4903]: I0320 08:44:18.374160 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-d2f7-account-create-update-9w2np" Mar 20 08:44:18 crc kubenswrapper[4903]: I0320 08:44:18.442889 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kwz75\" (UniqueName: \"kubernetes.io/projected/fe39352c-cfbc-4c65-8ad2-0e51b78bbec1-kube-api-access-kwz75\") pod \"fe39352c-cfbc-4c65-8ad2-0e51b78bbec1\" (UID: \"fe39352c-cfbc-4c65-8ad2-0e51b78bbec1\") " Mar 20 08:44:18 crc kubenswrapper[4903]: I0320 08:44:18.443291 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p5g78\" (UniqueName: \"kubernetes.io/projected/68afaaf3-f790-4d52-9f29-49870d1950a5-kube-api-access-p5g78\") pod \"68afaaf3-f790-4d52-9f29-49870d1950a5\" (UID: \"68afaaf3-f790-4d52-9f29-49870d1950a5\") " Mar 20 08:44:18 crc kubenswrapper[4903]: I0320 08:44:18.443430 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fe39352c-cfbc-4c65-8ad2-0e51b78bbec1-operator-scripts\") pod \"fe39352c-cfbc-4c65-8ad2-0e51b78bbec1\" (UID: \"fe39352c-cfbc-4c65-8ad2-0e51b78bbec1\") " Mar 20 08:44:18 crc kubenswrapper[4903]: I0320 08:44:18.443483 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/68afaaf3-f790-4d52-9f29-49870d1950a5-operator-scripts\") pod \"68afaaf3-f790-4d52-9f29-49870d1950a5\" (UID: \"68afaaf3-f790-4d52-9f29-49870d1950a5\") " Mar 20 08:44:18 crc kubenswrapper[4903]: I0320 08:44:18.444723 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fe39352c-cfbc-4c65-8ad2-0e51b78bbec1-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "fe39352c-cfbc-4c65-8ad2-0e51b78bbec1" (UID: "fe39352c-cfbc-4c65-8ad2-0e51b78bbec1"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:44:18 crc kubenswrapper[4903]: I0320 08:44:18.445217 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/68afaaf3-f790-4d52-9f29-49870d1950a5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "68afaaf3-f790-4d52-9f29-49870d1950a5" (UID: "68afaaf3-f790-4d52-9f29-49870d1950a5"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:44:18 crc kubenswrapper[4903]: I0320 08:44:18.452407 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68afaaf3-f790-4d52-9f29-49870d1950a5-kube-api-access-p5g78" (OuterVolumeSpecName: "kube-api-access-p5g78") pod "68afaaf3-f790-4d52-9f29-49870d1950a5" (UID: "68afaaf3-f790-4d52-9f29-49870d1950a5"). InnerVolumeSpecName "kube-api-access-p5g78". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:44:18 crc kubenswrapper[4903]: I0320 08:44:18.468507 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe39352c-cfbc-4c65-8ad2-0e51b78bbec1-kube-api-access-kwz75" (OuterVolumeSpecName: "kube-api-access-kwz75") pod "fe39352c-cfbc-4c65-8ad2-0e51b78bbec1" (UID: "fe39352c-cfbc-4c65-8ad2-0e51b78bbec1"). InnerVolumeSpecName "kube-api-access-kwz75". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:44:18 crc kubenswrapper[4903]: I0320 08:44:18.546169 4903 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fe39352c-cfbc-4c65-8ad2-0e51b78bbec1-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 08:44:18 crc kubenswrapper[4903]: I0320 08:44:18.546212 4903 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/68afaaf3-f790-4d52-9f29-49870d1950a5-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 08:44:18 crc kubenswrapper[4903]: I0320 08:44:18.546225 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kwz75\" (UniqueName: \"kubernetes.io/projected/fe39352c-cfbc-4c65-8ad2-0e51b78bbec1-kube-api-access-kwz75\") on node \"crc\" DevicePath \"\"" Mar 20 08:44:18 crc kubenswrapper[4903]: I0320 08:44:18.546238 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p5g78\" (UniqueName: \"kubernetes.io/projected/68afaaf3-f790-4d52-9f29-49870d1950a5-kube-api-access-p5g78\") on node \"crc\" DevicePath \"\"" Mar 20 08:44:18 crc kubenswrapper[4903]: I0320 08:44:18.931468 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-9mg76" event={"ID":"fe39352c-cfbc-4c65-8ad2-0e51b78bbec1","Type":"ContainerDied","Data":"98e9ef56d5bb70d040b20ee891af20bf13777d6a918506ba8cc1b2d6bd2753e4"} Mar 20 08:44:18 crc kubenswrapper[4903]: I0320 08:44:18.931559 4903 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="98e9ef56d5bb70d040b20ee891af20bf13777d6a918506ba8cc1b2d6bd2753e4" Mar 20 08:44:18 crc kubenswrapper[4903]: I0320 08:44:18.931510 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-9mg76" Mar 20 08:44:18 crc kubenswrapper[4903]: I0320 08:44:18.933489 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-d2f7-account-create-update-9w2np" Mar 20 08:44:18 crc kubenswrapper[4903]: I0320 08:44:18.933509 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-d2f7-account-create-update-9w2np" event={"ID":"68afaaf3-f790-4d52-9f29-49870d1950a5","Type":"ContainerDied","Data":"dbec06c13591ccf307d8038ea0e4c3d55c4dc41cf1179dedad0c91841f5890a7"} Mar 20 08:44:18 crc kubenswrapper[4903]: I0320 08:44:18.933562 4903 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dbec06c13591ccf307d8038ea0e4c3d55c4dc41cf1179dedad0c91841f5890a7" Mar 20 08:44:18 crc kubenswrapper[4903]: I0320 08:44:18.935489 4903 generic.go:334] "Generic (PLEG): container finished" podID="975f9731-683e-4d5e-be2a-e1f824c38513" containerID="2a90a2925a40aac67b4421843577ba8b3be4d2b390467377735fe14ae6a6c32f" exitCode=0 Mar 20 08:44:18 crc kubenswrapper[4903]: I0320 08:44:18.935542 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-ngcnd" event={"ID":"975f9731-683e-4d5e-be2a-e1f824c38513","Type":"ContainerDied","Data":"2a90a2925a40aac67b4421843577ba8b3be4d2b390467377735fe14ae6a6c32f"} Mar 20 08:44:19 crc kubenswrapper[4903]: I0320 08:44:19.258993 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/ccedd84e-d0d0-40b8-812c-3a57b41aee98-etc-swift\") pod \"swift-storage-0\" (UID: \"ccedd84e-d0d0-40b8-812c-3a57b41aee98\") " pod="openstack/swift-storage-0" Mar 20 08:44:19 crc kubenswrapper[4903]: E0320 08:44:19.259224 4903 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 20 08:44:19 crc kubenswrapper[4903]: E0320 08:44:19.259604 4903 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 20 08:44:19 crc kubenswrapper[4903]: E0320 08:44:19.259697 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ccedd84e-d0d0-40b8-812c-3a57b41aee98-etc-swift podName:ccedd84e-d0d0-40b8-812c-3a57b41aee98 nodeName:}" failed. No retries permitted until 2026-03-20 08:44:35.259660018 +0000 UTC m=+1300.476560353 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/ccedd84e-d0d0-40b8-812c-3a57b41aee98-etc-swift") pod "swift-storage-0" (UID: "ccedd84e-d0d0-40b8-812c-3a57b41aee98") : configmap "swift-ring-files" not found Mar 20 08:44:19 crc kubenswrapper[4903]: I0320 08:44:19.949689 4903 generic.go:334] "Generic (PLEG): container finished" podID="1642398c-5346-421f-86c2-baec0001304e" containerID="ae7ef07f6710e3c50df8fdfb347b54fac262a0797dc427c28c37c346ed91d089" exitCode=0 Mar 20 08:44:19 crc kubenswrapper[4903]: I0320 08:44:19.950240 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-25k6h" event={"ID":"1642398c-5346-421f-86c2-baec0001304e","Type":"ContainerDied","Data":"ae7ef07f6710e3c50df8fdfb347b54fac262a0797dc427c28c37c346ed91d089"} Mar 20 08:44:20 crc kubenswrapper[4903]: I0320 08:44:20.068534 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-z5s8l"] Mar 20 08:44:20 crc kubenswrapper[4903]: E0320 08:44:20.068939 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68afaaf3-f790-4d52-9f29-49870d1950a5" containerName="mariadb-account-create-update" Mar 20 08:44:20 crc kubenswrapper[4903]: I0320 08:44:20.068955 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="68afaaf3-f790-4d52-9f29-49870d1950a5" containerName="mariadb-account-create-update" Mar 20 08:44:20 crc kubenswrapper[4903]: E0320 08:44:20.068971 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe39352c-cfbc-4c65-8ad2-0e51b78bbec1" containerName="mariadb-database-create" Mar 20 08:44:20 crc kubenswrapper[4903]: I0320 08:44:20.068979 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe39352c-cfbc-4c65-8ad2-0e51b78bbec1" containerName="mariadb-database-create" Mar 20 08:44:20 crc kubenswrapper[4903]: I0320 08:44:20.069226 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="68afaaf3-f790-4d52-9f29-49870d1950a5" containerName="mariadb-account-create-update" Mar 20 08:44:20 crc kubenswrapper[4903]: I0320 08:44:20.069247 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe39352c-cfbc-4c65-8ad2-0e51b78bbec1" containerName="mariadb-database-create" Mar 20 08:44:20 crc kubenswrapper[4903]: I0320 08:44:20.069941 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-z5s8l" Mar 20 08:44:20 crc kubenswrapper[4903]: I0320 08:44:20.076982 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Mar 20 08:44:20 crc kubenswrapper[4903]: I0320 08:44:20.077273 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-4d2kn" Mar 20 08:44:20 crc kubenswrapper[4903]: I0320 08:44:20.087156 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-z5s8l"] Mar 20 08:44:20 crc kubenswrapper[4903]: I0320 08:44:20.096219 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nx948\" (UniqueName: \"kubernetes.io/projected/83d25fdc-23fe-48a2-855e-5a907ad53d68-kube-api-access-nx948\") pod \"glance-db-sync-z5s8l\" (UID: \"83d25fdc-23fe-48a2-855e-5a907ad53d68\") " pod="openstack/glance-db-sync-z5s8l" Mar 20 08:44:20 crc kubenswrapper[4903]: I0320 08:44:20.096279 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/83d25fdc-23fe-48a2-855e-5a907ad53d68-config-data\") pod \"glance-db-sync-z5s8l\" (UID: \"83d25fdc-23fe-48a2-855e-5a907ad53d68\") " pod="openstack/glance-db-sync-z5s8l" Mar 20 08:44:20 crc kubenswrapper[4903]: I0320 08:44:20.096341 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/83d25fdc-23fe-48a2-855e-5a907ad53d68-db-sync-config-data\") pod \"glance-db-sync-z5s8l\" (UID: \"83d25fdc-23fe-48a2-855e-5a907ad53d68\") " pod="openstack/glance-db-sync-z5s8l" Mar 20 08:44:20 crc kubenswrapper[4903]: I0320 08:44:20.096369 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83d25fdc-23fe-48a2-855e-5a907ad53d68-combined-ca-bundle\") pod \"glance-db-sync-z5s8l\" (UID: \"83d25fdc-23fe-48a2-855e-5a907ad53d68\") " pod="openstack/glance-db-sync-z5s8l" Mar 20 08:44:20 crc kubenswrapper[4903]: I0320 08:44:20.199135 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83d25fdc-23fe-48a2-855e-5a907ad53d68-combined-ca-bundle\") pod \"glance-db-sync-z5s8l\" (UID: \"83d25fdc-23fe-48a2-855e-5a907ad53d68\") " pod="openstack/glance-db-sync-z5s8l" Mar 20 08:44:20 crc kubenswrapper[4903]: I0320 08:44:20.199329 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nx948\" (UniqueName: \"kubernetes.io/projected/83d25fdc-23fe-48a2-855e-5a907ad53d68-kube-api-access-nx948\") pod \"glance-db-sync-z5s8l\" (UID: \"83d25fdc-23fe-48a2-855e-5a907ad53d68\") " pod="openstack/glance-db-sync-z5s8l" Mar 20 08:44:20 crc kubenswrapper[4903]: I0320 08:44:20.199359 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/83d25fdc-23fe-48a2-855e-5a907ad53d68-config-data\") pod \"glance-db-sync-z5s8l\" (UID: \"83d25fdc-23fe-48a2-855e-5a907ad53d68\") " pod="openstack/glance-db-sync-z5s8l" Mar 20 08:44:20 crc kubenswrapper[4903]: I0320 08:44:20.199419 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/83d25fdc-23fe-48a2-855e-5a907ad53d68-db-sync-config-data\") pod 
\"glance-db-sync-z5s8l\" (UID: \"83d25fdc-23fe-48a2-855e-5a907ad53d68\") " pod="openstack/glance-db-sync-z5s8l" Mar 20 08:44:20 crc kubenswrapper[4903]: I0320 08:44:20.207020 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/83d25fdc-23fe-48a2-855e-5a907ad53d68-db-sync-config-data\") pod \"glance-db-sync-z5s8l\" (UID: \"83d25fdc-23fe-48a2-855e-5a907ad53d68\") " pod="openstack/glance-db-sync-z5s8l" Mar 20 08:44:20 crc kubenswrapper[4903]: I0320 08:44:20.207102 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/83d25fdc-23fe-48a2-855e-5a907ad53d68-config-data\") pod \"glance-db-sync-z5s8l\" (UID: \"83d25fdc-23fe-48a2-855e-5a907ad53d68\") " pod="openstack/glance-db-sync-z5s8l" Mar 20 08:44:20 crc kubenswrapper[4903]: I0320 08:44:20.208136 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83d25fdc-23fe-48a2-855e-5a907ad53d68-combined-ca-bundle\") pod \"glance-db-sync-z5s8l\" (UID: \"83d25fdc-23fe-48a2-855e-5a907ad53d68\") " pod="openstack/glance-db-sync-z5s8l" Mar 20 08:44:20 crc kubenswrapper[4903]: I0320 08:44:20.220420 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nx948\" (UniqueName: \"kubernetes.io/projected/83d25fdc-23fe-48a2-855e-5a907ad53d68-kube-api-access-nx948\") pod \"glance-db-sync-z5s8l\" (UID: \"83d25fdc-23fe-48a2-855e-5a907ad53d68\") " pod="openstack/glance-db-sync-z5s8l" Mar 20 08:44:20 crc kubenswrapper[4903]: I0320 08:44:20.435841 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-ngcnd" Mar 20 08:44:20 crc kubenswrapper[4903]: I0320 08:44:20.455947 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-z5s8l" Mar 20 08:44:20 crc kubenswrapper[4903]: I0320 08:44:20.504656 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/975f9731-683e-4d5e-be2a-e1f824c38513-operator-scripts\") pod \"975f9731-683e-4d5e-be2a-e1f824c38513\" (UID: \"975f9731-683e-4d5e-be2a-e1f824c38513\") " Mar 20 08:44:20 crc kubenswrapper[4903]: I0320 08:44:20.504850 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r7jn5\" (UniqueName: \"kubernetes.io/projected/975f9731-683e-4d5e-be2a-e1f824c38513-kube-api-access-r7jn5\") pod \"975f9731-683e-4d5e-be2a-e1f824c38513\" (UID: \"975f9731-683e-4d5e-be2a-e1f824c38513\") " Mar 20 08:44:20 crc kubenswrapper[4903]: I0320 08:44:20.506406 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/975f9731-683e-4d5e-be2a-e1f824c38513-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "975f9731-683e-4d5e-be2a-e1f824c38513" (UID: "975f9731-683e-4d5e-be2a-e1f824c38513"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:44:20 crc kubenswrapper[4903]: I0320 08:44:20.519648 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/975f9731-683e-4d5e-be2a-e1f824c38513-kube-api-access-r7jn5" (OuterVolumeSpecName: "kube-api-access-r7jn5") pod "975f9731-683e-4d5e-be2a-e1f824c38513" (UID: "975f9731-683e-4d5e-be2a-e1f824c38513"). InnerVolumeSpecName "kube-api-access-r7jn5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:44:21 crc kubenswrapper[4903]: I0320 08:44:20.607311 4903 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/975f9731-683e-4d5e-be2a-e1f824c38513-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 08:44:21 crc kubenswrapper[4903]: I0320 08:44:20.607348 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r7jn5\" (UniqueName: \"kubernetes.io/projected/975f9731-683e-4d5e-be2a-e1f824c38513-kube-api-access-r7jn5\") on node \"crc\" DevicePath \"\"" Mar 20 08:44:21 crc kubenswrapper[4903]: I0320 08:44:20.962164 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-ngcnd" Mar 20 08:44:21 crc kubenswrapper[4903]: I0320 08:44:20.963018 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-ngcnd" event={"ID":"975f9731-683e-4d5e-be2a-e1f824c38513","Type":"ContainerDied","Data":"e9f8e7f197310439bd84d05ad820e12d560cd02db8e98ba364bba628269bffe9"} Mar 20 08:44:21 crc kubenswrapper[4903]: I0320 08:44:20.963067 4903 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e9f8e7f197310439bd84d05ad820e12d560cd02db8e98ba364bba628269bffe9" Mar 20 08:44:21 crc kubenswrapper[4903]: I0320 08:44:21.594923 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-25k6h" Mar 20 08:44:21 crc kubenswrapper[4903]: I0320 08:44:21.735348 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1642398c-5346-421f-86c2-baec0001304e-scripts\") pod \"1642398c-5346-421f-86c2-baec0001304e\" (UID: \"1642398c-5346-421f-86c2-baec0001304e\") " Mar 20 08:44:21 crc kubenswrapper[4903]: I0320 08:44:21.735967 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/1642398c-5346-421f-86c2-baec0001304e-swiftconf\") pod \"1642398c-5346-421f-86c2-baec0001304e\" (UID: \"1642398c-5346-421f-86c2-baec0001304e\") " Mar 20 08:44:21 crc kubenswrapper[4903]: I0320 08:44:21.736111 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/1642398c-5346-421f-86c2-baec0001304e-dispersionconf\") pod \"1642398c-5346-421f-86c2-baec0001304e\" (UID: \"1642398c-5346-421f-86c2-baec0001304e\") " Mar 20 08:44:21 crc kubenswrapper[4903]: I0320 08:44:21.736179 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/1642398c-5346-421f-86c2-baec0001304e-etc-swift\") pod \"1642398c-5346-421f-86c2-baec0001304e\" (UID: \"1642398c-5346-421f-86c2-baec0001304e\") " Mar 20 08:44:21 crc kubenswrapper[4903]: I0320 08:44:21.737285 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1642398c-5346-421f-86c2-baec0001304e-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "1642398c-5346-421f-86c2-baec0001304e" (UID: "1642398c-5346-421f-86c2-baec0001304e"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 08:44:21 crc kubenswrapper[4903]: I0320 08:44:21.737456 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/1642398c-5346-421f-86c2-baec0001304e-ring-data-devices\") pod \"1642398c-5346-421f-86c2-baec0001304e\" (UID: \"1642398c-5346-421f-86c2-baec0001304e\") " Mar 20 08:44:21 crc kubenswrapper[4903]: I0320 08:44:21.738677 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1642398c-5346-421f-86c2-baec0001304e-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "1642398c-5346-421f-86c2-baec0001304e" (UID: "1642398c-5346-421f-86c2-baec0001304e"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:44:21 crc kubenswrapper[4903]: I0320 08:44:21.738748 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nlqw5\" (UniqueName: \"kubernetes.io/projected/1642398c-5346-421f-86c2-baec0001304e-kube-api-access-nlqw5\") pod \"1642398c-5346-421f-86c2-baec0001304e\" (UID: \"1642398c-5346-421f-86c2-baec0001304e\") " Mar 20 08:44:21 crc kubenswrapper[4903]: I0320 08:44:21.738814 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1642398c-5346-421f-86c2-baec0001304e-combined-ca-bundle\") pod \"1642398c-5346-421f-86c2-baec0001304e\" (UID: \"1642398c-5346-421f-86c2-baec0001304e\") " Mar 20 08:44:21 crc kubenswrapper[4903]: I0320 08:44:21.739777 4903 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/1642398c-5346-421f-86c2-baec0001304e-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 20 08:44:21 crc kubenswrapper[4903]: I0320 08:44:21.739804 4903 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/1642398c-5346-421f-86c2-baec0001304e-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 20 08:44:21 crc kubenswrapper[4903]: I0320 08:44:21.742683 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1642398c-5346-421f-86c2-baec0001304e-kube-api-access-nlqw5" (OuterVolumeSpecName: "kube-api-access-nlqw5") pod "1642398c-5346-421f-86c2-baec0001304e" (UID: "1642398c-5346-421f-86c2-baec0001304e"). InnerVolumeSpecName "kube-api-access-nlqw5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:44:21 crc kubenswrapper[4903]: I0320 08:44:21.757801 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1642398c-5346-421f-86c2-baec0001304e-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "1642398c-5346-421f-86c2-baec0001304e" (UID: "1642398c-5346-421f-86c2-baec0001304e"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:44:21 crc kubenswrapper[4903]: I0320 08:44:21.760164 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1642398c-5346-421f-86c2-baec0001304e-scripts" (OuterVolumeSpecName: "scripts") pod "1642398c-5346-421f-86c2-baec0001304e" (UID: "1642398c-5346-421f-86c2-baec0001304e"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:44:21 crc kubenswrapper[4903]: I0320 08:44:21.784581 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1642398c-5346-421f-86c2-baec0001304e-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "1642398c-5346-421f-86c2-baec0001304e" (UID: "1642398c-5346-421f-86c2-baec0001304e"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:44:21 crc kubenswrapper[4903]: I0320 08:44:21.792482 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1642398c-5346-421f-86c2-baec0001304e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1642398c-5346-421f-86c2-baec0001304e" (UID: "1642398c-5346-421f-86c2-baec0001304e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:44:21 crc kubenswrapper[4903]: I0320 08:44:21.814797 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-z5s8l"] Mar 20 08:44:21 crc kubenswrapper[4903]: I0320 08:44:21.842481 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nlqw5\" (UniqueName: \"kubernetes.io/projected/1642398c-5346-421f-86c2-baec0001304e-kube-api-access-nlqw5\") on node \"crc\" DevicePath \"\"" Mar 20 08:44:21 crc kubenswrapper[4903]: I0320 08:44:21.842688 4903 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1642398c-5346-421f-86c2-baec0001304e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 08:44:21 crc kubenswrapper[4903]: I0320 08:44:21.842773 4903 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1642398c-5346-421f-86c2-baec0001304e-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 08:44:21 crc kubenswrapper[4903]: I0320 08:44:21.842822 4903 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/1642398c-5346-421f-86c2-baec0001304e-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 20 08:44:21 crc kubenswrapper[4903]: I0320 08:44:21.842870 4903 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/1642398c-5346-421f-86c2-baec0001304e-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 20 08:44:21 crc kubenswrapper[4903]: I0320 08:44:21.970468 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-z5s8l" event={"ID":"83d25fdc-23fe-48a2-855e-5a907ad53d68","Type":"ContainerStarted","Data":"7016f30867579e25dee1a9eb8a6acdf777f78f07dfbf13fc69d6429ad9073755"} Mar 20 08:44:21 crc kubenswrapper[4903]: I0320 08:44:21.972579 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-25k6h" event={"ID":"1642398c-5346-421f-86c2-baec0001304e","Type":"ContainerDied","Data":"ff71b3c6c64e1195b9ccc14c1d44fb626cf59538ad3fe49ad52cb508f5e726cd"} Mar 20 08:44:21 crc kubenswrapper[4903]: I0320 08:44:21.972609 4903 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ff71b3c6c64e1195b9ccc14c1d44fb626cf59538ad3fe49ad52cb508f5e726cd" Mar 20 08:44:21 crc kubenswrapper[4903]: I0320 08:44:21.972821 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-25k6h" Mar 20 08:44:22 crc kubenswrapper[4903]: I0320 08:44:22.051638 4903 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-wdtrn" podUID="7bbbd0a7-f915-4197-bde8-4f96590c454f" containerName="ovn-controller" probeResult="failure" output=< Mar 20 08:44:22 crc kubenswrapper[4903]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Mar 20 08:44:22 crc kubenswrapper[4903]: > Mar 20 08:44:22 crc kubenswrapper[4903]: I0320 08:44:22.107144 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-chrhv" Mar 20 08:44:27 crc kubenswrapper[4903]: I0320 08:44:27.045641 4903 generic.go:334] "Generic (PLEG): container finished" podID="df937948-08c4-447c-9450-07221ce76552" containerID="fe57b0018bbffb7366eaf34a9f7b2d185d56311f1f577d783cba8ee7a58367b9" exitCode=0 Mar 20 08:44:27 crc kubenswrapper[4903]: I0320 08:44:27.045807 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"df937948-08c4-447c-9450-07221ce76552","Type":"ContainerDied","Data":"fe57b0018bbffb7366eaf34a9f7b2d185d56311f1f577d783cba8ee7a58367b9"} Mar 20 08:44:27 crc kubenswrapper[4903]: I0320 08:44:27.072991 4903 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-wdtrn" podUID="7bbbd0a7-f915-4197-bde8-4f96590c454f" containerName="ovn-controller" probeResult="failure" output=< Mar 20 08:44:27 crc kubenswrapper[4903]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Mar 20 08:44:27 crc kubenswrapper[4903]: > Mar 20 08:44:27 crc kubenswrapper[4903]: I0320 08:44:27.118512 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-chrhv" Mar 20 08:44:27 crc kubenswrapper[4903]: I0320 08:44:27.342099 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-wdtrn-config-259s9"] Mar 20 08:44:27 crc kubenswrapper[4903]: E0320 08:44:27.342576 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1642398c-5346-421f-86c2-baec0001304e" containerName="swift-ring-rebalance" Mar 20 08:44:27 crc kubenswrapper[4903]: I0320 08:44:27.342591 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="1642398c-5346-421f-86c2-baec0001304e" containerName="swift-ring-rebalance" Mar 20 08:44:27 crc kubenswrapper[4903]: E0320 08:44:27.342625 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="975f9731-683e-4d5e-be2a-e1f824c38513" containerName="mariadb-account-create-update" Mar 20 08:44:27 crc kubenswrapper[4903]: I0320 08:44:27.342633 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="975f9731-683e-4d5e-be2a-e1f824c38513" containerName="mariadb-account-create-update" Mar 20 08:44:27 crc kubenswrapper[4903]: I0320 08:44:27.342837 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="975f9731-683e-4d5e-be2a-e1f824c38513" containerName="mariadb-account-create-update" Mar 20 08:44:27 crc kubenswrapper[4903]: I0320 08:44:27.342854 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="1642398c-5346-421f-86c2-baec0001304e" containerName="swift-ring-rebalance" Mar 20 08:44:27 crc kubenswrapper[4903]: I0320 08:44:27.343530 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-wdtrn-config-259s9" Mar 20 08:44:27 crc kubenswrapper[4903]: I0320 08:44:27.346609 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Mar 20 08:44:27 crc kubenswrapper[4903]: I0320 08:44:27.367087 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-wdtrn-config-259s9"] Mar 20 08:44:27 crc kubenswrapper[4903]: I0320 08:44:27.457958 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/3355bb2f-bbe7-496c-a6f7-34bc5863032f-additional-scripts\") pod \"ovn-controller-wdtrn-config-259s9\" (UID: \"3355bb2f-bbe7-496c-a6f7-34bc5863032f\") " pod="openstack/ovn-controller-wdtrn-config-259s9" Mar 20 08:44:27 crc kubenswrapper[4903]: I0320 08:44:27.458277 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3355bb2f-bbe7-496c-a6f7-34bc5863032f-scripts\") pod \"ovn-controller-wdtrn-config-259s9\" (UID: \"3355bb2f-bbe7-496c-a6f7-34bc5863032f\") " pod="openstack/ovn-controller-wdtrn-config-259s9" Mar 20 08:44:27 crc kubenswrapper[4903]: I0320 08:44:27.458533 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/3355bb2f-bbe7-496c-a6f7-34bc5863032f-var-run-ovn\") pod \"ovn-controller-wdtrn-config-259s9\" (UID: \"3355bb2f-bbe7-496c-a6f7-34bc5863032f\") " pod="openstack/ovn-controller-wdtrn-config-259s9" Mar 20 08:44:27 crc kubenswrapper[4903]: I0320 08:44:27.458778 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/3355bb2f-bbe7-496c-a6f7-34bc5863032f-var-run\") pod \"ovn-controller-wdtrn-config-259s9\" (UID: \"3355bb2f-bbe7-496c-a6f7-34bc5863032f\") " pod="openstack/ovn-controller-wdtrn-config-259s9" Mar 20 08:44:27 crc kubenswrapper[4903]: I0320 08:44:27.458830 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zhr6s\" (UniqueName: \"kubernetes.io/projected/3355bb2f-bbe7-496c-a6f7-34bc5863032f-kube-api-access-zhr6s\") pod \"ovn-controller-wdtrn-config-259s9\" (UID: \"3355bb2f-bbe7-496c-a6f7-34bc5863032f\") " pod="openstack/ovn-controller-wdtrn-config-259s9" Mar 20 08:44:27 crc kubenswrapper[4903]: I0320 08:44:27.458878 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/3355bb2f-bbe7-496c-a6f7-34bc5863032f-var-log-ovn\") pod \"ovn-controller-wdtrn-config-259s9\" (UID: \"3355bb2f-bbe7-496c-a6f7-34bc5863032f\") " pod="openstack/ovn-controller-wdtrn-config-259s9" Mar 20 08:44:27 crc kubenswrapper[4903]: I0320 08:44:27.560631 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/3355bb2f-bbe7-496c-a6f7-34bc5863032f-var-run-ovn\") pod \"ovn-controller-wdtrn-config-259s9\" (UID: \"3355bb2f-bbe7-496c-a6f7-34bc5863032f\") " pod="openstack/ovn-controller-wdtrn-config-259s9" Mar 20 08:44:27 crc kubenswrapper[4903]: I0320 08:44:27.560725 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/3355bb2f-bbe7-496c-a6f7-34bc5863032f-var-run\") pod 
\"ovn-controller-wdtrn-config-259s9\" (UID: \"3355bb2f-bbe7-496c-a6f7-34bc5863032f\") " pod="openstack/ovn-controller-wdtrn-config-259s9" Mar 20 08:44:27 crc kubenswrapper[4903]: I0320 08:44:27.560746 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zhr6s\" (UniqueName: \"kubernetes.io/projected/3355bb2f-bbe7-496c-a6f7-34bc5863032f-kube-api-access-zhr6s\") pod \"ovn-controller-wdtrn-config-259s9\" (UID: \"3355bb2f-bbe7-496c-a6f7-34bc5863032f\") " pod="openstack/ovn-controller-wdtrn-config-259s9" Mar 20 08:44:27 crc kubenswrapper[4903]: I0320 08:44:27.560765 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/3355bb2f-bbe7-496c-a6f7-34bc5863032f-var-log-ovn\") pod \"ovn-controller-wdtrn-config-259s9\" (UID: \"3355bb2f-bbe7-496c-a6f7-34bc5863032f\") " pod="openstack/ovn-controller-wdtrn-config-259s9" Mar 20 08:44:27 crc kubenswrapper[4903]: I0320 08:44:27.560875 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/3355bb2f-bbe7-496c-a6f7-34bc5863032f-additional-scripts\") pod \"ovn-controller-wdtrn-config-259s9\" (UID: \"3355bb2f-bbe7-496c-a6f7-34bc5863032f\") " pod="openstack/ovn-controller-wdtrn-config-259s9" Mar 20 08:44:27 crc kubenswrapper[4903]: I0320 08:44:27.560961 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3355bb2f-bbe7-496c-a6f7-34bc5863032f-scripts\") pod \"ovn-controller-wdtrn-config-259s9\" (UID: \"3355bb2f-bbe7-496c-a6f7-34bc5863032f\") " pod="openstack/ovn-controller-wdtrn-config-259s9" Mar 20 08:44:27 crc kubenswrapper[4903]: I0320 08:44:27.562667 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/3355bb2f-bbe7-496c-a6f7-34bc5863032f-var-run-ovn\") pod \"ovn-controller-wdtrn-config-259s9\" (UID: \"3355bb2f-bbe7-496c-a6f7-34bc5863032f\") " pod="openstack/ovn-controller-wdtrn-config-259s9" Mar 20 08:44:27 crc kubenswrapper[4903]: I0320 08:44:27.562684 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/3355bb2f-bbe7-496c-a6f7-34bc5863032f-var-log-ovn\") pod \"ovn-controller-wdtrn-config-259s9\" (UID: \"3355bb2f-bbe7-496c-a6f7-34bc5863032f\") " pod="openstack/ovn-controller-wdtrn-config-259s9" Mar 20 08:44:27 crc kubenswrapper[4903]: I0320 08:44:27.562762 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/3355bb2f-bbe7-496c-a6f7-34bc5863032f-var-run\") pod \"ovn-controller-wdtrn-config-259s9\" (UID: \"3355bb2f-bbe7-496c-a6f7-34bc5863032f\") " pod="openstack/ovn-controller-wdtrn-config-259s9" Mar 20 08:44:27 crc kubenswrapper[4903]: I0320 08:44:27.563539 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/3355bb2f-bbe7-496c-a6f7-34bc5863032f-additional-scripts\") pod \"ovn-controller-wdtrn-config-259s9\" (UID: \"3355bb2f-bbe7-496c-a6f7-34bc5863032f\") " pod="openstack/ovn-controller-wdtrn-config-259s9" Mar 20 08:44:27 crc kubenswrapper[4903]: I0320 08:44:27.564539 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3355bb2f-bbe7-496c-a6f7-34bc5863032f-scripts\") pod 
\"ovn-controller-wdtrn-config-259s9\" (UID: \"3355bb2f-bbe7-496c-a6f7-34bc5863032f\") " pod="openstack/ovn-controller-wdtrn-config-259s9" Mar 20 08:44:27 crc kubenswrapper[4903]: I0320 08:44:27.587679 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zhr6s\" (UniqueName: \"kubernetes.io/projected/3355bb2f-bbe7-496c-a6f7-34bc5863032f-kube-api-access-zhr6s\") pod \"ovn-controller-wdtrn-config-259s9\" (UID: \"3355bb2f-bbe7-496c-a6f7-34bc5863032f\") " pod="openstack/ovn-controller-wdtrn-config-259s9" Mar 20 08:44:27 crc kubenswrapper[4903]: I0320 08:44:27.677008 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-wdtrn-config-259s9" Mar 20 08:44:28 crc kubenswrapper[4903]: I0320 08:44:28.062203 4903 generic.go:334] "Generic (PLEG): container finished" podID="888a3fd9-01f8-47b3-b1bb-f2b8b6b96509" containerID="79119e778548ae72a178bf5caebc6d4cbad9e6b178a28bc331be14a3707c0b31" exitCode=0 Mar 20 08:44:28 crc kubenswrapper[4903]: I0320 08:44:28.062329 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"888a3fd9-01f8-47b3-b1bb-f2b8b6b96509","Type":"ContainerDied","Data":"79119e778548ae72a178bf5caebc6d4cbad9e6b178a28bc331be14a3707c0b31"} Mar 20 08:44:32 crc kubenswrapper[4903]: I0320 08:44:32.061855 4903 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-wdtrn" podUID="7bbbd0a7-f915-4197-bde8-4f96590c454f" containerName="ovn-controller" probeResult="failure" output=< Mar 20 08:44:32 crc kubenswrapper[4903]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Mar 20 08:44:32 crc kubenswrapper[4903]: > Mar 20 08:44:35 crc kubenswrapper[4903]: I0320 08:44:35.308481 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/ccedd84e-d0d0-40b8-812c-3a57b41aee98-etc-swift\") pod \"swift-storage-0\" (UID: \"ccedd84e-d0d0-40b8-812c-3a57b41aee98\") " pod="openstack/swift-storage-0" Mar 20 08:44:35 crc kubenswrapper[4903]: I0320 08:44:35.320225 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/ccedd84e-d0d0-40b8-812c-3a57b41aee98-etc-swift\") pod \"swift-storage-0\" (UID: \"ccedd84e-d0d0-40b8-812c-3a57b41aee98\") " pod="openstack/swift-storage-0" Mar 20 08:44:35 crc kubenswrapper[4903]: I0320 08:44:35.533363 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Mar 20 08:44:36 crc kubenswrapper[4903]: I0320 08:44:36.631524 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-wdtrn-config-259s9"] Mar 20 08:44:36 crc kubenswrapper[4903]: W0320 08:44:36.673783 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3355bb2f_bbe7_496c_a6f7_34bc5863032f.slice/crio-6fa6befc7dd67db33a633abb44c3aae709516ba392cc7d281565f6f3bff7867a WatchSource:0}: Error finding container 6fa6befc7dd67db33a633abb44c3aae709516ba392cc7d281565f6f3bff7867a: Status 404 returned error can't find the container with id 6fa6befc7dd67db33a633abb44c3aae709516ba392cc7d281565f6f3bff7867a Mar 20 08:44:36 crc kubenswrapper[4903]: I0320 08:44:36.873939 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Mar 20 08:44:36 crc kubenswrapper[4903]: W0320 08:44:36.881335 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podccedd84e_d0d0_40b8_812c_3a57b41aee98.slice/crio-4cad09a2e774c6f24fe9e4fb211cec9f3ae73443f7f2b6604592174ed6c9ff3d WatchSource:0}: Error finding container 4cad09a2e774c6f24fe9e4fb211cec9f3ae73443f7f2b6604592174ed6c9ff3d: Status 404 returned error can't find the container with id 4cad09a2e774c6f24fe9e4fb211cec9f3ae73443f7f2b6604592174ed6c9ff3d Mar 20 08:44:37 crc kubenswrapper[4903]: I0320 08:44:37.050383 4903 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-wdtrn" podUID="7bbbd0a7-f915-4197-bde8-4f96590c454f" containerName="ovn-controller" probeResult="failure" output=< Mar 20 08:44:37 crc kubenswrapper[4903]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Mar 20 08:44:37 crc kubenswrapper[4903]: > Mar 20 08:44:37 crc kubenswrapper[4903]: I0320 08:44:37.170007 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-wdtrn-config-259s9" event={"ID":"3355bb2f-bbe7-496c-a6f7-34bc5863032f","Type":"ContainerStarted","Data":"4b25264630ca383860adbcdd6a815dbc27ba77766601fe09e1c25af53f7d43d9"} Mar 20 08:44:37 crc kubenswrapper[4903]: I0320 08:44:37.170449 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-wdtrn-config-259s9" event={"ID":"3355bb2f-bbe7-496c-a6f7-34bc5863032f","Type":"ContainerStarted","Data":"6fa6befc7dd67db33a633abb44c3aae709516ba392cc7d281565f6f3bff7867a"} Mar 20 08:44:37 crc kubenswrapper[4903]: I0320 08:44:37.172528 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"888a3fd9-01f8-47b3-b1bb-f2b8b6b96509","Type":"ContainerStarted","Data":"bd2b54448e402fa4a59c36c7f69e6069fbf3f50b83543508e66b689857b68d04"} Mar 20 08:44:37 crc kubenswrapper[4903]: I0320 08:44:37.172718 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Mar 20 08:44:37 crc kubenswrapper[4903]: I0320 08:44:37.174517 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ccedd84e-d0d0-40b8-812c-3a57b41aee98","Type":"ContainerStarted","Data":"4cad09a2e774c6f24fe9e4fb211cec9f3ae73443f7f2b6604592174ed6c9ff3d"} Mar 20 08:44:37 crc kubenswrapper[4903]: I0320 08:44:37.176771 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" 
event={"ID":"df937948-08c4-447c-9450-07221ce76552","Type":"ContainerStarted","Data":"80ffdb9ab414e2251c93f568f37534af22de836fa31c4fbc8bf8f3ec7bf93804"} Mar 20 08:44:37 crc kubenswrapper[4903]: I0320 08:44:37.176954 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Mar 20 08:44:37 crc kubenswrapper[4903]: I0320 08:44:37.178700 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-z5s8l" event={"ID":"83d25fdc-23fe-48a2-855e-5a907ad53d68","Type":"ContainerStarted","Data":"5579e258c969d007185aad27798c30763e8bd565e14091159bc558d34757c14e"} Mar 20 08:44:37 crc kubenswrapper[4903]: I0320 08:44:37.197864 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-wdtrn-config-259s9" podStartSLOduration=10.197840283 podStartE2EDuration="10.197840283s" podCreationTimestamp="2026-03-20 08:44:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:44:37.19192117 +0000 UTC m=+1302.408821485" watchObservedRunningTime="2026-03-20 08:44:37.197840283 +0000 UTC m=+1302.414740598" Mar 20 08:44:37 crc kubenswrapper[4903]: I0320 08:44:37.211239 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-z5s8l" podStartSLOduration=2.797615881 podStartE2EDuration="17.211221398s" podCreationTimestamp="2026-03-20 08:44:20 +0000 UTC" firstStartedPulling="2026-03-20 08:44:21.817695871 +0000 UTC m=+1287.034596196" lastFinishedPulling="2026-03-20 08:44:36.231301388 +0000 UTC m=+1301.448201713" observedRunningTime="2026-03-20 08:44:37.208104691 +0000 UTC m=+1302.425005006" watchObservedRunningTime="2026-03-20 08:44:37.211221398 +0000 UTC m=+1302.428121713" Mar 20 08:44:37 crc kubenswrapper[4903]: I0320 08:44:37.239780 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=46.772805588 podStartE2EDuration="1m23.239760829s" podCreationTimestamp="2026-03-20 08:43:14 +0000 UTC" firstStartedPulling="2026-03-20 08:43:16.390551401 +0000 UTC m=+1221.607451716" lastFinishedPulling="2026-03-20 08:43:52.857506652 +0000 UTC m=+1258.074406957" observedRunningTime="2026-03-20 08:44:37.23195235 +0000 UTC m=+1302.448852665" watchObservedRunningTime="2026-03-20 08:44:37.239760829 +0000 UTC m=+1302.456661144" Mar 20 08:44:37 crc kubenswrapper[4903]: I0320 08:44:37.263555 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=46.50601533 podStartE2EDuration="1m22.263531856s" podCreationTimestamp="2026-03-20 08:43:15 +0000 UTC" firstStartedPulling="2026-03-20 08:43:17.044276555 +0000 UTC m=+1222.261176870" lastFinishedPulling="2026-03-20 08:43:52.801793091 +0000 UTC m=+1258.018693396" observedRunningTime="2026-03-20 08:44:37.259946098 +0000 UTC m=+1302.476846413" watchObservedRunningTime="2026-03-20 08:44:37.263531856 +0000 UTC m=+1302.480432181" Mar 20 08:44:38 crc kubenswrapper[4903]: I0320 08:44:38.188292 4903 generic.go:334] "Generic (PLEG): container finished" podID="3355bb2f-bbe7-496c-a6f7-34bc5863032f" containerID="4b25264630ca383860adbcdd6a815dbc27ba77766601fe09e1c25af53f7d43d9" exitCode=0 Mar 20 08:44:38 crc kubenswrapper[4903]: I0320 08:44:38.190180 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-wdtrn-config-259s9" 
event={"ID":"3355bb2f-bbe7-496c-a6f7-34bc5863032f","Type":"ContainerDied","Data":"4b25264630ca383860adbcdd6a815dbc27ba77766601fe09e1c25af53f7d43d9"} Mar 20 08:44:39 crc kubenswrapper[4903]: I0320 08:44:39.205785 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ccedd84e-d0d0-40b8-812c-3a57b41aee98","Type":"ContainerStarted","Data":"2dff8db1a526f1e028fa24f1cf0a7a3b6053fb85fa1e7ea943870e339ab8ef45"} Mar 20 08:44:39 crc kubenswrapper[4903]: I0320 08:44:39.206167 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ccedd84e-d0d0-40b8-812c-3a57b41aee98","Type":"ContainerStarted","Data":"ab944c856301685ebd77b2e7fef75ffca3eac6f59b30dda9a10daef1b8c70e88"} Mar 20 08:44:39 crc kubenswrapper[4903]: I0320 08:44:39.508659 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-wdtrn-config-259s9" Mar 20 08:44:39 crc kubenswrapper[4903]: I0320 08:44:39.612446 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/3355bb2f-bbe7-496c-a6f7-34bc5863032f-var-log-ovn\") pod \"3355bb2f-bbe7-496c-a6f7-34bc5863032f\" (UID: \"3355bb2f-bbe7-496c-a6f7-34bc5863032f\") " Mar 20 08:44:39 crc kubenswrapper[4903]: I0320 08:44:39.612556 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3355bb2f-bbe7-496c-a6f7-34bc5863032f-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "3355bb2f-bbe7-496c-a6f7-34bc5863032f" (UID: "3355bb2f-bbe7-496c-a6f7-34bc5863032f"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 08:44:39 crc kubenswrapper[4903]: I0320 08:44:39.612611 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/3355bb2f-bbe7-496c-a6f7-34bc5863032f-var-run\") pod \"3355bb2f-bbe7-496c-a6f7-34bc5863032f\" (UID: \"3355bb2f-bbe7-496c-a6f7-34bc5863032f\") " Mar 20 08:44:39 crc kubenswrapper[4903]: I0320 08:44:39.612634 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3355bb2f-bbe7-496c-a6f7-34bc5863032f-var-run" (OuterVolumeSpecName: "var-run") pod "3355bb2f-bbe7-496c-a6f7-34bc5863032f" (UID: "3355bb2f-bbe7-496c-a6f7-34bc5863032f"). InnerVolumeSpecName "var-run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 08:44:39 crc kubenswrapper[4903]: I0320 08:44:39.612676 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zhr6s\" (UniqueName: \"kubernetes.io/projected/3355bb2f-bbe7-496c-a6f7-34bc5863032f-kube-api-access-zhr6s\") pod \"3355bb2f-bbe7-496c-a6f7-34bc5863032f\" (UID: \"3355bb2f-bbe7-496c-a6f7-34bc5863032f\") " Mar 20 08:44:39 crc kubenswrapper[4903]: I0320 08:44:39.612805 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/3355bb2f-bbe7-496c-a6f7-34bc5863032f-additional-scripts\") pod \"3355bb2f-bbe7-496c-a6f7-34bc5863032f\" (UID: \"3355bb2f-bbe7-496c-a6f7-34bc5863032f\") " Mar 20 08:44:39 crc kubenswrapper[4903]: I0320 08:44:39.612993 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3355bb2f-bbe7-496c-a6f7-34bc5863032f-scripts\") pod \"3355bb2f-bbe7-496c-a6f7-34bc5863032f\" (UID: \"3355bb2f-bbe7-496c-a6f7-34bc5863032f\") " Mar 20 08:44:39 crc kubenswrapper[4903]: I0320 08:44:39.613021 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/3355bb2f-bbe7-496c-a6f7-34bc5863032f-var-run-ovn\") pod \"3355bb2f-bbe7-496c-a6f7-34bc5863032f\" (UID: \"3355bb2f-bbe7-496c-a6f7-34bc5863032f\") " Mar 20 08:44:39 crc kubenswrapper[4903]: I0320 08:44:39.613270 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3355bb2f-bbe7-496c-a6f7-34bc5863032f-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "3355bb2f-bbe7-496c-a6f7-34bc5863032f" (UID: "3355bb2f-bbe7-496c-a6f7-34bc5863032f"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 08:44:39 crc kubenswrapper[4903]: I0320 08:44:39.613976 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3355bb2f-bbe7-496c-a6f7-34bc5863032f-scripts" (OuterVolumeSpecName: "scripts") pod "3355bb2f-bbe7-496c-a6f7-34bc5863032f" (UID: "3355bb2f-bbe7-496c-a6f7-34bc5863032f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:44:39 crc kubenswrapper[4903]: I0320 08:44:39.614013 4903 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/3355bb2f-bbe7-496c-a6f7-34bc5863032f-var-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 20 08:44:39 crc kubenswrapper[4903]: I0320 08:44:39.614052 4903 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/3355bb2f-bbe7-496c-a6f7-34bc5863032f-var-log-ovn\") on node \"crc\" DevicePath \"\"" Mar 20 08:44:39 crc kubenswrapper[4903]: I0320 08:44:39.614066 4903 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/3355bb2f-bbe7-496c-a6f7-34bc5863032f-var-run\") on node \"crc\" DevicePath \"\"" Mar 20 08:44:39 crc kubenswrapper[4903]: I0320 08:44:39.615505 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3355bb2f-bbe7-496c-a6f7-34bc5863032f-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "3355bb2f-bbe7-496c-a6f7-34bc5863032f" (UID: "3355bb2f-bbe7-496c-a6f7-34bc5863032f"). InnerVolumeSpecName "additional-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:44:39 crc kubenswrapper[4903]: I0320 08:44:39.620275 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3355bb2f-bbe7-496c-a6f7-34bc5863032f-kube-api-access-zhr6s" (OuterVolumeSpecName: "kube-api-access-zhr6s") pod "3355bb2f-bbe7-496c-a6f7-34bc5863032f" (UID: "3355bb2f-bbe7-496c-a6f7-34bc5863032f"). InnerVolumeSpecName "kube-api-access-zhr6s". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:44:39 crc kubenswrapper[4903]: I0320 08:44:39.716318 4903 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/3355bb2f-bbe7-496c-a6f7-34bc5863032f-additional-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 08:44:39 crc kubenswrapper[4903]: I0320 08:44:39.716365 4903 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3355bb2f-bbe7-496c-a6f7-34bc5863032f-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 08:44:39 crc kubenswrapper[4903]: I0320 08:44:39.716378 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zhr6s\" (UniqueName: \"kubernetes.io/projected/3355bb2f-bbe7-496c-a6f7-34bc5863032f-kube-api-access-zhr6s\") on node \"crc\" DevicePath \"\"" Mar 20 08:44:39 crc kubenswrapper[4903]: I0320 08:44:39.764808 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-wdtrn-config-259s9"] Mar 20 08:44:39 crc kubenswrapper[4903]: I0320 08:44:39.777165 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-wdtrn-config-259s9"] Mar 20 08:44:40 crc kubenswrapper[4903]: I0320 08:44:40.217582 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ccedd84e-d0d0-40b8-812c-3a57b41aee98","Type":"ContainerStarted","Data":"92af1107195bebc153bf9352c2eb36552b3f9f73d72391de8b4a6f21b80a4bb7"} Mar 20 08:44:40 crc kubenswrapper[4903]: I0320 08:44:40.217649 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ccedd84e-d0d0-40b8-812c-3a57b41aee98","Type":"ContainerStarted","Data":"b55e63c14895bcdb40403401521699e38dfe6a783a3590019bd4fb0770fe17f2"} Mar 20 08:44:40 crc kubenswrapper[4903]: I0320 08:44:40.219533 4903 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6fa6befc7dd67db33a633abb44c3aae709516ba392cc7d281565f6f3bff7867a" Mar 20 08:44:40 crc kubenswrapper[4903]: I0320 08:44:40.219605 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-wdtrn-config-259s9" Mar 20 08:44:41 crc kubenswrapper[4903]: I0320 08:44:41.231998 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ccedd84e-d0d0-40b8-812c-3a57b41aee98","Type":"ContainerStarted","Data":"8fc4f339bf9aa2a0245e4c43fd890e2315110e976b25e56bad0be56f10d8abd4"} Mar 20 08:44:41 crc kubenswrapper[4903]: I0320 08:44:41.232619 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ccedd84e-d0d0-40b8-812c-3a57b41aee98","Type":"ContainerStarted","Data":"ac6a6d8fff3600147cf08f30230b0e513eb967f730ef636689918b9d11d7980e"} Mar 20 08:44:41 crc kubenswrapper[4903]: I0320 08:44:41.499291 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3355bb2f-bbe7-496c-a6f7-34bc5863032f" path="/var/lib/kubelet/pods/3355bb2f-bbe7-496c-a6f7-34bc5863032f/volumes" Mar 20 08:44:42 crc kubenswrapper[4903]: I0320 08:44:42.051951 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-wdtrn" Mar 20 08:44:42 crc kubenswrapper[4903]: I0320 08:44:42.250782 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ccedd84e-d0d0-40b8-812c-3a57b41aee98","Type":"ContainerStarted","Data":"1b28fb8da4267bf115856680b9313379defdf659be453428306ad98f35f6d6f8"} Mar 20 08:44:42 crc kubenswrapper[4903]: I0320 08:44:42.251941 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ccedd84e-d0d0-40b8-812c-3a57b41aee98","Type":"ContainerStarted","Data":"0a04473525eb06e22984df1a504ad70e91e54494ecc7880aec725f810f1f575c"} Mar 20 08:44:43 crc kubenswrapper[4903]: I0320 08:44:43.262245 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ccedd84e-d0d0-40b8-812c-3a57b41aee98","Type":"ContainerStarted","Data":"68157cdfcfdbaa0b50499b9130ec297080cdd86c764418be16c6a47f7dafe3da"} Mar 20 08:44:43 crc kubenswrapper[4903]: I0320 08:44:43.262628 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ccedd84e-d0d0-40b8-812c-3a57b41aee98","Type":"ContainerStarted","Data":"c4da8cefa47622803dd1e647814f8e48127fdf53fd6d3ee4863912ef6363b186"} Mar 20 08:44:44 crc kubenswrapper[4903]: I0320 08:44:44.280581 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ccedd84e-d0d0-40b8-812c-3a57b41aee98","Type":"ContainerStarted","Data":"9b7b4b408c9bc763af97601ea17d444715b1766bc513d62fcbbcb137a6f2421b"} Mar 20 08:44:44 crc kubenswrapper[4903]: I0320 08:44:44.280634 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ccedd84e-d0d0-40b8-812c-3a57b41aee98","Type":"ContainerStarted","Data":"aa0b2e935a516f1aabd311732420c76fdecf78152f5d8c3cae1c496ba8803622"} Mar 20 08:44:44 crc kubenswrapper[4903]: I0320 08:44:44.280644 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ccedd84e-d0d0-40b8-812c-3a57b41aee98","Type":"ContainerStarted","Data":"970a73f848705640c8c23ca337e77c32e8681ce1576bc3f7acd53b211ca858c5"} Mar 20 08:44:44 crc kubenswrapper[4903]: I0320 08:44:44.280654 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ccedd84e-d0d0-40b8-812c-3a57b41aee98","Type":"ContainerStarted","Data":"9b346edd52808b80170d3bc7c974747d08d86901f4b138ca360989217b0d0103"} Mar 20 08:44:44 crc 
kubenswrapper[4903]: I0320 08:44:44.280663 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ccedd84e-d0d0-40b8-812c-3a57b41aee98","Type":"ContainerStarted","Data":"ed2c689c76fb63fb9426546d2a29ec1ece23e59b99d50e58c33c34a410549fda"} Mar 20 08:44:44 crc kubenswrapper[4903]: I0320 08:44:44.334144 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=36.572215316 podStartE2EDuration="42.334106821s" podCreationTimestamp="2026-03-20 08:44:02 +0000 UTC" firstStartedPulling="2026-03-20 08:44:36.885005608 +0000 UTC m=+1302.101905933" lastFinishedPulling="2026-03-20 08:44:42.646897123 +0000 UTC m=+1307.863797438" observedRunningTime="2026-03-20 08:44:44.321719071 +0000 UTC m=+1309.538619386" watchObservedRunningTime="2026-03-20 08:44:44.334106821 +0000 UTC m=+1309.551007196" Mar 20 08:44:44 crc kubenswrapper[4903]: I0320 08:44:44.624023 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-9mmcd"] Mar 20 08:44:44 crc kubenswrapper[4903]: E0320 08:44:44.624408 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3355bb2f-bbe7-496c-a6f7-34bc5863032f" containerName="ovn-config" Mar 20 08:44:44 crc kubenswrapper[4903]: I0320 08:44:44.624420 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="3355bb2f-bbe7-496c-a6f7-34bc5863032f" containerName="ovn-config" Mar 20 08:44:44 crc kubenswrapper[4903]: I0320 08:44:44.624569 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="3355bb2f-bbe7-496c-a6f7-34bc5863032f" containerName="ovn-config" Mar 20 08:44:44 crc kubenswrapper[4903]: I0320 08:44:44.625410 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-764c5664d7-9mmcd" Mar 20 08:44:44 crc kubenswrapper[4903]: I0320 08:44:44.628540 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Mar 20 08:44:44 crc kubenswrapper[4903]: I0320 08:44:44.640365 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-9mmcd"] Mar 20 08:44:44 crc kubenswrapper[4903]: I0320 08:44:44.701439 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d17a4a25-bf86-4f45-8821-88f7c94c7e90-ovsdbserver-sb\") pod \"dnsmasq-dns-764c5664d7-9mmcd\" (UID: \"d17a4a25-bf86-4f45-8821-88f7c94c7e90\") " pod="openstack/dnsmasq-dns-764c5664d7-9mmcd" Mar 20 08:44:44 crc kubenswrapper[4903]: I0320 08:44:44.701508 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d17a4a25-bf86-4f45-8821-88f7c94c7e90-dns-swift-storage-0\") pod \"dnsmasq-dns-764c5664d7-9mmcd\" (UID: \"d17a4a25-bf86-4f45-8821-88f7c94c7e90\") " pod="openstack/dnsmasq-dns-764c5664d7-9mmcd" Mar 20 08:44:44 crc kubenswrapper[4903]: I0320 08:44:44.701585 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d17a4a25-bf86-4f45-8821-88f7c94c7e90-ovsdbserver-nb\") pod \"dnsmasq-dns-764c5664d7-9mmcd\" (UID: \"d17a4a25-bf86-4f45-8821-88f7c94c7e90\") " pod="openstack/dnsmasq-dns-764c5664d7-9mmcd" Mar 20 08:44:44 crc kubenswrapper[4903]: I0320 08:44:44.701622 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/d17a4a25-bf86-4f45-8821-88f7c94c7e90-config\") pod \"dnsmasq-dns-764c5664d7-9mmcd\" (UID: \"d17a4a25-bf86-4f45-8821-88f7c94c7e90\") " pod="openstack/dnsmasq-dns-764c5664d7-9mmcd" Mar 20 08:44:44 crc kubenswrapper[4903]: I0320 08:44:44.701676 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d17a4a25-bf86-4f45-8821-88f7c94c7e90-dns-svc\") pod \"dnsmasq-dns-764c5664d7-9mmcd\" (UID: \"d17a4a25-bf86-4f45-8821-88f7c94c7e90\") " pod="openstack/dnsmasq-dns-764c5664d7-9mmcd" Mar 20 08:44:44 crc kubenswrapper[4903]: I0320 08:44:44.701717 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2frr7\" (UniqueName: \"kubernetes.io/projected/d17a4a25-bf86-4f45-8821-88f7c94c7e90-kube-api-access-2frr7\") pod \"dnsmasq-dns-764c5664d7-9mmcd\" (UID: \"d17a4a25-bf86-4f45-8821-88f7c94c7e90\") " pod="openstack/dnsmasq-dns-764c5664d7-9mmcd" Mar 20 08:44:44 crc kubenswrapper[4903]: I0320 08:44:44.803718 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d17a4a25-bf86-4f45-8821-88f7c94c7e90-ovsdbserver-sb\") pod \"dnsmasq-dns-764c5664d7-9mmcd\" (UID: \"d17a4a25-bf86-4f45-8821-88f7c94c7e90\") " pod="openstack/dnsmasq-dns-764c5664d7-9mmcd" Mar 20 08:44:44 crc kubenswrapper[4903]: I0320 08:44:44.803797 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d17a4a25-bf86-4f45-8821-88f7c94c7e90-dns-swift-storage-0\") pod \"dnsmasq-dns-764c5664d7-9mmcd\" (UID: \"d17a4a25-bf86-4f45-8821-88f7c94c7e90\") " pod="openstack/dnsmasq-dns-764c5664d7-9mmcd" Mar 20 08:44:44 crc kubenswrapper[4903]: I0320 08:44:44.803863 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d17a4a25-bf86-4f45-8821-88f7c94c7e90-ovsdbserver-nb\") pod \"dnsmasq-dns-764c5664d7-9mmcd\" (UID: \"d17a4a25-bf86-4f45-8821-88f7c94c7e90\") " pod="openstack/dnsmasq-dns-764c5664d7-9mmcd" Mar 20 08:44:44 crc kubenswrapper[4903]: I0320 08:44:44.803900 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d17a4a25-bf86-4f45-8821-88f7c94c7e90-config\") pod \"dnsmasq-dns-764c5664d7-9mmcd\" (UID: \"d17a4a25-bf86-4f45-8821-88f7c94c7e90\") " pod="openstack/dnsmasq-dns-764c5664d7-9mmcd" Mar 20 08:44:44 crc kubenswrapper[4903]: I0320 08:44:44.803953 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d17a4a25-bf86-4f45-8821-88f7c94c7e90-dns-svc\") pod \"dnsmasq-dns-764c5664d7-9mmcd\" (UID: \"d17a4a25-bf86-4f45-8821-88f7c94c7e90\") " pod="openstack/dnsmasq-dns-764c5664d7-9mmcd" Mar 20 08:44:44 crc kubenswrapper[4903]: I0320 08:44:44.803993 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2frr7\" (UniqueName: \"kubernetes.io/projected/d17a4a25-bf86-4f45-8821-88f7c94c7e90-kube-api-access-2frr7\") pod \"dnsmasq-dns-764c5664d7-9mmcd\" (UID: \"d17a4a25-bf86-4f45-8821-88f7c94c7e90\") " pod="openstack/dnsmasq-dns-764c5664d7-9mmcd" Mar 20 08:44:44 crc kubenswrapper[4903]: I0320 08:44:44.804743 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/d17a4a25-bf86-4f45-8821-88f7c94c7e90-ovsdbserver-sb\") pod \"dnsmasq-dns-764c5664d7-9mmcd\" (UID: \"d17a4a25-bf86-4f45-8821-88f7c94c7e90\") " pod="openstack/dnsmasq-dns-764c5664d7-9mmcd" Mar 20 08:44:44 crc kubenswrapper[4903]: I0320 08:44:44.804856 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d17a4a25-bf86-4f45-8821-88f7c94c7e90-dns-swift-storage-0\") pod \"dnsmasq-dns-764c5664d7-9mmcd\" (UID: \"d17a4a25-bf86-4f45-8821-88f7c94c7e90\") " pod="openstack/dnsmasq-dns-764c5664d7-9mmcd" Mar 20 08:44:44 crc kubenswrapper[4903]: I0320 08:44:44.805202 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d17a4a25-bf86-4f45-8821-88f7c94c7e90-ovsdbserver-nb\") pod \"dnsmasq-dns-764c5664d7-9mmcd\" (UID: \"d17a4a25-bf86-4f45-8821-88f7c94c7e90\") " pod="openstack/dnsmasq-dns-764c5664d7-9mmcd" Mar 20 08:44:44 crc kubenswrapper[4903]: I0320 08:44:44.805379 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d17a4a25-bf86-4f45-8821-88f7c94c7e90-config\") pod \"dnsmasq-dns-764c5664d7-9mmcd\" (UID: \"d17a4a25-bf86-4f45-8821-88f7c94c7e90\") " pod="openstack/dnsmasq-dns-764c5664d7-9mmcd" Mar 20 08:44:44 crc kubenswrapper[4903]: I0320 08:44:44.805461 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d17a4a25-bf86-4f45-8821-88f7c94c7e90-dns-svc\") pod \"dnsmasq-dns-764c5664d7-9mmcd\" (UID: \"d17a4a25-bf86-4f45-8821-88f7c94c7e90\") " pod="openstack/dnsmasq-dns-764c5664d7-9mmcd" Mar 20 08:44:44 crc kubenswrapper[4903]: I0320 08:44:44.824072 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2frr7\" (UniqueName: \"kubernetes.io/projected/d17a4a25-bf86-4f45-8821-88f7c94c7e90-kube-api-access-2frr7\") pod \"dnsmasq-dns-764c5664d7-9mmcd\" (UID: \"d17a4a25-bf86-4f45-8821-88f7c94c7e90\") " pod="openstack/dnsmasq-dns-764c5664d7-9mmcd" Mar 20 08:44:44 crc kubenswrapper[4903]: I0320 08:44:44.961770 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-764c5664d7-9mmcd" Mar 20 08:44:45 crc kubenswrapper[4903]: I0320 08:44:45.293751 4903 generic.go:334] "Generic (PLEG): container finished" podID="83d25fdc-23fe-48a2-855e-5a907ad53d68" containerID="5579e258c969d007185aad27798c30763e8bd565e14091159bc558d34757c14e" exitCode=0 Mar 20 08:44:45 crc kubenswrapper[4903]: I0320 08:44:45.294092 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-z5s8l" event={"ID":"83d25fdc-23fe-48a2-855e-5a907ad53d68","Type":"ContainerDied","Data":"5579e258c969d007185aad27798c30763e8bd565e14091159bc558d34757c14e"} Mar 20 08:44:45 crc kubenswrapper[4903]: I0320 08:44:45.386431 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-9mmcd"] Mar 20 08:44:46 crc kubenswrapper[4903]: I0320 08:44:46.305481 4903 generic.go:334] "Generic (PLEG): container finished" podID="d17a4a25-bf86-4f45-8821-88f7c94c7e90" containerID="656176a1199429793fa17409665830105d532b625a2b68b50cac8f0f2a40952d" exitCode=0 Mar 20 08:44:46 crc kubenswrapper[4903]: I0320 08:44:46.305574 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-764c5664d7-9mmcd" event={"ID":"d17a4a25-bf86-4f45-8821-88f7c94c7e90","Type":"ContainerDied","Data":"656176a1199429793fa17409665830105d532b625a2b68b50cac8f0f2a40952d"} Mar 20 08:44:46 crc kubenswrapper[4903]: I0320 08:44:46.305958 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-764c5664d7-9mmcd" event={"ID":"d17a4a25-bf86-4f45-8821-88f7c94c7e90","Type":"ContainerStarted","Data":"affa85ebf5e5ccff345d34aaa216ca269c4a2f939aecff086df5173c8f0f4263"} Mar 20 08:44:46 crc kubenswrapper[4903]: I0320 08:44:46.476373 4903 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="df937948-08c4-447c-9450-07221ce76552" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.103:5671: connect: connection refused" Mar 20 08:44:46 crc kubenswrapper[4903]: I0320 08:44:46.742270 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-z5s8l" Mar 20 08:44:46 crc kubenswrapper[4903]: I0320 08:44:46.855681 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/83d25fdc-23fe-48a2-855e-5a907ad53d68-db-sync-config-data\") pod \"83d25fdc-23fe-48a2-855e-5a907ad53d68\" (UID: \"83d25fdc-23fe-48a2-855e-5a907ad53d68\") " Mar 20 08:44:46 crc kubenswrapper[4903]: I0320 08:44:46.855745 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83d25fdc-23fe-48a2-855e-5a907ad53d68-combined-ca-bundle\") pod \"83d25fdc-23fe-48a2-855e-5a907ad53d68\" (UID: \"83d25fdc-23fe-48a2-855e-5a907ad53d68\") " Mar 20 08:44:46 crc kubenswrapper[4903]: I0320 08:44:46.855886 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nx948\" (UniqueName: \"kubernetes.io/projected/83d25fdc-23fe-48a2-855e-5a907ad53d68-kube-api-access-nx948\") pod \"83d25fdc-23fe-48a2-855e-5a907ad53d68\" (UID: \"83d25fdc-23fe-48a2-855e-5a907ad53d68\") " Mar 20 08:44:46 crc kubenswrapper[4903]: I0320 08:44:46.855938 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/83d25fdc-23fe-48a2-855e-5a907ad53d68-config-data\") pod \"83d25fdc-23fe-48a2-855e-5a907ad53d68\" (UID: \"83d25fdc-23fe-48a2-855e-5a907ad53d68\") " Mar 20 08:44:46 crc kubenswrapper[4903]: I0320 08:44:46.861008 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/83d25fdc-23fe-48a2-855e-5a907ad53d68-kube-api-access-nx948" (OuterVolumeSpecName: "kube-api-access-nx948") pod "83d25fdc-23fe-48a2-855e-5a907ad53d68" (UID: "83d25fdc-23fe-48a2-855e-5a907ad53d68"). InnerVolumeSpecName "kube-api-access-nx948". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:44:46 crc kubenswrapper[4903]: I0320 08:44:46.863391 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/83d25fdc-23fe-48a2-855e-5a907ad53d68-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "83d25fdc-23fe-48a2-855e-5a907ad53d68" (UID: "83d25fdc-23fe-48a2-855e-5a907ad53d68"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:44:46 crc kubenswrapper[4903]: I0320 08:44:46.884445 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/83d25fdc-23fe-48a2-855e-5a907ad53d68-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "83d25fdc-23fe-48a2-855e-5a907ad53d68" (UID: "83d25fdc-23fe-48a2-855e-5a907ad53d68"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:44:46 crc kubenswrapper[4903]: I0320 08:44:46.906993 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/83d25fdc-23fe-48a2-855e-5a907ad53d68-config-data" (OuterVolumeSpecName: "config-data") pod "83d25fdc-23fe-48a2-855e-5a907ad53d68" (UID: "83d25fdc-23fe-48a2-855e-5a907ad53d68"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:44:46 crc kubenswrapper[4903]: I0320 08:44:46.957640 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nx948\" (UniqueName: \"kubernetes.io/projected/83d25fdc-23fe-48a2-855e-5a907ad53d68-kube-api-access-nx948\") on node \"crc\" DevicePath \"\"" Mar 20 08:44:46 crc kubenswrapper[4903]: I0320 08:44:46.957680 4903 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/83d25fdc-23fe-48a2-855e-5a907ad53d68-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 08:44:46 crc kubenswrapper[4903]: I0320 08:44:46.957693 4903 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/83d25fdc-23fe-48a2-855e-5a907ad53d68-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 08:44:46 crc kubenswrapper[4903]: I0320 08:44:46.957702 4903 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83d25fdc-23fe-48a2-855e-5a907ad53d68-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 08:44:47 crc kubenswrapper[4903]: I0320 08:44:47.317898 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-z5s8l" event={"ID":"83d25fdc-23fe-48a2-855e-5a907ad53d68","Type":"ContainerDied","Data":"7016f30867579e25dee1a9eb8a6acdf777f78f07dfbf13fc69d6429ad9073755"} Mar 20 08:44:47 crc kubenswrapper[4903]: I0320 08:44:47.317941 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-z5s8l" Mar 20 08:44:47 crc kubenswrapper[4903]: I0320 08:44:47.317966 4903 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7016f30867579e25dee1a9eb8a6acdf777f78f07dfbf13fc69d6429ad9073755" Mar 20 08:44:47 crc kubenswrapper[4903]: I0320 08:44:47.322986 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-764c5664d7-9mmcd" event={"ID":"d17a4a25-bf86-4f45-8821-88f7c94c7e90","Type":"ContainerStarted","Data":"75e06399d4f792048838cbbfc7f4cd76b83ce6ba40188024bbc92e6836d48622"} Mar 20 08:44:47 crc kubenswrapper[4903]: I0320 08:44:47.323211 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-764c5664d7-9mmcd" Mar 20 08:44:47 crc kubenswrapper[4903]: I0320 08:44:47.359383 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-764c5664d7-9mmcd" podStartSLOduration=3.359360825 podStartE2EDuration="3.359360825s" podCreationTimestamp="2026-03-20 08:44:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:44:47.353240286 +0000 UTC m=+1312.570140601" watchObservedRunningTime="2026-03-20 08:44:47.359360825 +0000 UTC m=+1312.576261140" Mar 20 08:44:47 crc kubenswrapper[4903]: I0320 08:44:47.798449 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-9mmcd"] Mar 20 08:44:47 crc kubenswrapper[4903]: I0320 08:44:47.816685 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-mk5xs"] Mar 20 08:44:47 crc kubenswrapper[4903]: E0320 08:44:47.817054 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83d25fdc-23fe-48a2-855e-5a907ad53d68" containerName="glance-db-sync" Mar 20 08:44:47 crc kubenswrapper[4903]: I0320 08:44:47.817070 4903 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="83d25fdc-23fe-48a2-855e-5a907ad53d68" containerName="glance-db-sync" Mar 20 08:44:47 crc kubenswrapper[4903]: I0320 08:44:47.817232 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="83d25fdc-23fe-48a2-855e-5a907ad53d68" containerName="glance-db-sync" Mar 20 08:44:47 crc kubenswrapper[4903]: I0320 08:44:47.818071 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74f6bcbc87-mk5xs" Mar 20 08:44:47 crc kubenswrapper[4903]: I0320 08:44:47.838970 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-mk5xs"] Mar 20 08:44:47 crc kubenswrapper[4903]: I0320 08:44:47.979249 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e1a5fcc4-d8ea-4cb8-9169-3e428787ab1d-dns-swift-storage-0\") pod \"dnsmasq-dns-74f6bcbc87-mk5xs\" (UID: \"e1a5fcc4-d8ea-4cb8-9169-3e428787ab1d\") " pod="openstack/dnsmasq-dns-74f6bcbc87-mk5xs" Mar 20 08:44:47 crc kubenswrapper[4903]: I0320 08:44:47.979430 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e1a5fcc4-d8ea-4cb8-9169-3e428787ab1d-config\") pod \"dnsmasq-dns-74f6bcbc87-mk5xs\" (UID: \"e1a5fcc4-d8ea-4cb8-9169-3e428787ab1d\") " pod="openstack/dnsmasq-dns-74f6bcbc87-mk5xs" Mar 20 08:44:47 crc kubenswrapper[4903]: I0320 08:44:47.979912 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7x6xd\" (UniqueName: \"kubernetes.io/projected/e1a5fcc4-d8ea-4cb8-9169-3e428787ab1d-kube-api-access-7x6xd\") pod \"dnsmasq-dns-74f6bcbc87-mk5xs\" (UID: \"e1a5fcc4-d8ea-4cb8-9169-3e428787ab1d\") " pod="openstack/dnsmasq-dns-74f6bcbc87-mk5xs" Mar 20 08:44:47 crc kubenswrapper[4903]: I0320 08:44:47.980001 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e1a5fcc4-d8ea-4cb8-9169-3e428787ab1d-ovsdbserver-nb\") pod \"dnsmasq-dns-74f6bcbc87-mk5xs\" (UID: \"e1a5fcc4-d8ea-4cb8-9169-3e428787ab1d\") " pod="openstack/dnsmasq-dns-74f6bcbc87-mk5xs" Mar 20 08:44:47 crc kubenswrapper[4903]: I0320 08:44:47.980063 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e1a5fcc4-d8ea-4cb8-9169-3e428787ab1d-dns-svc\") pod \"dnsmasq-dns-74f6bcbc87-mk5xs\" (UID: \"e1a5fcc4-d8ea-4cb8-9169-3e428787ab1d\") " pod="openstack/dnsmasq-dns-74f6bcbc87-mk5xs" Mar 20 08:44:47 crc kubenswrapper[4903]: I0320 08:44:47.980120 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e1a5fcc4-d8ea-4cb8-9169-3e428787ab1d-ovsdbserver-sb\") pod \"dnsmasq-dns-74f6bcbc87-mk5xs\" (UID: \"e1a5fcc4-d8ea-4cb8-9169-3e428787ab1d\") " pod="openstack/dnsmasq-dns-74f6bcbc87-mk5xs" Mar 20 08:44:48 crc kubenswrapper[4903]: I0320 08:44:48.081680 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e1a5fcc4-d8ea-4cb8-9169-3e428787ab1d-dns-swift-storage-0\") pod \"dnsmasq-dns-74f6bcbc87-mk5xs\" (UID: \"e1a5fcc4-d8ea-4cb8-9169-3e428787ab1d\") " pod="openstack/dnsmasq-dns-74f6bcbc87-mk5xs" Mar 20 08:44:48 crc kubenswrapper[4903]: I0320 08:44:48.081752 4903 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e1a5fcc4-d8ea-4cb8-9169-3e428787ab1d-config\") pod \"dnsmasq-dns-74f6bcbc87-mk5xs\" (UID: \"e1a5fcc4-d8ea-4cb8-9169-3e428787ab1d\") " pod="openstack/dnsmasq-dns-74f6bcbc87-mk5xs" Mar 20 08:44:48 crc kubenswrapper[4903]: I0320 08:44:48.081842 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7x6xd\" (UniqueName: \"kubernetes.io/projected/e1a5fcc4-d8ea-4cb8-9169-3e428787ab1d-kube-api-access-7x6xd\") pod \"dnsmasq-dns-74f6bcbc87-mk5xs\" (UID: \"e1a5fcc4-d8ea-4cb8-9169-3e428787ab1d\") " pod="openstack/dnsmasq-dns-74f6bcbc87-mk5xs" Mar 20 08:44:48 crc kubenswrapper[4903]: I0320 08:44:48.081871 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e1a5fcc4-d8ea-4cb8-9169-3e428787ab1d-ovsdbserver-nb\") pod \"dnsmasq-dns-74f6bcbc87-mk5xs\" (UID: \"e1a5fcc4-d8ea-4cb8-9169-3e428787ab1d\") " pod="openstack/dnsmasq-dns-74f6bcbc87-mk5xs" Mar 20 08:44:48 crc kubenswrapper[4903]: I0320 08:44:48.081891 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e1a5fcc4-d8ea-4cb8-9169-3e428787ab1d-dns-svc\") pod \"dnsmasq-dns-74f6bcbc87-mk5xs\" (UID: \"e1a5fcc4-d8ea-4cb8-9169-3e428787ab1d\") " pod="openstack/dnsmasq-dns-74f6bcbc87-mk5xs" Mar 20 08:44:48 crc kubenswrapper[4903]: I0320 08:44:48.081906 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e1a5fcc4-d8ea-4cb8-9169-3e428787ab1d-ovsdbserver-sb\") pod \"dnsmasq-dns-74f6bcbc87-mk5xs\" (UID: \"e1a5fcc4-d8ea-4cb8-9169-3e428787ab1d\") " pod="openstack/dnsmasq-dns-74f6bcbc87-mk5xs" Mar 20 08:44:48 crc kubenswrapper[4903]: I0320 08:44:48.083515 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e1a5fcc4-d8ea-4cb8-9169-3e428787ab1d-ovsdbserver-nb\") pod \"dnsmasq-dns-74f6bcbc87-mk5xs\" (UID: \"e1a5fcc4-d8ea-4cb8-9169-3e428787ab1d\") " pod="openstack/dnsmasq-dns-74f6bcbc87-mk5xs" Mar 20 08:44:48 crc kubenswrapper[4903]: I0320 08:44:48.083543 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e1a5fcc4-d8ea-4cb8-9169-3e428787ab1d-ovsdbserver-sb\") pod \"dnsmasq-dns-74f6bcbc87-mk5xs\" (UID: \"e1a5fcc4-d8ea-4cb8-9169-3e428787ab1d\") " pod="openstack/dnsmasq-dns-74f6bcbc87-mk5xs" Mar 20 08:44:48 crc kubenswrapper[4903]: I0320 08:44:48.083579 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e1a5fcc4-d8ea-4cb8-9169-3e428787ab1d-dns-svc\") pod \"dnsmasq-dns-74f6bcbc87-mk5xs\" (UID: \"e1a5fcc4-d8ea-4cb8-9169-3e428787ab1d\") " pod="openstack/dnsmasq-dns-74f6bcbc87-mk5xs" Mar 20 08:44:48 crc kubenswrapper[4903]: I0320 08:44:48.083624 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e1a5fcc4-d8ea-4cb8-9169-3e428787ab1d-dns-swift-storage-0\") pod \"dnsmasq-dns-74f6bcbc87-mk5xs\" (UID: \"e1a5fcc4-d8ea-4cb8-9169-3e428787ab1d\") " pod="openstack/dnsmasq-dns-74f6bcbc87-mk5xs" Mar 20 08:44:48 crc kubenswrapper[4903]: I0320 08:44:48.083731 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/e1a5fcc4-d8ea-4cb8-9169-3e428787ab1d-config\") pod \"dnsmasq-dns-74f6bcbc87-mk5xs\" (UID: \"e1a5fcc4-d8ea-4cb8-9169-3e428787ab1d\") " pod="openstack/dnsmasq-dns-74f6bcbc87-mk5xs" Mar 20 08:44:48 crc kubenswrapper[4903]: I0320 08:44:48.124097 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7x6xd\" (UniqueName: \"kubernetes.io/projected/e1a5fcc4-d8ea-4cb8-9169-3e428787ab1d-kube-api-access-7x6xd\") pod \"dnsmasq-dns-74f6bcbc87-mk5xs\" (UID: \"e1a5fcc4-d8ea-4cb8-9169-3e428787ab1d\") " pod="openstack/dnsmasq-dns-74f6bcbc87-mk5xs" Mar 20 08:44:48 crc kubenswrapper[4903]: I0320 08:44:48.134261 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74f6bcbc87-mk5xs" Mar 20 08:44:48 crc kubenswrapper[4903]: I0320 08:44:48.704708 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-mk5xs"] Mar 20 08:44:48 crc kubenswrapper[4903]: W0320 08:44:48.713609 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode1a5fcc4_d8ea_4cb8_9169_3e428787ab1d.slice/crio-0e25dcbb5560c8452824d28bcdb1fadb982de060c5ab358d882ebe5bb0ebfe4b WatchSource:0}: Error finding container 0e25dcbb5560c8452824d28bcdb1fadb982de060c5ab358d882ebe5bb0ebfe4b: Status 404 returned error can't find the container with id 0e25dcbb5560c8452824d28bcdb1fadb982de060c5ab358d882ebe5bb0ebfe4b Mar 20 08:44:49 crc kubenswrapper[4903]: I0320 08:44:49.354541 4903 generic.go:334] "Generic (PLEG): container finished" podID="e1a5fcc4-d8ea-4cb8-9169-3e428787ab1d" containerID="5a912e439af80db8d0d67d630a1b39b6180efc8e1a2e0ef701900df05688fdaf" exitCode=0 Mar 20 08:44:49 crc kubenswrapper[4903]: I0320 08:44:49.355259 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-764c5664d7-9mmcd" podUID="d17a4a25-bf86-4f45-8821-88f7c94c7e90" containerName="dnsmasq-dns" containerID="cri-o://75e06399d4f792048838cbbfc7f4cd76b83ce6ba40188024bbc92e6836d48622" gracePeriod=10 Mar 20 08:44:49 crc kubenswrapper[4903]: I0320 08:44:49.355248 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-mk5xs" event={"ID":"e1a5fcc4-d8ea-4cb8-9169-3e428787ab1d","Type":"ContainerDied","Data":"5a912e439af80db8d0d67d630a1b39b6180efc8e1a2e0ef701900df05688fdaf"} Mar 20 08:44:49 crc kubenswrapper[4903]: I0320 08:44:49.355352 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-mk5xs" event={"ID":"e1a5fcc4-d8ea-4cb8-9169-3e428787ab1d","Type":"ContainerStarted","Data":"0e25dcbb5560c8452824d28bcdb1fadb982de060c5ab358d882ebe5bb0ebfe4b"} Mar 20 08:44:49 crc kubenswrapper[4903]: I0320 08:44:49.872609 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-764c5664d7-9mmcd" Mar 20 08:44:50 crc kubenswrapper[4903]: I0320 08:44:50.029153 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d17a4a25-bf86-4f45-8821-88f7c94c7e90-ovsdbserver-sb\") pod \"d17a4a25-bf86-4f45-8821-88f7c94c7e90\" (UID: \"d17a4a25-bf86-4f45-8821-88f7c94c7e90\") " Mar 20 08:44:50 crc kubenswrapper[4903]: I0320 08:44:50.029230 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d17a4a25-bf86-4f45-8821-88f7c94c7e90-ovsdbserver-nb\") pod \"d17a4a25-bf86-4f45-8821-88f7c94c7e90\" (UID: \"d17a4a25-bf86-4f45-8821-88f7c94c7e90\") " Mar 20 08:44:50 crc kubenswrapper[4903]: I0320 08:44:50.029318 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d17a4a25-bf86-4f45-8821-88f7c94c7e90-config\") pod \"d17a4a25-bf86-4f45-8821-88f7c94c7e90\" (UID: \"d17a4a25-bf86-4f45-8821-88f7c94c7e90\") " Mar 20 08:44:50 crc kubenswrapper[4903]: I0320 08:44:50.029358 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d17a4a25-bf86-4f45-8821-88f7c94c7e90-dns-swift-storage-0\") pod \"d17a4a25-bf86-4f45-8821-88f7c94c7e90\" (UID: \"d17a4a25-bf86-4f45-8821-88f7c94c7e90\") " Mar 20 08:44:50 crc kubenswrapper[4903]: I0320 08:44:50.029395 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d17a4a25-bf86-4f45-8821-88f7c94c7e90-dns-svc\") pod \"d17a4a25-bf86-4f45-8821-88f7c94c7e90\" (UID: \"d17a4a25-bf86-4f45-8821-88f7c94c7e90\") " Mar 20 08:44:50 crc kubenswrapper[4903]: I0320 08:44:50.029490 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2frr7\" (UniqueName: \"kubernetes.io/projected/d17a4a25-bf86-4f45-8821-88f7c94c7e90-kube-api-access-2frr7\") pod \"d17a4a25-bf86-4f45-8821-88f7c94c7e90\" (UID: \"d17a4a25-bf86-4f45-8821-88f7c94c7e90\") " Mar 20 08:44:50 crc kubenswrapper[4903]: I0320 08:44:50.034735 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d17a4a25-bf86-4f45-8821-88f7c94c7e90-kube-api-access-2frr7" (OuterVolumeSpecName: "kube-api-access-2frr7") pod "d17a4a25-bf86-4f45-8821-88f7c94c7e90" (UID: "d17a4a25-bf86-4f45-8821-88f7c94c7e90"). InnerVolumeSpecName "kube-api-access-2frr7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:44:50 crc kubenswrapper[4903]: I0320 08:44:50.084414 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d17a4a25-bf86-4f45-8821-88f7c94c7e90-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d17a4a25-bf86-4f45-8821-88f7c94c7e90" (UID: "d17a4a25-bf86-4f45-8821-88f7c94c7e90"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:44:50 crc kubenswrapper[4903]: I0320 08:44:50.085121 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d17a4a25-bf86-4f45-8821-88f7c94c7e90-config" (OuterVolumeSpecName: "config") pod "d17a4a25-bf86-4f45-8821-88f7c94c7e90" (UID: "d17a4a25-bf86-4f45-8821-88f7c94c7e90"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:44:50 crc kubenswrapper[4903]: I0320 08:44:50.087015 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d17a4a25-bf86-4f45-8821-88f7c94c7e90-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "d17a4a25-bf86-4f45-8821-88f7c94c7e90" (UID: "d17a4a25-bf86-4f45-8821-88f7c94c7e90"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:44:50 crc kubenswrapper[4903]: I0320 08:44:50.120099 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d17a4a25-bf86-4f45-8821-88f7c94c7e90-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "d17a4a25-bf86-4f45-8821-88f7c94c7e90" (UID: "d17a4a25-bf86-4f45-8821-88f7c94c7e90"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:44:50 crc kubenswrapper[4903]: I0320 08:44:50.123694 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d17a4a25-bf86-4f45-8821-88f7c94c7e90-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "d17a4a25-bf86-4f45-8821-88f7c94c7e90" (UID: "d17a4a25-bf86-4f45-8821-88f7c94c7e90"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:44:50 crc kubenswrapper[4903]: I0320 08:44:50.132065 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2frr7\" (UniqueName: \"kubernetes.io/projected/d17a4a25-bf86-4f45-8821-88f7c94c7e90-kube-api-access-2frr7\") on node \"crc\" DevicePath \"\"" Mar 20 08:44:50 crc kubenswrapper[4903]: I0320 08:44:50.132130 4903 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d17a4a25-bf86-4f45-8821-88f7c94c7e90-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 20 08:44:50 crc kubenswrapper[4903]: I0320 08:44:50.132140 4903 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d17a4a25-bf86-4f45-8821-88f7c94c7e90-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 20 08:44:50 crc kubenswrapper[4903]: I0320 08:44:50.132152 4903 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d17a4a25-bf86-4f45-8821-88f7c94c7e90-config\") on node \"crc\" DevicePath \"\"" Mar 20 08:44:50 crc kubenswrapper[4903]: I0320 08:44:50.132163 4903 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d17a4a25-bf86-4f45-8821-88f7c94c7e90-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 20 08:44:50 crc kubenswrapper[4903]: I0320 08:44:50.132172 4903 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d17a4a25-bf86-4f45-8821-88f7c94c7e90-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 08:44:50 crc kubenswrapper[4903]: I0320 08:44:50.369094 4903 generic.go:334] "Generic (PLEG): container finished" podID="d17a4a25-bf86-4f45-8821-88f7c94c7e90" containerID="75e06399d4f792048838cbbfc7f4cd76b83ce6ba40188024bbc92e6836d48622" exitCode=0 Mar 20 08:44:50 crc kubenswrapper[4903]: I0320 08:44:50.369598 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-764c5664d7-9mmcd" 
event={"ID":"d17a4a25-bf86-4f45-8821-88f7c94c7e90","Type":"ContainerDied","Data":"75e06399d4f792048838cbbfc7f4cd76b83ce6ba40188024bbc92e6836d48622"} Mar 20 08:44:50 crc kubenswrapper[4903]: I0320 08:44:50.369643 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-764c5664d7-9mmcd" event={"ID":"d17a4a25-bf86-4f45-8821-88f7c94c7e90","Type":"ContainerDied","Data":"affa85ebf5e5ccff345d34aaa216ca269c4a2f939aecff086df5173c8f0f4263"} Mar 20 08:44:50 crc kubenswrapper[4903]: I0320 08:44:50.369675 4903 scope.go:117] "RemoveContainer" containerID="75e06399d4f792048838cbbfc7f4cd76b83ce6ba40188024bbc92e6836d48622" Mar 20 08:44:50 crc kubenswrapper[4903]: I0320 08:44:50.369812 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-764c5664d7-9mmcd" Mar 20 08:44:50 crc kubenswrapper[4903]: I0320 08:44:50.376233 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-mk5xs" event={"ID":"e1a5fcc4-d8ea-4cb8-9169-3e428787ab1d","Type":"ContainerStarted","Data":"e79c33bcfd08b17843d391b6564c49280f4384378deac752dca59ae3cdecb9a5"} Mar 20 08:44:50 crc kubenswrapper[4903]: I0320 08:44:50.377541 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-74f6bcbc87-mk5xs" Mar 20 08:44:50 crc kubenswrapper[4903]: I0320 08:44:50.402404 4903 scope.go:117] "RemoveContainer" containerID="656176a1199429793fa17409665830105d532b625a2b68b50cac8f0f2a40952d" Mar 20 08:44:50 crc kubenswrapper[4903]: I0320 08:44:50.432431 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-74f6bcbc87-mk5xs" podStartSLOduration=3.432405254 podStartE2EDuration="3.432405254s" podCreationTimestamp="2026-03-20 08:44:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:44:50.412966103 +0000 UTC m=+1315.629866418" watchObservedRunningTime="2026-03-20 08:44:50.432405254 +0000 UTC m=+1315.649305579" Mar 20 08:44:50 crc kubenswrapper[4903]: I0320 08:44:50.447626 4903 scope.go:117] "RemoveContainer" containerID="75e06399d4f792048838cbbfc7f4cd76b83ce6ba40188024bbc92e6836d48622" Mar 20 08:44:50 crc kubenswrapper[4903]: E0320 08:44:50.448374 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"75e06399d4f792048838cbbfc7f4cd76b83ce6ba40188024bbc92e6836d48622\": container with ID starting with 75e06399d4f792048838cbbfc7f4cd76b83ce6ba40188024bbc92e6836d48622 not found: ID does not exist" containerID="75e06399d4f792048838cbbfc7f4cd76b83ce6ba40188024bbc92e6836d48622" Mar 20 08:44:50 crc kubenswrapper[4903]: I0320 08:44:50.448429 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75e06399d4f792048838cbbfc7f4cd76b83ce6ba40188024bbc92e6836d48622"} err="failed to get container status \"75e06399d4f792048838cbbfc7f4cd76b83ce6ba40188024bbc92e6836d48622\": rpc error: code = NotFound desc = could not find container \"75e06399d4f792048838cbbfc7f4cd76b83ce6ba40188024bbc92e6836d48622\": container with ID starting with 75e06399d4f792048838cbbfc7f4cd76b83ce6ba40188024bbc92e6836d48622 not found: ID does not exist" Mar 20 08:44:50 crc kubenswrapper[4903]: I0320 08:44:50.448470 4903 scope.go:117] "RemoveContainer" containerID="656176a1199429793fa17409665830105d532b625a2b68b50cac8f0f2a40952d" Mar 20 08:44:50 crc kubenswrapper[4903]: E0320 08:44:50.449091 4903 
log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"656176a1199429793fa17409665830105d532b625a2b68b50cac8f0f2a40952d\": container with ID starting with 656176a1199429793fa17409665830105d532b625a2b68b50cac8f0f2a40952d not found: ID does not exist" containerID="656176a1199429793fa17409665830105d532b625a2b68b50cac8f0f2a40952d" Mar 20 08:44:50 crc kubenswrapper[4903]: I0320 08:44:50.449118 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"656176a1199429793fa17409665830105d532b625a2b68b50cac8f0f2a40952d"} err="failed to get container status \"656176a1199429793fa17409665830105d532b625a2b68b50cac8f0f2a40952d\": rpc error: code = NotFound desc = could not find container \"656176a1199429793fa17409665830105d532b625a2b68b50cac8f0f2a40952d\": container with ID starting with 656176a1199429793fa17409665830105d532b625a2b68b50cac8f0f2a40952d not found: ID does not exist" Mar 20 08:44:50 crc kubenswrapper[4903]: I0320 08:44:50.459711 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-9mmcd"] Mar 20 08:44:50 crc kubenswrapper[4903]: I0320 08:44:50.472351 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-9mmcd"] Mar 20 08:44:50 crc kubenswrapper[4903]: I0320 08:44:50.833868 4903 patch_prober.go:28] interesting pod/machine-config-daemon-2ndsj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 08:44:50 crc kubenswrapper[4903]: I0320 08:44:50.833990 4903 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2ndsj" podUID="0e67af70-4211-4077-8f2b-0a00b8069e5a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 08:44:51 crc kubenswrapper[4903]: I0320 08:44:51.505006 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d17a4a25-bf86-4f45-8821-88f7c94c7e90" path="/var/lib/kubelet/pods/d17a4a25-bf86-4f45-8821-88f7c94c7e90/volumes" Mar 20 08:44:55 crc kubenswrapper[4903]: I0320 08:44:55.706412 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Mar 20 08:44:56 crc kubenswrapper[4903]: I0320 08:44:56.074213 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-qq45l"] Mar 20 08:44:56 crc kubenswrapper[4903]: E0320 08:44:56.074734 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d17a4a25-bf86-4f45-8821-88f7c94c7e90" containerName="dnsmasq-dns" Mar 20 08:44:56 crc kubenswrapper[4903]: I0320 08:44:56.074751 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="d17a4a25-bf86-4f45-8821-88f7c94c7e90" containerName="dnsmasq-dns" Mar 20 08:44:56 crc kubenswrapper[4903]: E0320 08:44:56.074765 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d17a4a25-bf86-4f45-8821-88f7c94c7e90" containerName="init" Mar 20 08:44:56 crc kubenswrapper[4903]: I0320 08:44:56.074773 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="d17a4a25-bf86-4f45-8821-88f7c94c7e90" containerName="init" Mar 20 08:44:56 crc kubenswrapper[4903]: I0320 08:44:56.074931 4903 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="d17a4a25-bf86-4f45-8821-88f7c94c7e90" containerName="dnsmasq-dns" Mar 20 08:44:56 crc kubenswrapper[4903]: I0320 08:44:56.075439 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-qq45l" Mar 20 08:44:56 crc kubenswrapper[4903]: I0320 08:44:56.088172 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-qq45l"] Mar 20 08:44:56 crc kubenswrapper[4903]: I0320 08:44:56.152298 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f8zjt\" (UniqueName: \"kubernetes.io/projected/b1b1baed-f9ed-4ec9-8dd0-adc4db771821-kube-api-access-f8zjt\") pod \"cinder-db-create-qq45l\" (UID: \"b1b1baed-f9ed-4ec9-8dd0-adc4db771821\") " pod="openstack/cinder-db-create-qq45l" Mar 20 08:44:56 crc kubenswrapper[4903]: I0320 08:44:56.152653 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b1b1baed-f9ed-4ec9-8dd0-adc4db771821-operator-scripts\") pod \"cinder-db-create-qq45l\" (UID: \"b1b1baed-f9ed-4ec9-8dd0-adc4db771821\") " pod="openstack/cinder-db-create-qq45l" Mar 20 08:44:56 crc kubenswrapper[4903]: I0320 08:44:56.171959 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-0a04-account-create-update-br8cg"] Mar 20 08:44:56 crc kubenswrapper[4903]: I0320 08:44:56.173241 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-0a04-account-create-update-br8cg" Mar 20 08:44:56 crc kubenswrapper[4903]: I0320 08:44:56.180188 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Mar 20 08:44:56 crc kubenswrapper[4903]: I0320 08:44:56.186385 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-0a04-account-create-update-br8cg"] Mar 20 08:44:56 crc kubenswrapper[4903]: I0320 08:44:56.255116 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0ddb7780-fcec-42d1-811e-5cc8a4169917-operator-scripts\") pod \"cinder-0a04-account-create-update-br8cg\" (UID: \"0ddb7780-fcec-42d1-811e-5cc8a4169917\") " pod="openstack/cinder-0a04-account-create-update-br8cg" Mar 20 08:44:56 crc kubenswrapper[4903]: I0320 08:44:56.255216 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b1b1baed-f9ed-4ec9-8dd0-adc4db771821-operator-scripts\") pod \"cinder-db-create-qq45l\" (UID: \"b1b1baed-f9ed-4ec9-8dd0-adc4db771821\") " pod="openstack/cinder-db-create-qq45l" Mar 20 08:44:56 crc kubenswrapper[4903]: I0320 08:44:56.255272 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f8zjt\" (UniqueName: \"kubernetes.io/projected/b1b1baed-f9ed-4ec9-8dd0-adc4db771821-kube-api-access-f8zjt\") pod \"cinder-db-create-qq45l\" (UID: \"b1b1baed-f9ed-4ec9-8dd0-adc4db771821\") " pod="openstack/cinder-db-create-qq45l" Mar 20 08:44:56 crc kubenswrapper[4903]: I0320 08:44:56.255341 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x4lj2\" (UniqueName: \"kubernetes.io/projected/0ddb7780-fcec-42d1-811e-5cc8a4169917-kube-api-access-x4lj2\") pod \"cinder-0a04-account-create-update-br8cg\" (UID: \"0ddb7780-fcec-42d1-811e-5cc8a4169917\") " 
pod="openstack/cinder-0a04-account-create-update-br8cg" Mar 20 08:44:56 crc kubenswrapper[4903]: I0320 08:44:56.256107 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b1b1baed-f9ed-4ec9-8dd0-adc4db771821-operator-scripts\") pod \"cinder-db-create-qq45l\" (UID: \"b1b1baed-f9ed-4ec9-8dd0-adc4db771821\") " pod="openstack/cinder-db-create-qq45l" Mar 20 08:44:56 crc kubenswrapper[4903]: I0320 08:44:56.266664 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-zxbh6"] Mar 20 08:44:56 crc kubenswrapper[4903]: I0320 08:44:56.267928 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-zxbh6" Mar 20 08:44:56 crc kubenswrapper[4903]: I0320 08:44:56.278907 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-zxbh6"] Mar 20 08:44:56 crc kubenswrapper[4903]: I0320 08:44:56.304861 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f8zjt\" (UniqueName: \"kubernetes.io/projected/b1b1baed-f9ed-4ec9-8dd0-adc4db771821-kube-api-access-f8zjt\") pod \"cinder-db-create-qq45l\" (UID: \"b1b1baed-f9ed-4ec9-8dd0-adc4db771821\") " pod="openstack/cinder-db-create-qq45l" Mar 20 08:44:56 crc kubenswrapper[4903]: I0320 08:44:56.357208 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7bc69384-efae-4c4f-be81-591b2cd17538-operator-scripts\") pod \"barbican-db-create-zxbh6\" (UID: \"7bc69384-efae-4c4f-be81-591b2cd17538\") " pod="openstack/barbican-db-create-zxbh6" Mar 20 08:44:56 crc kubenswrapper[4903]: I0320 08:44:56.357291 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x4lj2\" (UniqueName: \"kubernetes.io/projected/0ddb7780-fcec-42d1-811e-5cc8a4169917-kube-api-access-x4lj2\") pod \"cinder-0a04-account-create-update-br8cg\" (UID: \"0ddb7780-fcec-42d1-811e-5cc8a4169917\") " pod="openstack/cinder-0a04-account-create-update-br8cg" Mar 20 08:44:56 crc kubenswrapper[4903]: I0320 08:44:56.357519 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0ddb7780-fcec-42d1-811e-5cc8a4169917-operator-scripts\") pod \"cinder-0a04-account-create-update-br8cg\" (UID: \"0ddb7780-fcec-42d1-811e-5cc8a4169917\") " pod="openstack/cinder-0a04-account-create-update-br8cg" Mar 20 08:44:56 crc kubenswrapper[4903]: I0320 08:44:56.357740 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ljkr8\" (UniqueName: \"kubernetes.io/projected/7bc69384-efae-4c4f-be81-591b2cd17538-kube-api-access-ljkr8\") pod \"barbican-db-create-zxbh6\" (UID: \"7bc69384-efae-4c4f-be81-591b2cd17538\") " pod="openstack/barbican-db-create-zxbh6" Mar 20 08:44:56 crc kubenswrapper[4903]: I0320 08:44:56.358625 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0ddb7780-fcec-42d1-811e-5cc8a4169917-operator-scripts\") pod \"cinder-0a04-account-create-update-br8cg\" (UID: \"0ddb7780-fcec-42d1-811e-5cc8a4169917\") " pod="openstack/cinder-0a04-account-create-update-br8cg" Mar 20 08:44:56 crc kubenswrapper[4903]: I0320 08:44:56.365121 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-8z25d"] Mar 20 08:44:56 crc 
kubenswrapper[4903]: I0320 08:44:56.366325 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-8z25d" Mar 20 08:44:56 crc kubenswrapper[4903]: I0320 08:44:56.379762 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-8z25d"] Mar 20 08:44:56 crc kubenswrapper[4903]: I0320 08:44:56.396173 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-4e4e-account-create-update-5n652"] Mar 20 08:44:56 crc kubenswrapper[4903]: I0320 08:44:56.397579 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-4e4e-account-create-update-5n652" Mar 20 08:44:56 crc kubenswrapper[4903]: I0320 08:44:56.406345 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Mar 20 08:44:56 crc kubenswrapper[4903]: I0320 08:44:56.407730 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x4lj2\" (UniqueName: \"kubernetes.io/projected/0ddb7780-fcec-42d1-811e-5cc8a4169917-kube-api-access-x4lj2\") pod \"cinder-0a04-account-create-update-br8cg\" (UID: \"0ddb7780-fcec-42d1-811e-5cc8a4169917\") " pod="openstack/cinder-0a04-account-create-update-br8cg" Mar 20 08:44:56 crc kubenswrapper[4903]: I0320 08:44:56.413538 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-qq45l" Mar 20 08:44:56 crc kubenswrapper[4903]: I0320 08:44:56.426234 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-4e4e-account-create-update-5n652"] Mar 20 08:44:56 crc kubenswrapper[4903]: I0320 08:44:56.460059 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tgxtg\" (UniqueName: \"kubernetes.io/projected/1a1287a5-35f8-4d1a-8a4d-e7b30c957c07-kube-api-access-tgxtg\") pod \"neutron-db-create-8z25d\" (UID: \"1a1287a5-35f8-4d1a-8a4d-e7b30c957c07\") " pod="openstack/neutron-db-create-8z25d" Mar 20 08:44:56 crc kubenswrapper[4903]: I0320 08:44:56.460127 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hv8z6\" (UniqueName: \"kubernetes.io/projected/8abfc827-bdd5-43e9-877c-c3d611fc463e-kube-api-access-hv8z6\") pod \"neutron-4e4e-account-create-update-5n652\" (UID: \"8abfc827-bdd5-43e9-877c-c3d611fc463e\") " pod="openstack/neutron-4e4e-account-create-update-5n652" Mar 20 08:44:56 crc kubenswrapper[4903]: I0320 08:44:56.460165 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ljkr8\" (UniqueName: \"kubernetes.io/projected/7bc69384-efae-4c4f-be81-591b2cd17538-kube-api-access-ljkr8\") pod \"barbican-db-create-zxbh6\" (UID: \"7bc69384-efae-4c4f-be81-591b2cd17538\") " pod="openstack/barbican-db-create-zxbh6" Mar 20 08:44:56 crc kubenswrapper[4903]: I0320 08:44:56.460227 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8abfc827-bdd5-43e9-877c-c3d611fc463e-operator-scripts\") pod \"neutron-4e4e-account-create-update-5n652\" (UID: \"8abfc827-bdd5-43e9-877c-c3d611fc463e\") " pod="openstack/neutron-4e4e-account-create-update-5n652" Mar 20 08:44:56 crc kubenswrapper[4903]: I0320 08:44:56.460296 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/7bc69384-efae-4c4f-be81-591b2cd17538-operator-scripts\") pod \"barbican-db-create-zxbh6\" (UID: \"7bc69384-efae-4c4f-be81-591b2cd17538\") " pod="openstack/barbican-db-create-zxbh6" Mar 20 08:44:56 crc kubenswrapper[4903]: I0320 08:44:56.460321 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1a1287a5-35f8-4d1a-8a4d-e7b30c957c07-operator-scripts\") pod \"neutron-db-create-8z25d\" (UID: \"1a1287a5-35f8-4d1a-8a4d-e7b30c957c07\") " pod="openstack/neutron-db-create-8z25d" Mar 20 08:44:56 crc kubenswrapper[4903]: I0320 08:44:56.461670 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7bc69384-efae-4c4f-be81-591b2cd17538-operator-scripts\") pod \"barbican-db-create-zxbh6\" (UID: \"7bc69384-efae-4c4f-be81-591b2cd17538\") " pod="openstack/barbican-db-create-zxbh6" Mar 20 08:44:56 crc kubenswrapper[4903]: I0320 08:44:56.482886 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Mar 20 08:44:56 crc kubenswrapper[4903]: I0320 08:44:56.489120 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-0a04-account-create-update-br8cg" Mar 20 08:44:56 crc kubenswrapper[4903]: I0320 08:44:56.513401 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ljkr8\" (UniqueName: \"kubernetes.io/projected/7bc69384-efae-4c4f-be81-591b2cd17538-kube-api-access-ljkr8\") pod \"barbican-db-create-zxbh6\" (UID: \"7bc69384-efae-4c4f-be81-591b2cd17538\") " pod="openstack/barbican-db-create-zxbh6" Mar 20 08:44:56 crc kubenswrapper[4903]: I0320 08:44:56.533551 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-35b8-account-create-update-pdgzs"] Mar 20 08:44:56 crc kubenswrapper[4903]: I0320 08:44:56.534659 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-35b8-account-create-update-pdgzs" Mar 20 08:44:56 crc kubenswrapper[4903]: I0320 08:44:56.535956 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Mar 20 08:44:56 crc kubenswrapper[4903]: I0320 08:44:56.560088 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-35b8-account-create-update-pdgzs"] Mar 20 08:44:56 crc kubenswrapper[4903]: I0320 08:44:56.562170 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tgxtg\" (UniqueName: \"kubernetes.io/projected/1a1287a5-35f8-4d1a-8a4d-e7b30c957c07-kube-api-access-tgxtg\") pod \"neutron-db-create-8z25d\" (UID: \"1a1287a5-35f8-4d1a-8a4d-e7b30c957c07\") " pod="openstack/neutron-db-create-8z25d" Mar 20 08:44:56 crc kubenswrapper[4903]: I0320 08:44:56.562202 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hv8z6\" (UniqueName: \"kubernetes.io/projected/8abfc827-bdd5-43e9-877c-c3d611fc463e-kube-api-access-hv8z6\") pod \"neutron-4e4e-account-create-update-5n652\" (UID: \"8abfc827-bdd5-43e9-877c-c3d611fc463e\") " pod="openstack/neutron-4e4e-account-create-update-5n652" Mar 20 08:44:56 crc kubenswrapper[4903]: I0320 08:44:56.562286 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8abfc827-bdd5-43e9-877c-c3d611fc463e-operator-scripts\") pod \"neutron-4e4e-account-create-update-5n652\" (UID: \"8abfc827-bdd5-43e9-877c-c3d611fc463e\") " pod="openstack/neutron-4e4e-account-create-update-5n652" Mar 20 08:44:56 crc kubenswrapper[4903]: I0320 08:44:56.562396 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1a1287a5-35f8-4d1a-8a4d-e7b30c957c07-operator-scripts\") pod \"neutron-db-create-8z25d\" (UID: \"1a1287a5-35f8-4d1a-8a4d-e7b30c957c07\") " pod="openstack/neutron-db-create-8z25d" Mar 20 08:44:56 crc kubenswrapper[4903]: I0320 08:44:56.563079 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1a1287a5-35f8-4d1a-8a4d-e7b30c957c07-operator-scripts\") pod \"neutron-db-create-8z25d\" (UID: \"1a1287a5-35f8-4d1a-8a4d-e7b30c957c07\") " pod="openstack/neutron-db-create-8z25d" Mar 20 08:44:56 crc kubenswrapper[4903]: I0320 08:44:56.567725 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8abfc827-bdd5-43e9-877c-c3d611fc463e-operator-scripts\") pod \"neutron-4e4e-account-create-update-5n652\" (UID: \"8abfc827-bdd5-43e9-877c-c3d611fc463e\") " pod="openstack/neutron-4e4e-account-create-update-5n652" Mar 20 08:44:56 crc kubenswrapper[4903]: I0320 08:44:56.586922 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hv8z6\" (UniqueName: \"kubernetes.io/projected/8abfc827-bdd5-43e9-877c-c3d611fc463e-kube-api-access-hv8z6\") pod \"neutron-4e4e-account-create-update-5n652\" (UID: \"8abfc827-bdd5-43e9-877c-c3d611fc463e\") " pod="openstack/neutron-4e4e-account-create-update-5n652" Mar 20 08:44:56 crc kubenswrapper[4903]: I0320 08:44:56.591959 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tgxtg\" (UniqueName: \"kubernetes.io/projected/1a1287a5-35f8-4d1a-8a4d-e7b30c957c07-kube-api-access-tgxtg\") pod \"neutron-db-create-8z25d\" (UID: 
\"1a1287a5-35f8-4d1a-8a4d-e7b30c957c07\") " pod="openstack/neutron-db-create-8z25d" Mar 20 08:44:56 crc kubenswrapper[4903]: I0320 08:44:56.594847 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-zxbh6" Mar 20 08:44:56 crc kubenswrapper[4903]: I0320 08:44:56.615722 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-kt4gk"] Mar 20 08:44:56 crc kubenswrapper[4903]: I0320 08:44:56.618487 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-kt4gk" Mar 20 08:44:56 crc kubenswrapper[4903]: I0320 08:44:56.620660 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 20 08:44:56 crc kubenswrapper[4903]: I0320 08:44:56.622273 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-rzqbd" Mar 20 08:44:56 crc kubenswrapper[4903]: I0320 08:44:56.622486 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 20 08:44:56 crc kubenswrapper[4903]: I0320 08:44:56.622628 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 20 08:44:56 crc kubenswrapper[4903]: I0320 08:44:56.641632 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-kt4gk"] Mar 20 08:44:56 crc kubenswrapper[4903]: I0320 08:44:56.664978 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/feeda47f-bf82-4e99-a704-b405a817bb1d-combined-ca-bundle\") pod \"keystone-db-sync-kt4gk\" (UID: \"feeda47f-bf82-4e99-a704-b405a817bb1d\") " pod="openstack/keystone-db-sync-kt4gk" Mar 20 08:44:56 crc kubenswrapper[4903]: I0320 08:44:56.665097 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6214cfc3-afe8-4e2c-aafe-d59d16b108b5-operator-scripts\") pod \"barbican-35b8-account-create-update-pdgzs\" (UID: \"6214cfc3-afe8-4e2c-aafe-d59d16b108b5\") " pod="openstack/barbican-35b8-account-create-update-pdgzs" Mar 20 08:44:56 crc kubenswrapper[4903]: I0320 08:44:56.665144 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/feeda47f-bf82-4e99-a704-b405a817bb1d-config-data\") pod \"keystone-db-sync-kt4gk\" (UID: \"feeda47f-bf82-4e99-a704-b405a817bb1d\") " pod="openstack/keystone-db-sync-kt4gk" Mar 20 08:44:56 crc kubenswrapper[4903]: I0320 08:44:56.665462 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nk5d8\" (UniqueName: \"kubernetes.io/projected/feeda47f-bf82-4e99-a704-b405a817bb1d-kube-api-access-nk5d8\") pod \"keystone-db-sync-kt4gk\" (UID: \"feeda47f-bf82-4e99-a704-b405a817bb1d\") " pod="openstack/keystone-db-sync-kt4gk" Mar 20 08:44:56 crc kubenswrapper[4903]: I0320 08:44:56.665518 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wcz2q\" (UniqueName: \"kubernetes.io/projected/6214cfc3-afe8-4e2c-aafe-d59d16b108b5-kube-api-access-wcz2q\") pod \"barbican-35b8-account-create-update-pdgzs\" (UID: \"6214cfc3-afe8-4e2c-aafe-d59d16b108b5\") " pod="openstack/barbican-35b8-account-create-update-pdgzs" Mar 20 08:44:56 crc kubenswrapper[4903]: I0320 
08:44:56.692442 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-8z25d" Mar 20 08:44:56 crc kubenswrapper[4903]: I0320 08:44:56.767512 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/feeda47f-bf82-4e99-a704-b405a817bb1d-combined-ca-bundle\") pod \"keystone-db-sync-kt4gk\" (UID: \"feeda47f-bf82-4e99-a704-b405a817bb1d\") " pod="openstack/keystone-db-sync-kt4gk" Mar 20 08:44:56 crc kubenswrapper[4903]: I0320 08:44:56.767560 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6214cfc3-afe8-4e2c-aafe-d59d16b108b5-operator-scripts\") pod \"barbican-35b8-account-create-update-pdgzs\" (UID: \"6214cfc3-afe8-4e2c-aafe-d59d16b108b5\") " pod="openstack/barbican-35b8-account-create-update-pdgzs" Mar 20 08:44:56 crc kubenswrapper[4903]: I0320 08:44:56.767592 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/feeda47f-bf82-4e99-a704-b405a817bb1d-config-data\") pod \"keystone-db-sync-kt4gk\" (UID: \"feeda47f-bf82-4e99-a704-b405a817bb1d\") " pod="openstack/keystone-db-sync-kt4gk" Mar 20 08:44:56 crc kubenswrapper[4903]: I0320 08:44:56.767624 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nk5d8\" (UniqueName: \"kubernetes.io/projected/feeda47f-bf82-4e99-a704-b405a817bb1d-kube-api-access-nk5d8\") pod \"keystone-db-sync-kt4gk\" (UID: \"feeda47f-bf82-4e99-a704-b405a817bb1d\") " pod="openstack/keystone-db-sync-kt4gk" Mar 20 08:44:56 crc kubenswrapper[4903]: I0320 08:44:56.767656 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wcz2q\" (UniqueName: \"kubernetes.io/projected/6214cfc3-afe8-4e2c-aafe-d59d16b108b5-kube-api-access-wcz2q\") pod \"barbican-35b8-account-create-update-pdgzs\" (UID: \"6214cfc3-afe8-4e2c-aafe-d59d16b108b5\") " pod="openstack/barbican-35b8-account-create-update-pdgzs" Mar 20 08:44:56 crc kubenswrapper[4903]: I0320 08:44:56.769294 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6214cfc3-afe8-4e2c-aafe-d59d16b108b5-operator-scripts\") pod \"barbican-35b8-account-create-update-pdgzs\" (UID: \"6214cfc3-afe8-4e2c-aafe-d59d16b108b5\") " pod="openstack/barbican-35b8-account-create-update-pdgzs" Mar 20 08:44:56 crc kubenswrapper[4903]: I0320 08:44:56.774165 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/feeda47f-bf82-4e99-a704-b405a817bb1d-combined-ca-bundle\") pod \"keystone-db-sync-kt4gk\" (UID: \"feeda47f-bf82-4e99-a704-b405a817bb1d\") " pod="openstack/keystone-db-sync-kt4gk" Mar 20 08:44:56 crc kubenswrapper[4903]: I0320 08:44:56.783687 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/feeda47f-bf82-4e99-a704-b405a817bb1d-config-data\") pod \"keystone-db-sync-kt4gk\" (UID: \"feeda47f-bf82-4e99-a704-b405a817bb1d\") " pod="openstack/keystone-db-sync-kt4gk" Mar 20 08:44:56 crc kubenswrapper[4903]: I0320 08:44:56.788738 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nk5d8\" (UniqueName: \"kubernetes.io/projected/feeda47f-bf82-4e99-a704-b405a817bb1d-kube-api-access-nk5d8\") pod 
\"keystone-db-sync-kt4gk\" (UID: \"feeda47f-bf82-4e99-a704-b405a817bb1d\") " pod="openstack/keystone-db-sync-kt4gk" Mar 20 08:44:56 crc kubenswrapper[4903]: I0320 08:44:56.792156 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wcz2q\" (UniqueName: \"kubernetes.io/projected/6214cfc3-afe8-4e2c-aafe-d59d16b108b5-kube-api-access-wcz2q\") pod \"barbican-35b8-account-create-update-pdgzs\" (UID: \"6214cfc3-afe8-4e2c-aafe-d59d16b108b5\") " pod="openstack/barbican-35b8-account-create-update-pdgzs" Mar 20 08:44:56 crc kubenswrapper[4903]: I0320 08:44:56.798377 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-4e4e-account-create-update-5n652" Mar 20 08:44:56 crc kubenswrapper[4903]: I0320 08:44:56.929000 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-35b8-account-create-update-pdgzs" Mar 20 08:44:57 crc kubenswrapper[4903]: I0320 08:44:57.000561 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-kt4gk" Mar 20 08:44:57 crc kubenswrapper[4903]: I0320 08:44:57.073239 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-qq45l"] Mar 20 08:44:57 crc kubenswrapper[4903]: I0320 08:44:57.161491 4903 scope.go:117] "RemoveContainer" containerID="3dc2d0cf6bffe11b6698132443c96584f1a291d336cd377ba64bf1b1c193b375" Mar 20 08:44:57 crc kubenswrapper[4903]: I0320 08:44:57.189581 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-0a04-account-create-update-br8cg"] Mar 20 08:44:57 crc kubenswrapper[4903]: I0320 08:44:57.272790 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-zxbh6"] Mar 20 08:44:57 crc kubenswrapper[4903]: I0320 08:44:57.363882 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-8z25d"] Mar 20 08:44:57 crc kubenswrapper[4903]: W0320 08:44:57.409828 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1a1287a5_35f8_4d1a_8a4d_e7b30c957c07.slice/crio-fc76b13e1d70ac4b656efe57cc839d97622817c718b4375fbf5b8dbdeaf8d37d WatchSource:0}: Error finding container fc76b13e1d70ac4b656efe57cc839d97622817c718b4375fbf5b8dbdeaf8d37d: Status 404 returned error can't find the container with id fc76b13e1d70ac4b656efe57cc839d97622817c718b4375fbf5b8dbdeaf8d37d Mar 20 08:44:57 crc kubenswrapper[4903]: I0320 08:44:57.441158 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-4e4e-account-create-update-5n652"] Mar 20 08:44:57 crc kubenswrapper[4903]: I0320 08:44:57.469862 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-qq45l" event={"ID":"b1b1baed-f9ed-4ec9-8dd0-adc4db771821","Type":"ContainerStarted","Data":"eb3dbc6f45f101429c739d7777bc9d208e66f80da26203fea7b6eb31ff6fd490"} Mar 20 08:44:57 crc kubenswrapper[4903]: I0320 08:44:57.469928 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-qq45l" event={"ID":"b1b1baed-f9ed-4ec9-8dd0-adc4db771821","Type":"ContainerStarted","Data":"d4cca650711b31bea467f367fc2149975bc9d4deb53b1c41a0c62135a69a7f2a"} Mar 20 08:44:57 crc kubenswrapper[4903]: I0320 08:44:57.474376 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-8z25d" 
event={"ID":"1a1287a5-35f8-4d1a-8a4d-e7b30c957c07","Type":"ContainerStarted","Data":"fc76b13e1d70ac4b656efe57cc839d97622817c718b4375fbf5b8dbdeaf8d37d"} Mar 20 08:44:57 crc kubenswrapper[4903]: I0320 08:44:57.487911 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-0a04-account-create-update-br8cg" event={"ID":"0ddb7780-fcec-42d1-811e-5cc8a4169917","Type":"ContainerStarted","Data":"efb5a0ac8d805a42d55384cb31ea0df86d41974ed723b438d1b5bd9ac4dc5a33"} Mar 20 08:44:57 crc kubenswrapper[4903]: I0320 08:44:57.505241 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-zxbh6" event={"ID":"7bc69384-efae-4c4f-be81-591b2cd17538","Type":"ContainerStarted","Data":"0ebf152f7bb1da34751e6525c1bed6839c82b3f6472ca1fc7cb00fc8471b3f15"} Mar 20 08:44:57 crc kubenswrapper[4903]: I0320 08:44:57.515048 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-create-qq45l" podStartSLOduration=1.5150057430000001 podStartE2EDuration="1.515005743s" podCreationTimestamp="2026-03-20 08:44:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:44:57.484481532 +0000 UTC m=+1322.701381857" watchObservedRunningTime="2026-03-20 08:44:57.515005743 +0000 UTC m=+1322.731906058" Mar 20 08:44:57 crc kubenswrapper[4903]: I0320 08:44:57.524844 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-35b8-account-create-update-pdgzs"] Mar 20 08:44:57 crc kubenswrapper[4903]: I0320 08:44:57.597454 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-kt4gk"] Mar 20 08:44:57 crc kubenswrapper[4903]: W0320 08:44:57.620485 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfeeda47f_bf82_4e99_a704_b405a817bb1d.slice/crio-e0b87b0989dfe1d272e002dfdb9821facdcd560093625b1b6fb938d1b9f76288 WatchSource:0}: Error finding container e0b87b0989dfe1d272e002dfdb9821facdcd560093625b1b6fb938d1b9f76288: Status 404 returned error can't find the container with id e0b87b0989dfe1d272e002dfdb9821facdcd560093625b1b6fb938d1b9f76288 Mar 20 08:44:58 crc kubenswrapper[4903]: I0320 08:44:58.136229 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-74f6bcbc87-mk5xs" Mar 20 08:44:58 crc kubenswrapper[4903]: I0320 08:44:58.188818 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-gqltz"] Mar 20 08:44:58 crc kubenswrapper[4903]: I0320 08:44:58.189284 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-698758b865-gqltz" podUID="6de2d5fb-9f92-4a35-8264-48353a33895a" containerName="dnsmasq-dns" containerID="cri-o://61c80e20e2f5f5e7a08057fdd53cd07ee7d988b7e195a4b93be668970fa183e1" gracePeriod=10 Mar 20 08:44:58 crc kubenswrapper[4903]: I0320 08:44:58.508064 4903 generic.go:334] "Generic (PLEG): container finished" podID="6de2d5fb-9f92-4a35-8264-48353a33895a" containerID="61c80e20e2f5f5e7a08057fdd53cd07ee7d988b7e195a4b93be668970fa183e1" exitCode=0 Mar 20 08:44:58 crc kubenswrapper[4903]: I0320 08:44:58.508152 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-gqltz" event={"ID":"6de2d5fb-9f92-4a35-8264-48353a33895a","Type":"ContainerDied","Data":"61c80e20e2f5f5e7a08057fdd53cd07ee7d988b7e195a4b93be668970fa183e1"} Mar 20 08:44:58 crc kubenswrapper[4903]: 
I0320 08:44:58.510217 4903 generic.go:334] "Generic (PLEG): container finished" podID="7bc69384-efae-4c4f-be81-591b2cd17538" containerID="17d52af7be15bf2eb59faf29a775496681d100d63125270b5888714485eae90c" exitCode=0 Mar 20 08:44:58 crc kubenswrapper[4903]: I0320 08:44:58.510284 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-zxbh6" event={"ID":"7bc69384-efae-4c4f-be81-591b2cd17538","Type":"ContainerDied","Data":"17d52af7be15bf2eb59faf29a775496681d100d63125270b5888714485eae90c"} Mar 20 08:44:58 crc kubenswrapper[4903]: I0320 08:44:58.511789 4903 generic.go:334] "Generic (PLEG): container finished" podID="b1b1baed-f9ed-4ec9-8dd0-adc4db771821" containerID="eb3dbc6f45f101429c739d7777bc9d208e66f80da26203fea7b6eb31ff6fd490" exitCode=0 Mar 20 08:44:58 crc kubenswrapper[4903]: I0320 08:44:58.511850 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-qq45l" event={"ID":"b1b1baed-f9ed-4ec9-8dd0-adc4db771821","Type":"ContainerDied","Data":"eb3dbc6f45f101429c739d7777bc9d208e66f80da26203fea7b6eb31ff6fd490"} Mar 20 08:44:58 crc kubenswrapper[4903]: I0320 08:44:58.513915 4903 generic.go:334] "Generic (PLEG): container finished" podID="0ddb7780-fcec-42d1-811e-5cc8a4169917" containerID="03a018cef030a0be0be44e08280bdfaf6515a35b4883ba9ce657fe82954c5842" exitCode=0 Mar 20 08:44:58 crc kubenswrapper[4903]: I0320 08:44:58.513961 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-0a04-account-create-update-br8cg" event={"ID":"0ddb7780-fcec-42d1-811e-5cc8a4169917","Type":"ContainerDied","Data":"03a018cef030a0be0be44e08280bdfaf6515a35b4883ba9ce657fe82954c5842"} Mar 20 08:44:58 crc kubenswrapper[4903]: I0320 08:44:58.514772 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-kt4gk" event={"ID":"feeda47f-bf82-4e99-a704-b405a817bb1d","Type":"ContainerStarted","Data":"e0b87b0989dfe1d272e002dfdb9821facdcd560093625b1b6fb938d1b9f76288"} Mar 20 08:44:58 crc kubenswrapper[4903]: I0320 08:44:58.516120 4903 generic.go:334] "Generic (PLEG): container finished" podID="8abfc827-bdd5-43e9-877c-c3d611fc463e" containerID="24c424c306f16903d97df8a740c0324e47598833553f2a3494098232c957427d" exitCode=0 Mar 20 08:44:58 crc kubenswrapper[4903]: I0320 08:44:58.516155 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-4e4e-account-create-update-5n652" event={"ID":"8abfc827-bdd5-43e9-877c-c3d611fc463e","Type":"ContainerDied","Data":"24c424c306f16903d97df8a740c0324e47598833553f2a3494098232c957427d"} Mar 20 08:44:58 crc kubenswrapper[4903]: I0320 08:44:58.516168 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-4e4e-account-create-update-5n652" event={"ID":"8abfc827-bdd5-43e9-877c-c3d611fc463e","Type":"ContainerStarted","Data":"076aec6a57cb3273e082b91b0df068e2c1841eda1e513d6187c54c4ac0ab4978"} Mar 20 08:44:58 crc kubenswrapper[4903]: I0320 08:44:58.517342 4903 generic.go:334] "Generic (PLEG): container finished" podID="1a1287a5-35f8-4d1a-8a4d-e7b30c957c07" containerID="c63eea97f0c9e2d4513fd116c4d53a68a48e802327a50bbf881c9a689353d8fa" exitCode=0 Mar 20 08:44:58 crc kubenswrapper[4903]: I0320 08:44:58.517442 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-8z25d" event={"ID":"1a1287a5-35f8-4d1a-8a4d-e7b30c957c07","Type":"ContainerDied","Data":"c63eea97f0c9e2d4513fd116c4d53a68a48e802327a50bbf881c9a689353d8fa"} Mar 20 08:44:58 crc kubenswrapper[4903]: I0320 08:44:58.518580 4903 generic.go:334] "Generic (PLEG): container 
finished" podID="6214cfc3-afe8-4e2c-aafe-d59d16b108b5" containerID="2e4006812e1ded1b8c19b4cfe7112d6c217a64e73fe57851046d794a40a06e9c" exitCode=0 Mar 20 08:44:58 crc kubenswrapper[4903]: I0320 08:44:58.518613 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-35b8-account-create-update-pdgzs" event={"ID":"6214cfc3-afe8-4e2c-aafe-d59d16b108b5","Type":"ContainerDied","Data":"2e4006812e1ded1b8c19b4cfe7112d6c217a64e73fe57851046d794a40a06e9c"} Mar 20 08:44:58 crc kubenswrapper[4903]: I0320 08:44:58.518673 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-35b8-account-create-update-pdgzs" event={"ID":"6214cfc3-afe8-4e2c-aafe-d59d16b108b5","Type":"ContainerStarted","Data":"b284ad1995a656e0e9d65fff304f64af8888aed5b85bf9b3d389622685961e88"} Mar 20 08:44:58 crc kubenswrapper[4903]: I0320 08:44:58.735620 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-gqltz" Mar 20 08:44:58 crc kubenswrapper[4903]: I0320 08:44:58.814259 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6de2d5fb-9f92-4a35-8264-48353a33895a-config\") pod \"6de2d5fb-9f92-4a35-8264-48353a33895a\" (UID: \"6de2d5fb-9f92-4a35-8264-48353a33895a\") " Mar 20 08:44:58 crc kubenswrapper[4903]: I0320 08:44:58.814315 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6f4q8\" (UniqueName: \"kubernetes.io/projected/6de2d5fb-9f92-4a35-8264-48353a33895a-kube-api-access-6f4q8\") pod \"6de2d5fb-9f92-4a35-8264-48353a33895a\" (UID: \"6de2d5fb-9f92-4a35-8264-48353a33895a\") " Mar 20 08:44:58 crc kubenswrapper[4903]: I0320 08:44:58.814410 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6de2d5fb-9f92-4a35-8264-48353a33895a-ovsdbserver-nb\") pod \"6de2d5fb-9f92-4a35-8264-48353a33895a\" (UID: \"6de2d5fb-9f92-4a35-8264-48353a33895a\") " Mar 20 08:44:58 crc kubenswrapper[4903]: I0320 08:44:58.814441 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6de2d5fb-9f92-4a35-8264-48353a33895a-ovsdbserver-sb\") pod \"6de2d5fb-9f92-4a35-8264-48353a33895a\" (UID: \"6de2d5fb-9f92-4a35-8264-48353a33895a\") " Mar 20 08:44:58 crc kubenswrapper[4903]: I0320 08:44:58.814537 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6de2d5fb-9f92-4a35-8264-48353a33895a-dns-svc\") pod \"6de2d5fb-9f92-4a35-8264-48353a33895a\" (UID: \"6de2d5fb-9f92-4a35-8264-48353a33895a\") " Mar 20 08:44:58 crc kubenswrapper[4903]: I0320 08:44:58.822971 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6de2d5fb-9f92-4a35-8264-48353a33895a-kube-api-access-6f4q8" (OuterVolumeSpecName: "kube-api-access-6f4q8") pod "6de2d5fb-9f92-4a35-8264-48353a33895a" (UID: "6de2d5fb-9f92-4a35-8264-48353a33895a"). InnerVolumeSpecName "kube-api-access-6f4q8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:44:58 crc kubenswrapper[4903]: I0320 08:44:58.858173 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6de2d5fb-9f92-4a35-8264-48353a33895a-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "6de2d5fb-9f92-4a35-8264-48353a33895a" (UID: "6de2d5fb-9f92-4a35-8264-48353a33895a"). 
InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:44:58 crc kubenswrapper[4903]: I0320 08:44:58.862458 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6de2d5fb-9f92-4a35-8264-48353a33895a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6de2d5fb-9f92-4a35-8264-48353a33895a" (UID: "6de2d5fb-9f92-4a35-8264-48353a33895a"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:44:58 crc kubenswrapper[4903]: I0320 08:44:58.868199 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6de2d5fb-9f92-4a35-8264-48353a33895a-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "6de2d5fb-9f92-4a35-8264-48353a33895a" (UID: "6de2d5fb-9f92-4a35-8264-48353a33895a"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:44:58 crc kubenswrapper[4903]: I0320 08:44:58.874082 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6de2d5fb-9f92-4a35-8264-48353a33895a-config" (OuterVolumeSpecName: "config") pod "6de2d5fb-9f92-4a35-8264-48353a33895a" (UID: "6de2d5fb-9f92-4a35-8264-48353a33895a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:44:58 crc kubenswrapper[4903]: I0320 08:44:58.916498 4903 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6de2d5fb-9f92-4a35-8264-48353a33895a-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 08:44:58 crc kubenswrapper[4903]: I0320 08:44:58.916532 4903 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6de2d5fb-9f92-4a35-8264-48353a33895a-config\") on node \"crc\" DevicePath \"\"" Mar 20 08:44:58 crc kubenswrapper[4903]: I0320 08:44:58.916546 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6f4q8\" (UniqueName: \"kubernetes.io/projected/6de2d5fb-9f92-4a35-8264-48353a33895a-kube-api-access-6f4q8\") on node \"crc\" DevicePath \"\"" Mar 20 08:44:58 crc kubenswrapper[4903]: I0320 08:44:58.916560 4903 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6de2d5fb-9f92-4a35-8264-48353a33895a-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 20 08:44:58 crc kubenswrapper[4903]: I0320 08:44:58.916571 4903 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6de2d5fb-9f92-4a35-8264-48353a33895a-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 20 08:44:59 crc kubenswrapper[4903]: I0320 08:44:59.531619 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-gqltz" event={"ID":"6de2d5fb-9f92-4a35-8264-48353a33895a","Type":"ContainerDied","Data":"0c99ff6c0a12334d6ed71d2c74055d0ef8ff18f223b4e69bfcb7136ff560aeaa"} Mar 20 08:44:59 crc kubenswrapper[4903]: I0320 08:44:59.532141 4903 scope.go:117] "RemoveContainer" containerID="61c80e20e2f5f5e7a08057fdd53cd07ee7d988b7e195a4b93be668970fa183e1" Mar 20 08:44:59 crc kubenswrapper[4903]: I0320 08:44:59.531802 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-gqltz" Mar 20 08:44:59 crc kubenswrapper[4903]: I0320 08:44:59.563706 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-gqltz"] Mar 20 08:44:59 crc kubenswrapper[4903]: I0320 08:44:59.571673 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-698758b865-gqltz"] Mar 20 08:44:59 crc kubenswrapper[4903]: I0320 08:44:59.596457 4903 scope.go:117] "RemoveContainer" containerID="ee22835e173d77a00221bda8da6f1d27eacb21f1253f3657c3ecf9a6739e4cb1" Mar 20 08:44:59 crc kubenswrapper[4903]: I0320 08:44:59.997199 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-0a04-account-create-update-br8cg" Mar 20 08:45:00 crc kubenswrapper[4903]: I0320 08:45:00.047180 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0ddb7780-fcec-42d1-811e-5cc8a4169917-operator-scripts\") pod \"0ddb7780-fcec-42d1-811e-5cc8a4169917\" (UID: \"0ddb7780-fcec-42d1-811e-5cc8a4169917\") " Mar 20 08:45:00 crc kubenswrapper[4903]: I0320 08:45:00.047320 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4lj2\" (UniqueName: \"kubernetes.io/projected/0ddb7780-fcec-42d1-811e-5cc8a4169917-kube-api-access-x4lj2\") pod \"0ddb7780-fcec-42d1-811e-5cc8a4169917\" (UID: \"0ddb7780-fcec-42d1-811e-5cc8a4169917\") " Mar 20 08:45:00 crc kubenswrapper[4903]: I0320 08:45:00.047619 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ddb7780-fcec-42d1-811e-5cc8a4169917-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0ddb7780-fcec-42d1-811e-5cc8a4169917" (UID: "0ddb7780-fcec-42d1-811e-5cc8a4169917"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:45:00 crc kubenswrapper[4903]: I0320 08:45:00.049437 4903 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0ddb7780-fcec-42d1-811e-5cc8a4169917-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 08:45:00 crc kubenswrapper[4903]: I0320 08:45:00.078975 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ddb7780-fcec-42d1-811e-5cc8a4169917-kube-api-access-x4lj2" (OuterVolumeSpecName: "kube-api-access-x4lj2") pod "0ddb7780-fcec-42d1-811e-5cc8a4169917" (UID: "0ddb7780-fcec-42d1-811e-5cc8a4169917"). InnerVolumeSpecName "kube-api-access-x4lj2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:45:00 crc kubenswrapper[4903]: I0320 08:45:00.148264 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-4e4e-account-create-update-5n652" Mar 20 08:45:00 crc kubenswrapper[4903]: I0320 08:45:00.150941 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4lj2\" (UniqueName: \"kubernetes.io/projected/0ddb7780-fcec-42d1-811e-5cc8a4169917-kube-api-access-x4lj2\") on node \"crc\" DevicePath \"\"" Mar 20 08:45:00 crc kubenswrapper[4903]: I0320 08:45:00.156233 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-qq45l" Mar 20 08:45:00 crc kubenswrapper[4903]: I0320 08:45:00.165293 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566605-zfl96"] Mar 20 08:45:00 crc kubenswrapper[4903]: E0320 08:45:00.165878 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6de2d5fb-9f92-4a35-8264-48353a33895a" containerName="dnsmasq-dns" Mar 20 08:45:00 crc kubenswrapper[4903]: I0320 08:45:00.165902 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="6de2d5fb-9f92-4a35-8264-48353a33895a" containerName="dnsmasq-dns" Mar 20 08:45:00 crc kubenswrapper[4903]: E0320 08:45:00.165913 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6de2d5fb-9f92-4a35-8264-48353a33895a" containerName="init" Mar 20 08:45:00 crc kubenswrapper[4903]: I0320 08:45:00.165920 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="6de2d5fb-9f92-4a35-8264-48353a33895a" containerName="init" Mar 20 08:45:00 crc kubenswrapper[4903]: E0320 08:45:00.165942 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1b1baed-f9ed-4ec9-8dd0-adc4db771821" containerName="mariadb-database-create" Mar 20 08:45:00 crc kubenswrapper[4903]: I0320 08:45:00.165952 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1b1baed-f9ed-4ec9-8dd0-adc4db771821" containerName="mariadb-database-create" Mar 20 08:45:00 crc kubenswrapper[4903]: E0320 08:45:00.165979 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ddb7780-fcec-42d1-811e-5cc8a4169917" containerName="mariadb-account-create-update" Mar 20 08:45:00 crc kubenswrapper[4903]: I0320 08:45:00.165985 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ddb7780-fcec-42d1-811e-5cc8a4169917" containerName="mariadb-account-create-update" Mar 20 08:45:00 crc kubenswrapper[4903]: E0320 08:45:00.166000 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8abfc827-bdd5-43e9-877c-c3d611fc463e" containerName="mariadb-account-create-update" Mar 20 08:45:00 crc kubenswrapper[4903]: I0320 08:45:00.166006 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="8abfc827-bdd5-43e9-877c-c3d611fc463e" containerName="mariadb-account-create-update" Mar 20 08:45:00 crc kubenswrapper[4903]: I0320 08:45:00.166250 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="b1b1baed-f9ed-4ec9-8dd0-adc4db771821" containerName="mariadb-database-create" Mar 20 08:45:00 crc kubenswrapper[4903]: I0320 08:45:00.166273 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="8abfc827-bdd5-43e9-877c-c3d611fc463e" containerName="mariadb-account-create-update" Mar 20 08:45:00 crc kubenswrapper[4903]: I0320 08:45:00.166286 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="6de2d5fb-9f92-4a35-8264-48353a33895a" containerName="dnsmasq-dns" Mar 20 08:45:00 crc kubenswrapper[4903]: I0320 08:45:00.166301 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ddb7780-fcec-42d1-811e-5cc8a4169917" containerName="mariadb-account-create-update" Mar 20 08:45:00 crc kubenswrapper[4903]: I0320 08:45:00.167022 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566605-zfl96" Mar 20 08:45:00 crc kubenswrapper[4903]: I0320 08:45:00.171152 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 20 08:45:00 crc kubenswrapper[4903]: I0320 08:45:00.171848 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 20 08:45:00 crc kubenswrapper[4903]: I0320 08:45:00.191423 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566605-zfl96"] Mar 20 08:45:00 crc kubenswrapper[4903]: I0320 08:45:00.206185 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-zxbh6" Mar 20 08:45:00 crc kubenswrapper[4903]: I0320 08:45:00.213751 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-8z25d" Mar 20 08:45:00 crc kubenswrapper[4903]: I0320 08:45:00.215883 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-35b8-account-create-update-pdgzs" Mar 20 08:45:00 crc kubenswrapper[4903]: I0320 08:45:00.252457 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hv8z6\" (UniqueName: \"kubernetes.io/projected/8abfc827-bdd5-43e9-877c-c3d611fc463e-kube-api-access-hv8z6\") pod \"8abfc827-bdd5-43e9-877c-c3d611fc463e\" (UID: \"8abfc827-bdd5-43e9-877c-c3d611fc463e\") " Mar 20 08:45:00 crc kubenswrapper[4903]: I0320 08:45:00.252537 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ljkr8\" (UniqueName: \"kubernetes.io/projected/7bc69384-efae-4c4f-be81-591b2cd17538-kube-api-access-ljkr8\") pod \"7bc69384-efae-4c4f-be81-591b2cd17538\" (UID: \"7bc69384-efae-4c4f-be81-591b2cd17538\") " Mar 20 08:45:00 crc kubenswrapper[4903]: I0320 08:45:00.252597 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1a1287a5-35f8-4d1a-8a4d-e7b30c957c07-operator-scripts\") pod \"1a1287a5-35f8-4d1a-8a4d-e7b30c957c07\" (UID: \"1a1287a5-35f8-4d1a-8a4d-e7b30c957c07\") " Mar 20 08:45:00 crc kubenswrapper[4903]: I0320 08:45:00.252622 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8abfc827-bdd5-43e9-877c-c3d611fc463e-operator-scripts\") pod \"8abfc827-bdd5-43e9-877c-c3d611fc463e\" (UID: \"8abfc827-bdd5-43e9-877c-c3d611fc463e\") " Mar 20 08:45:00 crc kubenswrapper[4903]: I0320 08:45:00.252664 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tgxtg\" (UniqueName: \"kubernetes.io/projected/1a1287a5-35f8-4d1a-8a4d-e7b30c957c07-kube-api-access-tgxtg\") pod \"1a1287a5-35f8-4d1a-8a4d-e7b30c957c07\" (UID: \"1a1287a5-35f8-4d1a-8a4d-e7b30c957c07\") " Mar 20 08:45:00 crc kubenswrapper[4903]: I0320 08:45:00.252685 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wcz2q\" (UniqueName: \"kubernetes.io/projected/6214cfc3-afe8-4e2c-aafe-d59d16b108b5-kube-api-access-wcz2q\") pod \"6214cfc3-afe8-4e2c-aafe-d59d16b108b5\" (UID: \"6214cfc3-afe8-4e2c-aafe-d59d16b108b5\") " Mar 20 08:45:00 crc kubenswrapper[4903]: I0320 08:45:00.253191 4903 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1a1287a5-35f8-4d1a-8a4d-e7b30c957c07-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1a1287a5-35f8-4d1a-8a4d-e7b30c957c07" (UID: "1a1287a5-35f8-4d1a-8a4d-e7b30c957c07"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:45:00 crc kubenswrapper[4903]: I0320 08:45:00.253192 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8abfc827-bdd5-43e9-877c-c3d611fc463e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8abfc827-bdd5-43e9-877c-c3d611fc463e" (UID: "8abfc827-bdd5-43e9-877c-c3d611fc463e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:45:00 crc kubenswrapper[4903]: I0320 08:45:00.253734 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b1b1baed-f9ed-4ec9-8dd0-adc4db771821-operator-scripts\") pod \"b1b1baed-f9ed-4ec9-8dd0-adc4db771821\" (UID: \"b1b1baed-f9ed-4ec9-8dd0-adc4db771821\") " Mar 20 08:45:00 crc kubenswrapper[4903]: I0320 08:45:00.253771 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7bc69384-efae-4c4f-be81-591b2cd17538-operator-scripts\") pod \"7bc69384-efae-4c4f-be81-591b2cd17538\" (UID: \"7bc69384-efae-4c4f-be81-591b2cd17538\") " Mar 20 08:45:00 crc kubenswrapper[4903]: I0320 08:45:00.253813 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f8zjt\" (UniqueName: \"kubernetes.io/projected/b1b1baed-f9ed-4ec9-8dd0-adc4db771821-kube-api-access-f8zjt\") pod \"b1b1baed-f9ed-4ec9-8dd0-adc4db771821\" (UID: \"b1b1baed-f9ed-4ec9-8dd0-adc4db771821\") " Mar 20 08:45:00 crc kubenswrapper[4903]: I0320 08:45:00.253847 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6214cfc3-afe8-4e2c-aafe-d59d16b108b5-operator-scripts\") pod \"6214cfc3-afe8-4e2c-aafe-d59d16b108b5\" (UID: \"6214cfc3-afe8-4e2c-aafe-d59d16b108b5\") " Mar 20 08:45:00 crc kubenswrapper[4903]: I0320 08:45:00.254136 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d4291455-2eba-41a8-863f-ae9437f489fa-secret-volume\") pod \"collect-profiles-29566605-zfl96\" (UID: \"d4291455-2eba-41a8-863f-ae9437f489fa\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566605-zfl96" Mar 20 08:45:00 crc kubenswrapper[4903]: I0320 08:45:00.254196 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d4291455-2eba-41a8-863f-ae9437f489fa-config-volume\") pod \"collect-profiles-29566605-zfl96\" (UID: \"d4291455-2eba-41a8-863f-ae9437f489fa\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566605-zfl96" Mar 20 08:45:00 crc kubenswrapper[4903]: I0320 08:45:00.254258 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nmcxl\" (UniqueName: \"kubernetes.io/projected/d4291455-2eba-41a8-863f-ae9437f489fa-kube-api-access-nmcxl\") pod \"collect-profiles-29566605-zfl96\" (UID: \"d4291455-2eba-41a8-863f-ae9437f489fa\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29566605-zfl96" Mar 20 08:45:00 crc kubenswrapper[4903]: I0320 08:45:00.254369 4903 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1a1287a5-35f8-4d1a-8a4d-e7b30c957c07-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 08:45:00 crc kubenswrapper[4903]: I0320 08:45:00.254387 4903 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8abfc827-bdd5-43e9-877c-c3d611fc463e-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 08:45:00 crc kubenswrapper[4903]: I0320 08:45:00.254494 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bc69384-efae-4c4f-be81-591b2cd17538-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7bc69384-efae-4c4f-be81-591b2cd17538" (UID: "7bc69384-efae-4c4f-be81-591b2cd17538"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:45:00 crc kubenswrapper[4903]: I0320 08:45:00.254484 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b1b1baed-f9ed-4ec9-8dd0-adc4db771821-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b1b1baed-f9ed-4ec9-8dd0-adc4db771821" (UID: "b1b1baed-f9ed-4ec9-8dd0-adc4db771821"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:45:00 crc kubenswrapper[4903]: I0320 08:45:00.254775 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6214cfc3-afe8-4e2c-aafe-d59d16b108b5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6214cfc3-afe8-4e2c-aafe-d59d16b108b5" (UID: "6214cfc3-afe8-4e2c-aafe-d59d16b108b5"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:45:00 crc kubenswrapper[4903]: I0320 08:45:00.259980 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1b1baed-f9ed-4ec9-8dd0-adc4db771821-kube-api-access-f8zjt" (OuterVolumeSpecName: "kube-api-access-f8zjt") pod "b1b1baed-f9ed-4ec9-8dd0-adc4db771821" (UID: "b1b1baed-f9ed-4ec9-8dd0-adc4db771821"). InnerVolumeSpecName "kube-api-access-f8zjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:45:00 crc kubenswrapper[4903]: I0320 08:45:00.260922 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1a1287a5-35f8-4d1a-8a4d-e7b30c957c07-kube-api-access-tgxtg" (OuterVolumeSpecName: "kube-api-access-tgxtg") pod "1a1287a5-35f8-4d1a-8a4d-e7b30c957c07" (UID: "1a1287a5-35f8-4d1a-8a4d-e7b30c957c07"). InnerVolumeSpecName "kube-api-access-tgxtg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:45:00 crc kubenswrapper[4903]: I0320 08:45:00.263054 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6214cfc3-afe8-4e2c-aafe-d59d16b108b5-kube-api-access-wcz2q" (OuterVolumeSpecName: "kube-api-access-wcz2q") pod "6214cfc3-afe8-4e2c-aafe-d59d16b108b5" (UID: "6214cfc3-afe8-4e2c-aafe-d59d16b108b5"). InnerVolumeSpecName "kube-api-access-wcz2q". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:45:00 crc kubenswrapper[4903]: I0320 08:45:00.265293 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bc69384-efae-4c4f-be81-591b2cd17538-kube-api-access-ljkr8" (OuterVolumeSpecName: "kube-api-access-ljkr8") pod "7bc69384-efae-4c4f-be81-591b2cd17538" (UID: "7bc69384-efae-4c4f-be81-591b2cd17538"). InnerVolumeSpecName "kube-api-access-ljkr8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:45:00 crc kubenswrapper[4903]: I0320 08:45:00.265331 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8abfc827-bdd5-43e9-877c-c3d611fc463e-kube-api-access-hv8z6" (OuterVolumeSpecName: "kube-api-access-hv8z6") pod "8abfc827-bdd5-43e9-877c-c3d611fc463e" (UID: "8abfc827-bdd5-43e9-877c-c3d611fc463e"). InnerVolumeSpecName "kube-api-access-hv8z6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:45:00 crc kubenswrapper[4903]: I0320 08:45:00.355783 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d4291455-2eba-41a8-863f-ae9437f489fa-secret-volume\") pod \"collect-profiles-29566605-zfl96\" (UID: \"d4291455-2eba-41a8-863f-ae9437f489fa\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566605-zfl96" Mar 20 08:45:00 crc kubenswrapper[4903]: I0320 08:45:00.356177 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d4291455-2eba-41a8-863f-ae9437f489fa-config-volume\") pod \"collect-profiles-29566605-zfl96\" (UID: \"d4291455-2eba-41a8-863f-ae9437f489fa\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566605-zfl96" Mar 20 08:45:00 crc kubenswrapper[4903]: I0320 08:45:00.356256 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nmcxl\" (UniqueName: \"kubernetes.io/projected/d4291455-2eba-41a8-863f-ae9437f489fa-kube-api-access-nmcxl\") pod \"collect-profiles-29566605-zfl96\" (UID: \"d4291455-2eba-41a8-863f-ae9437f489fa\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566605-zfl96" Mar 20 08:45:00 crc kubenswrapper[4903]: I0320 08:45:00.356362 4903 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b1b1baed-f9ed-4ec9-8dd0-adc4db771821-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 08:45:00 crc kubenswrapper[4903]: I0320 08:45:00.356374 4903 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7bc69384-efae-4c4f-be81-591b2cd17538-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 08:45:00 crc kubenswrapper[4903]: I0320 08:45:00.356383 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f8zjt\" (UniqueName: \"kubernetes.io/projected/b1b1baed-f9ed-4ec9-8dd0-adc4db771821-kube-api-access-f8zjt\") on node \"crc\" DevicePath \"\"" Mar 20 08:45:00 crc kubenswrapper[4903]: I0320 08:45:00.356395 4903 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6214cfc3-afe8-4e2c-aafe-d59d16b108b5-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 08:45:00 crc kubenswrapper[4903]: I0320 08:45:00.356405 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hv8z6\" (UniqueName: 
\"kubernetes.io/projected/8abfc827-bdd5-43e9-877c-c3d611fc463e-kube-api-access-hv8z6\") on node \"crc\" DevicePath \"\"" Mar 20 08:45:00 crc kubenswrapper[4903]: I0320 08:45:00.356414 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ljkr8\" (UniqueName: \"kubernetes.io/projected/7bc69384-efae-4c4f-be81-591b2cd17538-kube-api-access-ljkr8\") on node \"crc\" DevicePath \"\"" Mar 20 08:45:00 crc kubenswrapper[4903]: I0320 08:45:00.356423 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tgxtg\" (UniqueName: \"kubernetes.io/projected/1a1287a5-35f8-4d1a-8a4d-e7b30c957c07-kube-api-access-tgxtg\") on node \"crc\" DevicePath \"\"" Mar 20 08:45:00 crc kubenswrapper[4903]: I0320 08:45:00.356432 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wcz2q\" (UniqueName: \"kubernetes.io/projected/6214cfc3-afe8-4e2c-aafe-d59d16b108b5-kube-api-access-wcz2q\") on node \"crc\" DevicePath \"\"" Mar 20 08:45:00 crc kubenswrapper[4903]: I0320 08:45:00.360027 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d4291455-2eba-41a8-863f-ae9437f489fa-config-volume\") pod \"collect-profiles-29566605-zfl96\" (UID: \"d4291455-2eba-41a8-863f-ae9437f489fa\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566605-zfl96" Mar 20 08:45:00 crc kubenswrapper[4903]: I0320 08:45:00.369697 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d4291455-2eba-41a8-863f-ae9437f489fa-secret-volume\") pod \"collect-profiles-29566605-zfl96\" (UID: \"d4291455-2eba-41a8-863f-ae9437f489fa\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566605-zfl96" Mar 20 08:45:00 crc kubenswrapper[4903]: I0320 08:45:00.374549 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nmcxl\" (UniqueName: \"kubernetes.io/projected/d4291455-2eba-41a8-863f-ae9437f489fa-kube-api-access-nmcxl\") pod \"collect-profiles-29566605-zfl96\" (UID: \"d4291455-2eba-41a8-863f-ae9437f489fa\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566605-zfl96" Mar 20 08:45:00 crc kubenswrapper[4903]: I0320 08:45:00.488858 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566605-zfl96" Mar 20 08:45:00 crc kubenswrapper[4903]: I0320 08:45:00.543646 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-35b8-account-create-update-pdgzs" event={"ID":"6214cfc3-afe8-4e2c-aafe-d59d16b108b5","Type":"ContainerDied","Data":"b284ad1995a656e0e9d65fff304f64af8888aed5b85bf9b3d389622685961e88"} Mar 20 08:45:00 crc kubenswrapper[4903]: I0320 08:45:00.543689 4903 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b284ad1995a656e0e9d65fff304f64af8888aed5b85bf9b3d389622685961e88" Mar 20 08:45:00 crc kubenswrapper[4903]: I0320 08:45:00.543748 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-35b8-account-create-update-pdgzs" Mar 20 08:45:00 crc kubenswrapper[4903]: I0320 08:45:00.555696 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-0a04-account-create-update-br8cg" event={"ID":"0ddb7780-fcec-42d1-811e-5cc8a4169917","Type":"ContainerDied","Data":"efb5a0ac8d805a42d55384cb31ea0df86d41974ed723b438d1b5bd9ac4dc5a33"} Mar 20 08:45:00 crc kubenswrapper[4903]: I0320 08:45:00.555738 4903 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="efb5a0ac8d805a42d55384cb31ea0df86d41974ed723b438d1b5bd9ac4dc5a33" Mar 20 08:45:00 crc kubenswrapper[4903]: I0320 08:45:00.555842 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-0a04-account-create-update-br8cg" Mar 20 08:45:00 crc kubenswrapper[4903]: I0320 08:45:00.566817 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-4e4e-account-create-update-5n652" event={"ID":"8abfc827-bdd5-43e9-877c-c3d611fc463e","Type":"ContainerDied","Data":"076aec6a57cb3273e082b91b0df068e2c1841eda1e513d6187c54c4ac0ab4978"} Mar 20 08:45:00 crc kubenswrapper[4903]: I0320 08:45:00.566863 4903 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="076aec6a57cb3273e082b91b0df068e2c1841eda1e513d6187c54c4ac0ab4978" Mar 20 08:45:00 crc kubenswrapper[4903]: I0320 08:45:00.566936 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-4e4e-account-create-update-5n652" Mar 20 08:45:00 crc kubenswrapper[4903]: I0320 08:45:00.602798 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-qq45l" event={"ID":"b1b1baed-f9ed-4ec9-8dd0-adc4db771821","Type":"ContainerDied","Data":"d4cca650711b31bea467f367fc2149975bc9d4deb53b1c41a0c62135a69a7f2a"} Mar 20 08:45:00 crc kubenswrapper[4903]: I0320 08:45:00.602844 4903 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d4cca650711b31bea467f367fc2149975bc9d4deb53b1c41a0c62135a69a7f2a" Mar 20 08:45:00 crc kubenswrapper[4903]: I0320 08:45:00.602923 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-qq45l" Mar 20 08:45:00 crc kubenswrapper[4903]: I0320 08:45:00.623916 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-zxbh6" event={"ID":"7bc69384-efae-4c4f-be81-591b2cd17538","Type":"ContainerDied","Data":"0ebf152f7bb1da34751e6525c1bed6839c82b3f6472ca1fc7cb00fc8471b3f15"} Mar 20 08:45:00 crc kubenswrapper[4903]: I0320 08:45:00.623965 4903 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0ebf152f7bb1da34751e6525c1bed6839c82b3f6472ca1fc7cb00fc8471b3f15" Mar 20 08:45:00 crc kubenswrapper[4903]: I0320 08:45:00.624046 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-zxbh6" Mar 20 08:45:00 crc kubenswrapper[4903]: I0320 08:45:00.630432 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-8z25d" event={"ID":"1a1287a5-35f8-4d1a-8a4d-e7b30c957c07","Type":"ContainerDied","Data":"fc76b13e1d70ac4b656efe57cc839d97622817c718b4375fbf5b8dbdeaf8d37d"} Mar 20 08:45:00 crc kubenswrapper[4903]: I0320 08:45:00.630577 4903 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fc76b13e1d70ac4b656efe57cc839d97622817c718b4375fbf5b8dbdeaf8d37d" Mar 20 08:45:00 crc kubenswrapper[4903]: I0320 08:45:00.630701 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-8z25d" Mar 20 08:45:01 crc kubenswrapper[4903]: I0320 08:45:01.513788 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6de2d5fb-9f92-4a35-8264-48353a33895a" path="/var/lib/kubelet/pods/6de2d5fb-9f92-4a35-8264-48353a33895a/volumes" Mar 20 08:45:03 crc kubenswrapper[4903]: I0320 08:45:03.847701 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566605-zfl96"] Mar 20 08:45:04 crc kubenswrapper[4903]: I0320 08:45:04.670572 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-kt4gk" event={"ID":"feeda47f-bf82-4e99-a704-b405a817bb1d","Type":"ContainerStarted","Data":"dbce3ffc947f1f39ec5ef071fa419c575a3bcb33a8993c03be727d2e31281d58"} Mar 20 08:45:04 crc kubenswrapper[4903]: I0320 08:45:04.673335 4903 generic.go:334] "Generic (PLEG): container finished" podID="d4291455-2eba-41a8-863f-ae9437f489fa" containerID="5bd11fcc8be2dd4fe83248c58004973b1c1be546b7467fd89ebc25a537120624" exitCode=0 Mar 20 08:45:04 crc kubenswrapper[4903]: I0320 08:45:04.673389 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566605-zfl96" event={"ID":"d4291455-2eba-41a8-863f-ae9437f489fa","Type":"ContainerDied","Data":"5bd11fcc8be2dd4fe83248c58004973b1c1be546b7467fd89ebc25a537120624"} Mar 20 08:45:04 crc kubenswrapper[4903]: I0320 08:45:04.673417 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566605-zfl96" event={"ID":"d4291455-2eba-41a8-863f-ae9437f489fa","Type":"ContainerStarted","Data":"bbc2aab5615dc7b586d29ceb9e814824ace91aa3f9d93534788304b78cad9c1f"} Mar 20 08:45:04 crc kubenswrapper[4903]: I0320 08:45:04.691554 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-kt4gk" podStartSLOduration=2.868083769 podStartE2EDuration="8.691537246s" podCreationTimestamp="2026-03-20 08:44:56 +0000 UTC" firstStartedPulling="2026-03-20 08:44:57.62419887 +0000 UTC m=+1322.841099195" lastFinishedPulling="2026-03-20 08:45:03.447652357 +0000 UTC m=+1328.664552672" observedRunningTime="2026-03-20 08:45:04.688636836 +0000 UTC m=+1329.905537151" watchObservedRunningTime="2026-03-20 08:45:04.691537246 +0000 UTC m=+1329.908437561" Mar 20 08:45:06 crc kubenswrapper[4903]: I0320 08:45:06.058552 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566605-zfl96" Mar 20 08:45:06 crc kubenswrapper[4903]: I0320 08:45:06.164070 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d4291455-2eba-41a8-863f-ae9437f489fa-config-volume\") pod \"d4291455-2eba-41a8-863f-ae9437f489fa\" (UID: \"d4291455-2eba-41a8-863f-ae9437f489fa\") " Mar 20 08:45:06 crc kubenswrapper[4903]: I0320 08:45:06.164217 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nmcxl\" (UniqueName: \"kubernetes.io/projected/d4291455-2eba-41a8-863f-ae9437f489fa-kube-api-access-nmcxl\") pod \"d4291455-2eba-41a8-863f-ae9437f489fa\" (UID: \"d4291455-2eba-41a8-863f-ae9437f489fa\") " Mar 20 08:45:06 crc kubenswrapper[4903]: I0320 08:45:06.164284 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d4291455-2eba-41a8-863f-ae9437f489fa-secret-volume\") pod \"d4291455-2eba-41a8-863f-ae9437f489fa\" (UID: \"d4291455-2eba-41a8-863f-ae9437f489fa\") " Mar 20 08:45:06 crc kubenswrapper[4903]: I0320 08:45:06.165822 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d4291455-2eba-41a8-863f-ae9437f489fa-config-volume" (OuterVolumeSpecName: "config-volume") pod "d4291455-2eba-41a8-863f-ae9437f489fa" (UID: "d4291455-2eba-41a8-863f-ae9437f489fa"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:45:06 crc kubenswrapper[4903]: I0320 08:45:06.172079 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4291455-2eba-41a8-863f-ae9437f489fa-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "d4291455-2eba-41a8-863f-ae9437f489fa" (UID: "d4291455-2eba-41a8-863f-ae9437f489fa"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:45:06 crc kubenswrapper[4903]: I0320 08:45:06.172944 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d4291455-2eba-41a8-863f-ae9437f489fa-kube-api-access-nmcxl" (OuterVolumeSpecName: "kube-api-access-nmcxl") pod "d4291455-2eba-41a8-863f-ae9437f489fa" (UID: "d4291455-2eba-41a8-863f-ae9437f489fa"). InnerVolumeSpecName "kube-api-access-nmcxl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:45:06 crc kubenswrapper[4903]: I0320 08:45:06.266935 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nmcxl\" (UniqueName: \"kubernetes.io/projected/d4291455-2eba-41a8-863f-ae9437f489fa-kube-api-access-nmcxl\") on node \"crc\" DevicePath \"\"" Mar 20 08:45:06 crc kubenswrapper[4903]: I0320 08:45:06.267003 4903 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d4291455-2eba-41a8-863f-ae9437f489fa-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 20 08:45:06 crc kubenswrapper[4903]: I0320 08:45:06.267023 4903 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d4291455-2eba-41a8-863f-ae9437f489fa-config-volume\") on node \"crc\" DevicePath \"\"" Mar 20 08:45:06 crc kubenswrapper[4903]: I0320 08:45:06.694556 4903 generic.go:334] "Generic (PLEG): container finished" podID="feeda47f-bf82-4e99-a704-b405a817bb1d" containerID="dbce3ffc947f1f39ec5ef071fa419c575a3bcb33a8993c03be727d2e31281d58" exitCode=0 Mar 20 08:45:06 crc kubenswrapper[4903]: I0320 08:45:06.694630 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-kt4gk" event={"ID":"feeda47f-bf82-4e99-a704-b405a817bb1d","Type":"ContainerDied","Data":"dbce3ffc947f1f39ec5ef071fa419c575a3bcb33a8993c03be727d2e31281d58"} Mar 20 08:45:06 crc kubenswrapper[4903]: I0320 08:45:06.697201 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566605-zfl96" event={"ID":"d4291455-2eba-41a8-863f-ae9437f489fa","Type":"ContainerDied","Data":"bbc2aab5615dc7b586d29ceb9e814824ace91aa3f9d93534788304b78cad9c1f"} Mar 20 08:45:06 crc kubenswrapper[4903]: I0320 08:45:06.697228 4903 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bbc2aab5615dc7b586d29ceb9e814824ace91aa3f9d93534788304b78cad9c1f" Mar 20 08:45:06 crc kubenswrapper[4903]: I0320 08:45:06.697331 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566605-zfl96" Mar 20 08:45:08 crc kubenswrapper[4903]: I0320 08:45:08.133246 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-kt4gk" Mar 20 08:45:08 crc kubenswrapper[4903]: I0320 08:45:08.208367 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nk5d8\" (UniqueName: \"kubernetes.io/projected/feeda47f-bf82-4e99-a704-b405a817bb1d-kube-api-access-nk5d8\") pod \"feeda47f-bf82-4e99-a704-b405a817bb1d\" (UID: \"feeda47f-bf82-4e99-a704-b405a817bb1d\") " Mar 20 08:45:08 crc kubenswrapper[4903]: I0320 08:45:08.208500 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/feeda47f-bf82-4e99-a704-b405a817bb1d-combined-ca-bundle\") pod \"feeda47f-bf82-4e99-a704-b405a817bb1d\" (UID: \"feeda47f-bf82-4e99-a704-b405a817bb1d\") " Mar 20 08:45:08 crc kubenswrapper[4903]: I0320 08:45:08.208547 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/feeda47f-bf82-4e99-a704-b405a817bb1d-config-data\") pod \"feeda47f-bf82-4e99-a704-b405a817bb1d\" (UID: \"feeda47f-bf82-4e99-a704-b405a817bb1d\") " Mar 20 08:45:08 crc kubenswrapper[4903]: I0320 08:45:08.215963 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/feeda47f-bf82-4e99-a704-b405a817bb1d-kube-api-access-nk5d8" (OuterVolumeSpecName: "kube-api-access-nk5d8") pod "feeda47f-bf82-4e99-a704-b405a817bb1d" (UID: "feeda47f-bf82-4e99-a704-b405a817bb1d"). InnerVolumeSpecName "kube-api-access-nk5d8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:45:08 crc kubenswrapper[4903]: I0320 08:45:08.234017 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/feeda47f-bf82-4e99-a704-b405a817bb1d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "feeda47f-bf82-4e99-a704-b405a817bb1d" (UID: "feeda47f-bf82-4e99-a704-b405a817bb1d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:45:08 crc kubenswrapper[4903]: I0320 08:45:08.251491 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/feeda47f-bf82-4e99-a704-b405a817bb1d-config-data" (OuterVolumeSpecName: "config-data") pod "feeda47f-bf82-4e99-a704-b405a817bb1d" (UID: "feeda47f-bf82-4e99-a704-b405a817bb1d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:45:08 crc kubenswrapper[4903]: I0320 08:45:08.310659 4903 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/feeda47f-bf82-4e99-a704-b405a817bb1d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 08:45:08 crc kubenswrapper[4903]: I0320 08:45:08.310710 4903 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/feeda47f-bf82-4e99-a704-b405a817bb1d-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 08:45:08 crc kubenswrapper[4903]: I0320 08:45:08.310727 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nk5d8\" (UniqueName: \"kubernetes.io/projected/feeda47f-bf82-4e99-a704-b405a817bb1d-kube-api-access-nk5d8\") on node \"crc\" DevicePath \"\"" Mar 20 08:45:08 crc kubenswrapper[4903]: I0320 08:45:08.726185 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-kt4gk" event={"ID":"feeda47f-bf82-4e99-a704-b405a817bb1d","Type":"ContainerDied","Data":"e0b87b0989dfe1d272e002dfdb9821facdcd560093625b1b6fb938d1b9f76288"} Mar 20 08:45:08 crc kubenswrapper[4903]: I0320 08:45:08.726737 4903 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e0b87b0989dfe1d272e002dfdb9821facdcd560093625b1b6fb938d1b9f76288" Mar 20 08:45:08 crc kubenswrapper[4903]: I0320 08:45:08.726302 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-kt4gk" Mar 20 08:45:09 crc kubenswrapper[4903]: I0320 08:45:09.095228 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-gcq77"] Mar 20 08:45:09 crc kubenswrapper[4903]: E0320 08:45:09.095764 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4291455-2eba-41a8-863f-ae9437f489fa" containerName="collect-profiles" Mar 20 08:45:09 crc kubenswrapper[4903]: I0320 08:45:09.095787 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4291455-2eba-41a8-863f-ae9437f489fa" containerName="collect-profiles" Mar 20 08:45:09 crc kubenswrapper[4903]: E0320 08:45:09.095811 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a1287a5-35f8-4d1a-8a4d-e7b30c957c07" containerName="mariadb-database-create" Mar 20 08:45:09 crc kubenswrapper[4903]: I0320 08:45:09.095819 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a1287a5-35f8-4d1a-8a4d-e7b30c957c07" containerName="mariadb-database-create" Mar 20 08:45:09 crc kubenswrapper[4903]: E0320 08:45:09.095849 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7bc69384-efae-4c4f-be81-591b2cd17538" containerName="mariadb-database-create" Mar 20 08:45:09 crc kubenswrapper[4903]: I0320 08:45:09.095858 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="7bc69384-efae-4c4f-be81-591b2cd17538" containerName="mariadb-database-create" Mar 20 08:45:09 crc kubenswrapper[4903]: E0320 08:45:09.095870 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="feeda47f-bf82-4e99-a704-b405a817bb1d" containerName="keystone-db-sync" Mar 20 08:45:09 crc kubenswrapper[4903]: I0320 08:45:09.095876 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="feeda47f-bf82-4e99-a704-b405a817bb1d" containerName="keystone-db-sync" Mar 20 08:45:09 crc kubenswrapper[4903]: E0320 08:45:09.095900 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6214cfc3-afe8-4e2c-aafe-d59d16b108b5" 
containerName="mariadb-account-create-update" Mar 20 08:45:09 crc kubenswrapper[4903]: I0320 08:45:09.095907 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="6214cfc3-afe8-4e2c-aafe-d59d16b108b5" containerName="mariadb-account-create-update" Mar 20 08:45:09 crc kubenswrapper[4903]: I0320 08:45:09.096162 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="feeda47f-bf82-4e99-a704-b405a817bb1d" containerName="keystone-db-sync" Mar 20 08:45:09 crc kubenswrapper[4903]: I0320 08:45:09.096178 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="7bc69384-efae-4c4f-be81-591b2cd17538" containerName="mariadb-database-create" Mar 20 08:45:09 crc kubenswrapper[4903]: I0320 08:45:09.096191 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="6214cfc3-afe8-4e2c-aafe-d59d16b108b5" containerName="mariadb-account-create-update" Mar 20 08:45:09 crc kubenswrapper[4903]: I0320 08:45:09.096202 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a1287a5-35f8-4d1a-8a4d-e7b30c957c07" containerName="mariadb-database-create" Mar 20 08:45:09 crc kubenswrapper[4903]: I0320 08:45:09.096213 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4291455-2eba-41a8-863f-ae9437f489fa" containerName="collect-profiles" Mar 20 08:45:09 crc kubenswrapper[4903]: I0320 08:45:09.097465 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-gcq77" Mar 20 08:45:09 crc kubenswrapper[4903]: I0320 08:45:09.113003 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-hzcp9"] Mar 20 08:45:09 crc kubenswrapper[4903]: I0320 08:45:09.124845 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-hzcp9" Mar 20 08:45:09 crc kubenswrapper[4903]: I0320 08:45:09.129542 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 20 08:45:09 crc kubenswrapper[4903]: I0320 08:45:09.137260 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d8c9a0bd-5bbd-4422-8162-b5f680ee1c7a-dns-svc\") pod \"dnsmasq-dns-847c4cc679-gcq77\" (UID: \"d8c9a0bd-5bbd-4422-8162-b5f680ee1c7a\") " pod="openstack/dnsmasq-dns-847c4cc679-gcq77" Mar 20 08:45:09 crc kubenswrapper[4903]: I0320 08:45:09.137345 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d8c9a0bd-5bbd-4422-8162-b5f680ee1c7a-dns-swift-storage-0\") pod \"dnsmasq-dns-847c4cc679-gcq77\" (UID: \"d8c9a0bd-5bbd-4422-8162-b5f680ee1c7a\") " pod="openstack/dnsmasq-dns-847c4cc679-gcq77" Mar 20 08:45:09 crc kubenswrapper[4903]: I0320 08:45:09.137379 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d8c9a0bd-5bbd-4422-8162-b5f680ee1c7a-config\") pod \"dnsmasq-dns-847c4cc679-gcq77\" (UID: \"d8c9a0bd-5bbd-4422-8162-b5f680ee1c7a\") " pod="openstack/dnsmasq-dns-847c4cc679-gcq77" Mar 20 08:45:09 crc kubenswrapper[4903]: I0320 08:45:09.137410 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d8c9a0bd-5bbd-4422-8162-b5f680ee1c7a-ovsdbserver-nb\") pod \"dnsmasq-dns-847c4cc679-gcq77\" (UID: \"d8c9a0bd-5bbd-4422-8162-b5f680ee1c7a\") " 
pod="openstack/dnsmasq-dns-847c4cc679-gcq77" Mar 20 08:45:09 crc kubenswrapper[4903]: I0320 08:45:09.137574 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xbnh8\" (UniqueName: \"kubernetes.io/projected/d8c9a0bd-5bbd-4422-8162-b5f680ee1c7a-kube-api-access-xbnh8\") pod \"dnsmasq-dns-847c4cc679-gcq77\" (UID: \"d8c9a0bd-5bbd-4422-8162-b5f680ee1c7a\") " pod="openstack/dnsmasq-dns-847c4cc679-gcq77" Mar 20 08:45:09 crc kubenswrapper[4903]: I0320 08:45:09.137601 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d8c9a0bd-5bbd-4422-8162-b5f680ee1c7a-ovsdbserver-sb\") pod \"dnsmasq-dns-847c4cc679-gcq77\" (UID: \"d8c9a0bd-5bbd-4422-8162-b5f680ee1c7a\") " pod="openstack/dnsmasq-dns-847c4cc679-gcq77" Mar 20 08:45:09 crc kubenswrapper[4903]: I0320 08:45:09.143611 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 20 08:45:09 crc kubenswrapper[4903]: I0320 08:45:09.144099 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-rzqbd" Mar 20 08:45:09 crc kubenswrapper[4903]: I0320 08:45:09.144789 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 20 08:45:09 crc kubenswrapper[4903]: I0320 08:45:09.149021 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 20 08:45:09 crc kubenswrapper[4903]: I0320 08:45:09.209846 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-gcq77"] Mar 20 08:45:09 crc kubenswrapper[4903]: I0320 08:45:09.239204 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d8c9a0bd-5bbd-4422-8162-b5f680ee1c7a-dns-svc\") pod \"dnsmasq-dns-847c4cc679-gcq77\" (UID: \"d8c9a0bd-5bbd-4422-8162-b5f680ee1c7a\") " pod="openstack/dnsmasq-dns-847c4cc679-gcq77" Mar 20 08:45:09 crc kubenswrapper[4903]: I0320 08:45:09.239268 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d8c9a0bd-5bbd-4422-8162-b5f680ee1c7a-dns-swift-storage-0\") pod \"dnsmasq-dns-847c4cc679-gcq77\" (UID: \"d8c9a0bd-5bbd-4422-8162-b5f680ee1c7a\") " pod="openstack/dnsmasq-dns-847c4cc679-gcq77" Mar 20 08:45:09 crc kubenswrapper[4903]: I0320 08:45:09.239294 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0b5652ae-2905-486a-897b-27c2b70bab5d-scripts\") pod \"keystone-bootstrap-hzcp9\" (UID: \"0b5652ae-2905-486a-897b-27c2b70bab5d\") " pod="openstack/keystone-bootstrap-hzcp9" Mar 20 08:45:09 crc kubenswrapper[4903]: I0320 08:45:09.239314 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d8c9a0bd-5bbd-4422-8162-b5f680ee1c7a-config\") pod \"dnsmasq-dns-847c4cc679-gcq77\" (UID: \"d8c9a0bd-5bbd-4422-8162-b5f680ee1c7a\") " pod="openstack/dnsmasq-dns-847c4cc679-gcq77" Mar 20 08:45:09 crc kubenswrapper[4903]: I0320 08:45:09.239333 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d8c9a0bd-5bbd-4422-8162-b5f680ee1c7a-ovsdbserver-nb\") pod \"dnsmasq-dns-847c4cc679-gcq77\" (UID: 
\"d8c9a0bd-5bbd-4422-8162-b5f680ee1c7a\") " pod="openstack/dnsmasq-dns-847c4cc679-gcq77" Mar 20 08:45:09 crc kubenswrapper[4903]: I0320 08:45:09.239379 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0b5652ae-2905-486a-897b-27c2b70bab5d-credential-keys\") pod \"keystone-bootstrap-hzcp9\" (UID: \"0b5652ae-2905-486a-897b-27c2b70bab5d\") " pod="openstack/keystone-bootstrap-hzcp9" Mar 20 08:45:09 crc kubenswrapper[4903]: I0320 08:45:09.239424 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b5652ae-2905-486a-897b-27c2b70bab5d-config-data\") pod \"keystone-bootstrap-hzcp9\" (UID: \"0b5652ae-2905-486a-897b-27c2b70bab5d\") " pod="openstack/keystone-bootstrap-hzcp9" Mar 20 08:45:09 crc kubenswrapper[4903]: I0320 08:45:09.239447 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xbnh8\" (UniqueName: \"kubernetes.io/projected/d8c9a0bd-5bbd-4422-8162-b5f680ee1c7a-kube-api-access-xbnh8\") pod \"dnsmasq-dns-847c4cc679-gcq77\" (UID: \"d8c9a0bd-5bbd-4422-8162-b5f680ee1c7a\") " pod="openstack/dnsmasq-dns-847c4cc679-gcq77" Mar 20 08:45:09 crc kubenswrapper[4903]: I0320 08:45:09.239465 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d8c9a0bd-5bbd-4422-8162-b5f680ee1c7a-ovsdbserver-sb\") pod \"dnsmasq-dns-847c4cc679-gcq77\" (UID: \"d8c9a0bd-5bbd-4422-8162-b5f680ee1c7a\") " pod="openstack/dnsmasq-dns-847c4cc679-gcq77" Mar 20 08:45:09 crc kubenswrapper[4903]: I0320 08:45:09.239485 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cx76d\" (UniqueName: \"kubernetes.io/projected/0b5652ae-2905-486a-897b-27c2b70bab5d-kube-api-access-cx76d\") pod \"keystone-bootstrap-hzcp9\" (UID: \"0b5652ae-2905-486a-897b-27c2b70bab5d\") " pod="openstack/keystone-bootstrap-hzcp9" Mar 20 08:45:09 crc kubenswrapper[4903]: I0320 08:45:09.239510 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0b5652ae-2905-486a-897b-27c2b70bab5d-fernet-keys\") pod \"keystone-bootstrap-hzcp9\" (UID: \"0b5652ae-2905-486a-897b-27c2b70bab5d\") " pod="openstack/keystone-bootstrap-hzcp9" Mar 20 08:45:09 crc kubenswrapper[4903]: I0320 08:45:09.239526 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b5652ae-2905-486a-897b-27c2b70bab5d-combined-ca-bundle\") pod \"keystone-bootstrap-hzcp9\" (UID: \"0b5652ae-2905-486a-897b-27c2b70bab5d\") " pod="openstack/keystone-bootstrap-hzcp9" Mar 20 08:45:09 crc kubenswrapper[4903]: I0320 08:45:09.240450 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d8c9a0bd-5bbd-4422-8162-b5f680ee1c7a-dns-svc\") pod \"dnsmasq-dns-847c4cc679-gcq77\" (UID: \"d8c9a0bd-5bbd-4422-8162-b5f680ee1c7a\") " pod="openstack/dnsmasq-dns-847c4cc679-gcq77" Mar 20 08:45:09 crc kubenswrapper[4903]: I0320 08:45:09.241060 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d8c9a0bd-5bbd-4422-8162-b5f680ee1c7a-dns-swift-storage-0\") pod \"dnsmasq-dns-847c4cc679-gcq77\" (UID: 
\"d8c9a0bd-5bbd-4422-8162-b5f680ee1c7a\") " pod="openstack/dnsmasq-dns-847c4cc679-gcq77" Mar 20 08:45:09 crc kubenswrapper[4903]: I0320 08:45:09.241320 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d8c9a0bd-5bbd-4422-8162-b5f680ee1c7a-ovsdbserver-sb\") pod \"dnsmasq-dns-847c4cc679-gcq77\" (UID: \"d8c9a0bd-5bbd-4422-8162-b5f680ee1c7a\") " pod="openstack/dnsmasq-dns-847c4cc679-gcq77" Mar 20 08:45:09 crc kubenswrapper[4903]: I0320 08:45:09.241676 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d8c9a0bd-5bbd-4422-8162-b5f680ee1c7a-config\") pod \"dnsmasq-dns-847c4cc679-gcq77\" (UID: \"d8c9a0bd-5bbd-4422-8162-b5f680ee1c7a\") " pod="openstack/dnsmasq-dns-847c4cc679-gcq77" Mar 20 08:45:09 crc kubenswrapper[4903]: I0320 08:45:09.241873 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d8c9a0bd-5bbd-4422-8162-b5f680ee1c7a-ovsdbserver-nb\") pod \"dnsmasq-dns-847c4cc679-gcq77\" (UID: \"d8c9a0bd-5bbd-4422-8162-b5f680ee1c7a\") " pod="openstack/dnsmasq-dns-847c4cc679-gcq77" Mar 20 08:45:09 crc kubenswrapper[4903]: I0320 08:45:09.248420 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-hzcp9"] Mar 20 08:45:09 crc kubenswrapper[4903]: I0320 08:45:09.292471 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xbnh8\" (UniqueName: \"kubernetes.io/projected/d8c9a0bd-5bbd-4422-8162-b5f680ee1c7a-kube-api-access-xbnh8\") pod \"dnsmasq-dns-847c4cc679-gcq77\" (UID: \"d8c9a0bd-5bbd-4422-8162-b5f680ee1c7a\") " pod="openstack/dnsmasq-dns-847c4cc679-gcq77" Mar 20 08:45:09 crc kubenswrapper[4903]: I0320 08:45:09.329438 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-f4rg2"] Mar 20 08:45:09 crc kubenswrapper[4903]: I0320 08:45:09.330452 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-f4rg2" Mar 20 08:45:09 crc kubenswrapper[4903]: I0320 08:45:09.341524 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0b5652ae-2905-486a-897b-27c2b70bab5d-scripts\") pod \"keystone-bootstrap-hzcp9\" (UID: \"0b5652ae-2905-486a-897b-27c2b70bab5d\") " pod="openstack/keystone-bootstrap-hzcp9" Mar 20 08:45:09 crc kubenswrapper[4903]: I0320 08:45:09.341629 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0b5652ae-2905-486a-897b-27c2b70bab5d-credential-keys\") pod \"keystone-bootstrap-hzcp9\" (UID: \"0b5652ae-2905-486a-897b-27c2b70bab5d\") " pod="openstack/keystone-bootstrap-hzcp9" Mar 20 08:45:09 crc kubenswrapper[4903]: I0320 08:45:09.341692 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b5652ae-2905-486a-897b-27c2b70bab5d-config-data\") pod \"keystone-bootstrap-hzcp9\" (UID: \"0b5652ae-2905-486a-897b-27c2b70bab5d\") " pod="openstack/keystone-bootstrap-hzcp9" Mar 20 08:45:09 crc kubenswrapper[4903]: I0320 08:45:09.341732 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cx76d\" (UniqueName: \"kubernetes.io/projected/0b5652ae-2905-486a-897b-27c2b70bab5d-kube-api-access-cx76d\") pod \"keystone-bootstrap-hzcp9\" (UID: \"0b5652ae-2905-486a-897b-27c2b70bab5d\") " pod="openstack/keystone-bootstrap-hzcp9" Mar 20 08:45:09 crc kubenswrapper[4903]: I0320 08:45:09.341760 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0b5652ae-2905-486a-897b-27c2b70bab5d-fernet-keys\") pod \"keystone-bootstrap-hzcp9\" (UID: \"0b5652ae-2905-486a-897b-27c2b70bab5d\") " pod="openstack/keystone-bootstrap-hzcp9" Mar 20 08:45:09 crc kubenswrapper[4903]: I0320 08:45:09.341779 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b5652ae-2905-486a-897b-27c2b70bab5d-combined-ca-bundle\") pod \"keystone-bootstrap-hzcp9\" (UID: \"0b5652ae-2905-486a-897b-27c2b70bab5d\") " pod="openstack/keystone-bootstrap-hzcp9" Mar 20 08:45:09 crc kubenswrapper[4903]: I0320 08:45:09.342254 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Mar 20 08:45:09 crc kubenswrapper[4903]: I0320 08:45:09.342657 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-fm82s" Mar 20 08:45:09 crc kubenswrapper[4903]: I0320 08:45:09.354284 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Mar 20 08:45:09 crc kubenswrapper[4903]: I0320 08:45:09.357595 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0b5652ae-2905-486a-897b-27c2b70bab5d-credential-keys\") pod \"keystone-bootstrap-hzcp9\" (UID: \"0b5652ae-2905-486a-897b-27c2b70bab5d\") " pod="openstack/keystone-bootstrap-hzcp9" Mar 20 08:45:09 crc kubenswrapper[4903]: I0320 08:45:09.360713 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b5652ae-2905-486a-897b-27c2b70bab5d-combined-ca-bundle\") pod \"keystone-bootstrap-hzcp9\" (UID: \"0b5652ae-2905-486a-897b-27c2b70bab5d\") " 
pod="openstack/keystone-bootstrap-hzcp9" Mar 20 08:45:09 crc kubenswrapper[4903]: I0320 08:45:09.361026 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0b5652ae-2905-486a-897b-27c2b70bab5d-scripts\") pod \"keystone-bootstrap-hzcp9\" (UID: \"0b5652ae-2905-486a-897b-27c2b70bab5d\") " pod="openstack/keystone-bootstrap-hzcp9" Mar 20 08:45:09 crc kubenswrapper[4903]: I0320 08:45:09.376907 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b5652ae-2905-486a-897b-27c2b70bab5d-config-data\") pod \"keystone-bootstrap-hzcp9\" (UID: \"0b5652ae-2905-486a-897b-27c2b70bab5d\") " pod="openstack/keystone-bootstrap-hzcp9" Mar 20 08:45:09 crc kubenswrapper[4903]: I0320 08:45:09.385026 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0b5652ae-2905-486a-897b-27c2b70bab5d-fernet-keys\") pod \"keystone-bootstrap-hzcp9\" (UID: \"0b5652ae-2905-486a-897b-27c2b70bab5d\") " pod="openstack/keystone-bootstrap-hzcp9" Mar 20 08:45:09 crc kubenswrapper[4903]: I0320 08:45:09.397543 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-f4rg2"] Mar 20 08:45:09 crc kubenswrapper[4903]: I0320 08:45:09.435391 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cx76d\" (UniqueName: \"kubernetes.io/projected/0b5652ae-2905-486a-897b-27c2b70bab5d-kube-api-access-cx76d\") pod \"keystone-bootstrap-hzcp9\" (UID: \"0b5652ae-2905-486a-897b-27c2b70bab5d\") " pod="openstack/keystone-bootstrap-hzcp9" Mar 20 08:45:09 crc kubenswrapper[4903]: I0320 08:45:09.435732 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 20 08:45:09 crc kubenswrapper[4903]: I0320 08:45:09.437782 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 08:45:09 crc kubenswrapper[4903]: I0320 08:45:09.443085 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6e210f8c-e29d-442c-a5eb-ec6b639b0275-db-sync-config-data\") pod \"cinder-db-sync-f4rg2\" (UID: \"6e210f8c-e29d-442c-a5eb-ec6b639b0275\") " pod="openstack/cinder-db-sync-f4rg2" Mar 20 08:45:09 crc kubenswrapper[4903]: I0320 08:45:09.443148 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e210f8c-e29d-442c-a5eb-ec6b639b0275-combined-ca-bundle\") pod \"cinder-db-sync-f4rg2\" (UID: \"6e210f8c-e29d-442c-a5eb-ec6b639b0275\") " pod="openstack/cinder-db-sync-f4rg2" Mar 20 08:45:09 crc kubenswrapper[4903]: I0320 08:45:09.443178 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kcqt7\" (UniqueName: \"kubernetes.io/projected/6e210f8c-e29d-442c-a5eb-ec6b639b0275-kube-api-access-kcqt7\") pod \"cinder-db-sync-f4rg2\" (UID: \"6e210f8c-e29d-442c-a5eb-ec6b639b0275\") " pod="openstack/cinder-db-sync-f4rg2" Mar 20 08:45:09 crc kubenswrapper[4903]: I0320 08:45:09.443227 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6e210f8c-e29d-442c-a5eb-ec6b639b0275-etc-machine-id\") pod \"cinder-db-sync-f4rg2\" (UID: \"6e210f8c-e29d-442c-a5eb-ec6b639b0275\") " pod="openstack/cinder-db-sync-f4rg2" Mar 20 08:45:09 crc kubenswrapper[4903]: I0320 08:45:09.443278 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6e210f8c-e29d-442c-a5eb-ec6b639b0275-scripts\") pod \"cinder-db-sync-f4rg2\" (UID: \"6e210f8c-e29d-442c-a5eb-ec6b639b0275\") " pod="openstack/cinder-db-sync-f4rg2" Mar 20 08:45:09 crc kubenswrapper[4903]: I0320 08:45:09.443310 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e210f8c-e29d-442c-a5eb-ec6b639b0275-config-data\") pod \"cinder-db-sync-f4rg2\" (UID: \"6e210f8c-e29d-442c-a5eb-ec6b639b0275\") " pod="openstack/cinder-db-sync-f4rg2" Mar 20 08:45:09 crc kubenswrapper[4903]: I0320 08:45:09.444700 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 20 08:45:09 crc kubenswrapper[4903]: I0320 08:45:09.455696 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-gcq77" Mar 20 08:45:09 crc kubenswrapper[4903]: I0320 08:45:09.456357 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-hzcp9" Mar 20 08:45:09 crc kubenswrapper[4903]: I0320 08:45:09.457176 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 20 08:45:09 crc kubenswrapper[4903]: I0320 08:45:09.554695 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/003c0ace-6aef-4bc2-bc02-358cf140d4ce-run-httpd\") pod \"ceilometer-0\" (UID: \"003c0ace-6aef-4bc2-bc02-358cf140d4ce\") " pod="openstack/ceilometer-0" Mar 20 08:45:09 crc kubenswrapper[4903]: I0320 08:45:09.555251 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/003c0ace-6aef-4bc2-bc02-358cf140d4ce-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"003c0ace-6aef-4bc2-bc02-358cf140d4ce\") " pod="openstack/ceilometer-0" Mar 20 08:45:09 crc kubenswrapper[4903]: I0320 08:45:09.555278 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/003c0ace-6aef-4bc2-bc02-358cf140d4ce-log-httpd\") pod \"ceilometer-0\" (UID: \"003c0ace-6aef-4bc2-bc02-358cf140d4ce\") " pod="openstack/ceilometer-0" Mar 20 08:45:09 crc kubenswrapper[4903]: I0320 08:45:09.555340 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-djzj6\" (UniqueName: \"kubernetes.io/projected/003c0ace-6aef-4bc2-bc02-358cf140d4ce-kube-api-access-djzj6\") pod \"ceilometer-0\" (UID: \"003c0ace-6aef-4bc2-bc02-358cf140d4ce\") " pod="openstack/ceilometer-0" Mar 20 08:45:09 crc kubenswrapper[4903]: I0320 08:45:09.555451 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6e210f8c-e29d-442c-a5eb-ec6b639b0275-scripts\") pod \"cinder-db-sync-f4rg2\" (UID: \"6e210f8c-e29d-442c-a5eb-ec6b639b0275\") " pod="openstack/cinder-db-sync-f4rg2" Mar 20 08:45:09 crc kubenswrapper[4903]: I0320 08:45:09.555562 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/003c0ace-6aef-4bc2-bc02-358cf140d4ce-scripts\") pod \"ceilometer-0\" (UID: \"003c0ace-6aef-4bc2-bc02-358cf140d4ce\") " pod="openstack/ceilometer-0" Mar 20 08:45:09 crc kubenswrapper[4903]: I0320 08:45:09.555606 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e210f8c-e29d-442c-a5eb-ec6b639b0275-config-data\") pod \"cinder-db-sync-f4rg2\" (UID: \"6e210f8c-e29d-442c-a5eb-ec6b639b0275\") " pod="openstack/cinder-db-sync-f4rg2" Mar 20 08:45:09 crc kubenswrapper[4903]: I0320 08:45:09.555685 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6e210f8c-e29d-442c-a5eb-ec6b639b0275-db-sync-config-data\") pod \"cinder-db-sync-f4rg2\" (UID: \"6e210f8c-e29d-442c-a5eb-ec6b639b0275\") " pod="openstack/cinder-db-sync-f4rg2" Mar 20 08:45:09 crc kubenswrapper[4903]: I0320 08:45:09.555756 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e210f8c-e29d-442c-a5eb-ec6b639b0275-combined-ca-bundle\") pod \"cinder-db-sync-f4rg2\" (UID: \"6e210f8c-e29d-442c-a5eb-ec6b639b0275\") " 
pod="openstack/cinder-db-sync-f4rg2" Mar 20 08:45:09 crc kubenswrapper[4903]: I0320 08:45:09.555785 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/003c0ace-6aef-4bc2-bc02-358cf140d4ce-config-data\") pod \"ceilometer-0\" (UID: \"003c0ace-6aef-4bc2-bc02-358cf140d4ce\") " pod="openstack/ceilometer-0" Mar 20 08:45:09 crc kubenswrapper[4903]: I0320 08:45:09.555821 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kcqt7\" (UniqueName: \"kubernetes.io/projected/6e210f8c-e29d-442c-a5eb-ec6b639b0275-kube-api-access-kcqt7\") pod \"cinder-db-sync-f4rg2\" (UID: \"6e210f8c-e29d-442c-a5eb-ec6b639b0275\") " pod="openstack/cinder-db-sync-f4rg2" Mar 20 08:45:09 crc kubenswrapper[4903]: I0320 08:45:09.555943 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/003c0ace-6aef-4bc2-bc02-358cf140d4ce-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"003c0ace-6aef-4bc2-bc02-358cf140d4ce\") " pod="openstack/ceilometer-0" Mar 20 08:45:09 crc kubenswrapper[4903]: I0320 08:45:09.555987 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6e210f8c-e29d-442c-a5eb-ec6b639b0275-etc-machine-id\") pod \"cinder-db-sync-f4rg2\" (UID: \"6e210f8c-e29d-442c-a5eb-ec6b639b0275\") " pod="openstack/cinder-db-sync-f4rg2" Mar 20 08:45:09 crc kubenswrapper[4903]: I0320 08:45:09.556127 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6e210f8c-e29d-442c-a5eb-ec6b639b0275-etc-machine-id\") pod \"cinder-db-sync-f4rg2\" (UID: \"6e210f8c-e29d-442c-a5eb-ec6b639b0275\") " pod="openstack/cinder-db-sync-f4rg2" Mar 20 08:45:09 crc kubenswrapper[4903]: I0320 08:45:09.571208 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6e210f8c-e29d-442c-a5eb-ec6b639b0275-db-sync-config-data\") pod \"cinder-db-sync-f4rg2\" (UID: \"6e210f8c-e29d-442c-a5eb-ec6b639b0275\") " pod="openstack/cinder-db-sync-f4rg2" Mar 20 08:45:09 crc kubenswrapper[4903]: I0320 08:45:09.572229 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e210f8c-e29d-442c-a5eb-ec6b639b0275-combined-ca-bundle\") pod \"cinder-db-sync-f4rg2\" (UID: \"6e210f8c-e29d-442c-a5eb-ec6b639b0275\") " pod="openstack/cinder-db-sync-f4rg2" Mar 20 08:45:09 crc kubenswrapper[4903]: I0320 08:45:09.575892 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 08:45:09 crc kubenswrapper[4903]: I0320 08:45:09.576015 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-xws82"] Mar 20 08:45:09 crc kubenswrapper[4903]: I0320 08:45:09.577054 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-gcq77"] Mar 20 08:45:09 crc kubenswrapper[4903]: I0320 08:45:09.578223 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-xws82"] Mar 20 08:45:09 crc kubenswrapper[4903]: I0320 08:45:09.578297 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-d8wd9"] Mar 20 08:45:09 crc kubenswrapper[4903]: I0320 08:45:09.577601 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-xws82" Mar 20 08:45:09 crc kubenswrapper[4903]: I0320 08:45:09.579166 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-d8wd9" Mar 20 08:45:09 crc kubenswrapper[4903]: I0320 08:45:09.585271 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Mar 20 08:45:09 crc kubenswrapper[4903]: I0320 08:45:09.587339 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-d8wd9"] Mar 20 08:45:09 crc kubenswrapper[4903]: I0320 08:45:09.588287 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kcqt7\" (UniqueName: \"kubernetes.io/projected/6e210f8c-e29d-442c-a5eb-ec6b639b0275-kube-api-access-kcqt7\") pod \"cinder-db-sync-f4rg2\" (UID: \"6e210f8c-e29d-442c-a5eb-ec6b639b0275\") " pod="openstack/cinder-db-sync-f4rg2" Mar 20 08:45:09 crc kubenswrapper[4903]: I0320 08:45:09.588964 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Mar 20 08:45:09 crc kubenswrapper[4903]: I0320 08:45:09.589193 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Mar 20 08:45:09 crc kubenswrapper[4903]: I0320 08:45:09.589477 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-qpklm" Mar 20 08:45:09 crc kubenswrapper[4903]: I0320 08:45:09.589507 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-pq9qx" Mar 20 08:45:09 crc kubenswrapper[4903]: I0320 08:45:09.590592 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e210f8c-e29d-442c-a5eb-ec6b639b0275-config-data\") pod \"cinder-db-sync-f4rg2\" (UID: \"6e210f8c-e29d-442c-a5eb-ec6b639b0275\") " pod="openstack/cinder-db-sync-f4rg2" Mar 20 08:45:09 crc kubenswrapper[4903]: I0320 08:45:09.600580 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-nl9ms"] Mar 20 08:45:09 crc kubenswrapper[4903]: I0320 08:45:09.602405 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-nl9ms" Mar 20 08:45:09 crc kubenswrapper[4903]: I0320 08:45:09.600669 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6e210f8c-e29d-442c-a5eb-ec6b639b0275-scripts\") pod \"cinder-db-sync-f4rg2\" (UID: \"6e210f8c-e29d-442c-a5eb-ec6b639b0275\") " pod="openstack/cinder-db-sync-f4rg2" Mar 20 08:45:09 crc kubenswrapper[4903]: I0320 08:45:09.605393 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-ntkfl"] Mar 20 08:45:09 crc kubenswrapper[4903]: I0320 08:45:09.606809 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-ntkfl" Mar 20 08:45:09 crc kubenswrapper[4903]: I0320 08:45:09.632586 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Mar 20 08:45:09 crc kubenswrapper[4903]: I0320 08:45:09.633013 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Mar 20 08:45:09 crc kubenswrapper[4903]: I0320 08:45:09.633510 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-lbggm" Mar 20 08:45:09 crc kubenswrapper[4903]: I0320 08:45:09.655930 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-nl9ms"] Mar 20 08:45:09 crc kubenswrapper[4903]: I0320 08:45:09.656967 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7mxlf\" (UniqueName: \"kubernetes.io/projected/cddb5fee-92f5-463f-a746-8a58e0a05e4b-kube-api-access-7mxlf\") pod \"placement-db-sync-ntkfl\" (UID: \"cddb5fee-92f5-463f-a746-8a58e0a05e4b\") " pod="openstack/placement-db-sync-ntkfl" Mar 20 08:45:09 crc kubenswrapper[4903]: I0320 08:45:09.657001 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v9kmk\" (UniqueName: \"kubernetes.io/projected/b95e8341-9f65-461f-891b-ec6512be57f7-kube-api-access-v9kmk\") pod \"dnsmasq-dns-785d8bcb8c-nl9ms\" (UID: \"b95e8341-9f65-461f-891b-ec6512be57f7\") " pod="openstack/dnsmasq-dns-785d8bcb8c-nl9ms" Mar 20 08:45:09 crc kubenswrapper[4903]: I0320 08:45:09.657019 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mwscl\" (UniqueName: \"kubernetes.io/projected/be07fcc9-d6b5-4551-8846-94aa14b6af5d-kube-api-access-mwscl\") pod \"barbican-db-sync-xws82\" (UID: \"be07fcc9-d6b5-4551-8846-94aa14b6af5d\") " pod="openstack/barbican-db-sync-xws82" Mar 20 08:45:09 crc kubenswrapper[4903]: I0320 08:45:09.657075 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/003c0ace-6aef-4bc2-bc02-358cf140d4ce-scripts\") pod \"ceilometer-0\" (UID: \"003c0ace-6aef-4bc2-bc02-358cf140d4ce\") " pod="openstack/ceilometer-0" Mar 20 08:45:09 crc kubenswrapper[4903]: I0320 08:45:09.657099 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b95e8341-9f65-461f-891b-ec6512be57f7-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-nl9ms\" (UID: \"b95e8341-9f65-461f-891b-ec6512be57f7\") " pod="openstack/dnsmasq-dns-785d8bcb8c-nl9ms" Mar 20 08:45:09 crc kubenswrapper[4903]: I0320 08:45:09.657132 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b95e8341-9f65-461f-891b-ec6512be57f7-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-nl9ms\" (UID: \"b95e8341-9f65-461f-891b-ec6512be57f7\") " pod="openstack/dnsmasq-dns-785d8bcb8c-nl9ms" Mar 20 08:45:09 crc kubenswrapper[4903]: I0320 08:45:09.657149 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b95e8341-9f65-461f-891b-ec6512be57f7-config\") pod \"dnsmasq-dns-785d8bcb8c-nl9ms\" (UID: \"b95e8341-9f65-461f-891b-ec6512be57f7\") " pod="openstack/dnsmasq-dns-785d8bcb8c-nl9ms" Mar 20 08:45:09 crc 
kubenswrapper[4903]: I0320 08:45:09.657175 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cddb5fee-92f5-463f-a746-8a58e0a05e4b-scripts\") pod \"placement-db-sync-ntkfl\" (UID: \"cddb5fee-92f5-463f-a746-8a58e0a05e4b\") " pod="openstack/placement-db-sync-ntkfl" Mar 20 08:45:09 crc kubenswrapper[4903]: I0320 08:45:09.657194 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e64f54bd-2813-41dd-86e6-9836da200d1c-combined-ca-bundle\") pod \"neutron-db-sync-d8wd9\" (UID: \"e64f54bd-2813-41dd-86e6-9836da200d1c\") " pod="openstack/neutron-db-sync-d8wd9" Mar 20 08:45:09 crc kubenswrapper[4903]: I0320 08:45:09.657218 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/003c0ace-6aef-4bc2-bc02-358cf140d4ce-config-data\") pod \"ceilometer-0\" (UID: \"003c0ace-6aef-4bc2-bc02-358cf140d4ce\") " pod="openstack/ceilometer-0" Mar 20 08:45:09 crc kubenswrapper[4903]: I0320 08:45:09.657234 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cddb5fee-92f5-463f-a746-8a58e0a05e4b-config-data\") pod \"placement-db-sync-ntkfl\" (UID: \"cddb5fee-92f5-463f-a746-8a58e0a05e4b\") " pod="openstack/placement-db-sync-ntkfl" Mar 20 08:45:09 crc kubenswrapper[4903]: I0320 08:45:09.657270 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cddb5fee-92f5-463f-a746-8a58e0a05e4b-combined-ca-bundle\") pod \"placement-db-sync-ntkfl\" (UID: \"cddb5fee-92f5-463f-a746-8a58e0a05e4b\") " pod="openstack/placement-db-sync-ntkfl" Mar 20 08:45:09 crc kubenswrapper[4903]: I0320 08:45:09.657291 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/003c0ace-6aef-4bc2-bc02-358cf140d4ce-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"003c0ace-6aef-4bc2-bc02-358cf140d4ce\") " pod="openstack/ceilometer-0" Mar 20 08:45:09 crc kubenswrapper[4903]: I0320 08:45:09.657310 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/003c0ace-6aef-4bc2-bc02-358cf140d4ce-run-httpd\") pod \"ceilometer-0\" (UID: \"003c0ace-6aef-4bc2-bc02-358cf140d4ce\") " pod="openstack/ceilometer-0" Mar 20 08:45:09 crc kubenswrapper[4903]: I0320 08:45:09.657325 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/003c0ace-6aef-4bc2-bc02-358cf140d4ce-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"003c0ace-6aef-4bc2-bc02-358cf140d4ce\") " pod="openstack/ceilometer-0" Mar 20 08:45:09 crc kubenswrapper[4903]: I0320 08:45:09.657343 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/003c0ace-6aef-4bc2-bc02-358cf140d4ce-log-httpd\") pod \"ceilometer-0\" (UID: \"003c0ace-6aef-4bc2-bc02-358cf140d4ce\") " pod="openstack/ceilometer-0" Mar 20 08:45:09 crc kubenswrapper[4903]: I0320 08:45:09.657362 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lrvk2\" (UniqueName: 
\"kubernetes.io/projected/e64f54bd-2813-41dd-86e6-9836da200d1c-kube-api-access-lrvk2\") pod \"neutron-db-sync-d8wd9\" (UID: \"e64f54bd-2813-41dd-86e6-9836da200d1c\") " pod="openstack/neutron-db-sync-d8wd9" Mar 20 08:45:09 crc kubenswrapper[4903]: I0320 08:45:09.657382 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cddb5fee-92f5-463f-a746-8a58e0a05e4b-logs\") pod \"placement-db-sync-ntkfl\" (UID: \"cddb5fee-92f5-463f-a746-8a58e0a05e4b\") " pod="openstack/placement-db-sync-ntkfl" Mar 20 08:45:09 crc kubenswrapper[4903]: I0320 08:45:09.657397 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be07fcc9-d6b5-4551-8846-94aa14b6af5d-combined-ca-bundle\") pod \"barbican-db-sync-xws82\" (UID: \"be07fcc9-d6b5-4551-8846-94aa14b6af5d\") " pod="openstack/barbican-db-sync-xws82" Mar 20 08:45:09 crc kubenswrapper[4903]: I0320 08:45:09.657415 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b95e8341-9f65-461f-891b-ec6512be57f7-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-nl9ms\" (UID: \"b95e8341-9f65-461f-891b-ec6512be57f7\") " pod="openstack/dnsmasq-dns-785d8bcb8c-nl9ms" Mar 20 08:45:09 crc kubenswrapper[4903]: I0320 08:45:09.657434 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e64f54bd-2813-41dd-86e6-9836da200d1c-config\") pod \"neutron-db-sync-d8wd9\" (UID: \"e64f54bd-2813-41dd-86e6-9836da200d1c\") " pod="openstack/neutron-db-sync-d8wd9" Mar 20 08:45:09 crc kubenswrapper[4903]: I0320 08:45:09.657451 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-djzj6\" (UniqueName: \"kubernetes.io/projected/003c0ace-6aef-4bc2-bc02-358cf140d4ce-kube-api-access-djzj6\") pod \"ceilometer-0\" (UID: \"003c0ace-6aef-4bc2-bc02-358cf140d4ce\") " pod="openstack/ceilometer-0" Mar 20 08:45:09 crc kubenswrapper[4903]: I0320 08:45:09.657468 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/be07fcc9-d6b5-4551-8846-94aa14b6af5d-db-sync-config-data\") pod \"barbican-db-sync-xws82\" (UID: \"be07fcc9-d6b5-4551-8846-94aa14b6af5d\") " pod="openstack/barbican-db-sync-xws82" Mar 20 08:45:09 crc kubenswrapper[4903]: I0320 08:45:09.657487 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b95e8341-9f65-461f-891b-ec6512be57f7-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-nl9ms\" (UID: \"b95e8341-9f65-461f-891b-ec6512be57f7\") " pod="openstack/dnsmasq-dns-785d8bcb8c-nl9ms" Mar 20 08:45:09 crc kubenswrapper[4903]: I0320 08:45:09.658668 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/003c0ace-6aef-4bc2-bc02-358cf140d4ce-run-httpd\") pod \"ceilometer-0\" (UID: \"003c0ace-6aef-4bc2-bc02-358cf140d4ce\") " pod="openstack/ceilometer-0" Mar 20 08:45:09 crc kubenswrapper[4903]: I0320 08:45:09.660814 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/003c0ace-6aef-4bc2-bc02-358cf140d4ce-log-httpd\") pod 
\"ceilometer-0\" (UID: \"003c0ace-6aef-4bc2-bc02-358cf140d4ce\") " pod="openstack/ceilometer-0" Mar 20 08:45:09 crc kubenswrapper[4903]: I0320 08:45:09.661536 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/003c0ace-6aef-4bc2-bc02-358cf140d4ce-scripts\") pod \"ceilometer-0\" (UID: \"003c0ace-6aef-4bc2-bc02-358cf140d4ce\") " pod="openstack/ceilometer-0" Mar 20 08:45:09 crc kubenswrapper[4903]: I0320 08:45:09.663899 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/003c0ace-6aef-4bc2-bc02-358cf140d4ce-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"003c0ace-6aef-4bc2-bc02-358cf140d4ce\") " pod="openstack/ceilometer-0" Mar 20 08:45:09 crc kubenswrapper[4903]: I0320 08:45:09.663952 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/003c0ace-6aef-4bc2-bc02-358cf140d4ce-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"003c0ace-6aef-4bc2-bc02-358cf140d4ce\") " pod="openstack/ceilometer-0" Mar 20 08:45:09 crc kubenswrapper[4903]: I0320 08:45:09.682545 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/003c0ace-6aef-4bc2-bc02-358cf140d4ce-config-data\") pod \"ceilometer-0\" (UID: \"003c0ace-6aef-4bc2-bc02-358cf140d4ce\") " pod="openstack/ceilometer-0" Mar 20 08:45:09 crc kubenswrapper[4903]: I0320 08:45:09.695865 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-ntkfl"] Mar 20 08:45:09 crc kubenswrapper[4903]: I0320 08:45:09.711543 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-djzj6\" (UniqueName: \"kubernetes.io/projected/003c0ace-6aef-4bc2-bc02-358cf140d4ce-kube-api-access-djzj6\") pod \"ceilometer-0\" (UID: \"003c0ace-6aef-4bc2-bc02-358cf140d4ce\") " pod="openstack/ceilometer-0" Mar 20 08:45:09 crc kubenswrapper[4903]: I0320 08:45:09.762016 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cddb5fee-92f5-463f-a746-8a58e0a05e4b-combined-ca-bundle\") pod \"placement-db-sync-ntkfl\" (UID: \"cddb5fee-92f5-463f-a746-8a58e0a05e4b\") " pod="openstack/placement-db-sync-ntkfl" Mar 20 08:45:09 crc kubenswrapper[4903]: I0320 08:45:09.762244 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lrvk2\" (UniqueName: \"kubernetes.io/projected/e64f54bd-2813-41dd-86e6-9836da200d1c-kube-api-access-lrvk2\") pod \"neutron-db-sync-d8wd9\" (UID: \"e64f54bd-2813-41dd-86e6-9836da200d1c\") " pod="openstack/neutron-db-sync-d8wd9" Mar 20 08:45:09 crc kubenswrapper[4903]: I0320 08:45:09.762322 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cddb5fee-92f5-463f-a746-8a58e0a05e4b-logs\") pod \"placement-db-sync-ntkfl\" (UID: \"cddb5fee-92f5-463f-a746-8a58e0a05e4b\") " pod="openstack/placement-db-sync-ntkfl" Mar 20 08:45:09 crc kubenswrapper[4903]: I0320 08:45:09.762393 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be07fcc9-d6b5-4551-8846-94aa14b6af5d-combined-ca-bundle\") pod \"barbican-db-sync-xws82\" (UID: \"be07fcc9-d6b5-4551-8846-94aa14b6af5d\") " pod="openstack/barbican-db-sync-xws82" Mar 20 08:45:09 crc kubenswrapper[4903]: 
I0320 08:45:09.762458 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b95e8341-9f65-461f-891b-ec6512be57f7-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-nl9ms\" (UID: \"b95e8341-9f65-461f-891b-ec6512be57f7\") " pod="openstack/dnsmasq-dns-785d8bcb8c-nl9ms" Mar 20 08:45:09 crc kubenswrapper[4903]: I0320 08:45:09.762543 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e64f54bd-2813-41dd-86e6-9836da200d1c-config\") pod \"neutron-db-sync-d8wd9\" (UID: \"e64f54bd-2813-41dd-86e6-9836da200d1c\") " pod="openstack/neutron-db-sync-d8wd9" Mar 20 08:45:09 crc kubenswrapper[4903]: I0320 08:45:09.764709 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b95e8341-9f65-461f-891b-ec6512be57f7-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-nl9ms\" (UID: \"b95e8341-9f65-461f-891b-ec6512be57f7\") " pod="openstack/dnsmasq-dns-785d8bcb8c-nl9ms" Mar 20 08:45:09 crc kubenswrapper[4903]: I0320 08:45:09.765793 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/be07fcc9-d6b5-4551-8846-94aa14b6af5d-db-sync-config-data\") pod \"barbican-db-sync-xws82\" (UID: \"be07fcc9-d6b5-4551-8846-94aa14b6af5d\") " pod="openstack/barbican-db-sync-xws82" Mar 20 08:45:09 crc kubenswrapper[4903]: I0320 08:45:09.765879 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b95e8341-9f65-461f-891b-ec6512be57f7-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-nl9ms\" (UID: \"b95e8341-9f65-461f-891b-ec6512be57f7\") " pod="openstack/dnsmasq-dns-785d8bcb8c-nl9ms" Mar 20 08:45:09 crc kubenswrapper[4903]: I0320 08:45:09.766015 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7mxlf\" (UniqueName: \"kubernetes.io/projected/cddb5fee-92f5-463f-a746-8a58e0a05e4b-kube-api-access-7mxlf\") pod \"placement-db-sync-ntkfl\" (UID: \"cddb5fee-92f5-463f-a746-8a58e0a05e4b\") " pod="openstack/placement-db-sync-ntkfl" Mar 20 08:45:09 crc kubenswrapper[4903]: I0320 08:45:09.768026 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mwscl\" (UniqueName: \"kubernetes.io/projected/be07fcc9-d6b5-4551-8846-94aa14b6af5d-kube-api-access-mwscl\") pod \"barbican-db-sync-xws82\" (UID: \"be07fcc9-d6b5-4551-8846-94aa14b6af5d\") " pod="openstack/barbican-db-sync-xws82" Mar 20 08:45:09 crc kubenswrapper[4903]: I0320 08:45:09.768098 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v9kmk\" (UniqueName: \"kubernetes.io/projected/b95e8341-9f65-461f-891b-ec6512be57f7-kube-api-access-v9kmk\") pod \"dnsmasq-dns-785d8bcb8c-nl9ms\" (UID: \"b95e8341-9f65-461f-891b-ec6512be57f7\") " pod="openstack/dnsmasq-dns-785d8bcb8c-nl9ms" Mar 20 08:45:09 crc kubenswrapper[4903]: I0320 08:45:09.768162 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b95e8341-9f65-461f-891b-ec6512be57f7-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-nl9ms\" (UID: \"b95e8341-9f65-461f-891b-ec6512be57f7\") " pod="openstack/dnsmasq-dns-785d8bcb8c-nl9ms" Mar 20 08:45:09 crc kubenswrapper[4903]: I0320 08:45:09.768278 4903 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b95e8341-9f65-461f-891b-ec6512be57f7-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-nl9ms\" (UID: \"b95e8341-9f65-461f-891b-ec6512be57f7\") " pod="openstack/dnsmasq-dns-785d8bcb8c-nl9ms" Mar 20 08:45:09 crc kubenswrapper[4903]: I0320 08:45:09.768334 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b95e8341-9f65-461f-891b-ec6512be57f7-config\") pod \"dnsmasq-dns-785d8bcb8c-nl9ms\" (UID: \"b95e8341-9f65-461f-891b-ec6512be57f7\") " pod="openstack/dnsmasq-dns-785d8bcb8c-nl9ms" Mar 20 08:45:09 crc kubenswrapper[4903]: I0320 08:45:09.768501 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cddb5fee-92f5-463f-a746-8a58e0a05e4b-scripts\") pod \"placement-db-sync-ntkfl\" (UID: \"cddb5fee-92f5-463f-a746-8a58e0a05e4b\") " pod="openstack/placement-db-sync-ntkfl" Mar 20 08:45:09 crc kubenswrapper[4903]: I0320 08:45:09.768608 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e64f54bd-2813-41dd-86e6-9836da200d1c-combined-ca-bundle\") pod \"neutron-db-sync-d8wd9\" (UID: \"e64f54bd-2813-41dd-86e6-9836da200d1c\") " pod="openstack/neutron-db-sync-d8wd9" Mar 20 08:45:09 crc kubenswrapper[4903]: I0320 08:45:09.768794 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cddb5fee-92f5-463f-a746-8a58e0a05e4b-config-data\") pod \"placement-db-sync-ntkfl\" (UID: \"cddb5fee-92f5-463f-a746-8a58e0a05e4b\") " pod="openstack/placement-db-sync-ntkfl" Mar 20 08:45:09 crc kubenswrapper[4903]: I0320 08:45:09.769552 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b95e8341-9f65-461f-891b-ec6512be57f7-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-nl9ms\" (UID: \"b95e8341-9f65-461f-891b-ec6512be57f7\") " pod="openstack/dnsmasq-dns-785d8bcb8c-nl9ms" Mar 20 08:45:09 crc kubenswrapper[4903]: I0320 08:45:09.769934 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cddb5fee-92f5-463f-a746-8a58e0a05e4b-logs\") pod \"placement-db-sync-ntkfl\" (UID: \"cddb5fee-92f5-463f-a746-8a58e0a05e4b\") " pod="openstack/placement-db-sync-ntkfl" Mar 20 08:45:09 crc kubenswrapper[4903]: I0320 08:45:09.770949 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b95e8341-9f65-461f-891b-ec6512be57f7-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-nl9ms\" (UID: \"b95e8341-9f65-461f-891b-ec6512be57f7\") " pod="openstack/dnsmasq-dns-785d8bcb8c-nl9ms" Mar 20 08:45:09 crc kubenswrapper[4903]: I0320 08:45:09.772669 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b95e8341-9f65-461f-891b-ec6512be57f7-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-nl9ms\" (UID: \"b95e8341-9f65-461f-891b-ec6512be57f7\") " pod="openstack/dnsmasq-dns-785d8bcb8c-nl9ms" Mar 20 08:45:09 crc kubenswrapper[4903]: I0320 08:45:09.779934 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/e64f54bd-2813-41dd-86e6-9836da200d1c-config\") pod \"neutron-db-sync-d8wd9\" (UID: \"e64f54bd-2813-41dd-86e6-9836da200d1c\") " 
pod="openstack/neutron-db-sync-d8wd9" Mar 20 08:45:09 crc kubenswrapper[4903]: I0320 08:45:09.781579 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e64f54bd-2813-41dd-86e6-9836da200d1c-combined-ca-bundle\") pod \"neutron-db-sync-d8wd9\" (UID: \"e64f54bd-2813-41dd-86e6-9836da200d1c\") " pod="openstack/neutron-db-sync-d8wd9" Mar 20 08:45:09 crc kubenswrapper[4903]: I0320 08:45:09.783512 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b95e8341-9f65-461f-891b-ec6512be57f7-config\") pod \"dnsmasq-dns-785d8bcb8c-nl9ms\" (UID: \"b95e8341-9f65-461f-891b-ec6512be57f7\") " pod="openstack/dnsmasq-dns-785d8bcb8c-nl9ms" Mar 20 08:45:09 crc kubenswrapper[4903]: I0320 08:45:09.784074 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-f4rg2" Mar 20 08:45:09 crc kubenswrapper[4903]: I0320 08:45:09.792866 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/be07fcc9-d6b5-4551-8846-94aa14b6af5d-db-sync-config-data\") pod \"barbican-db-sync-xws82\" (UID: \"be07fcc9-d6b5-4551-8846-94aa14b6af5d\") " pod="openstack/barbican-db-sync-xws82" Mar 20 08:45:09 crc kubenswrapper[4903]: I0320 08:45:09.807870 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 08:45:09 crc kubenswrapper[4903]: I0320 08:45:09.821673 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cddb5fee-92f5-463f-a746-8a58e0a05e4b-scripts\") pod \"placement-db-sync-ntkfl\" (UID: \"cddb5fee-92f5-463f-a746-8a58e0a05e4b\") " pod="openstack/placement-db-sync-ntkfl" Mar 20 08:45:09 crc kubenswrapper[4903]: I0320 08:45:09.823260 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cddb5fee-92f5-463f-a746-8a58e0a05e4b-combined-ca-bundle\") pod \"placement-db-sync-ntkfl\" (UID: \"cddb5fee-92f5-463f-a746-8a58e0a05e4b\") " pod="openstack/placement-db-sync-ntkfl" Mar 20 08:45:09 crc kubenswrapper[4903]: I0320 08:45:09.824731 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be07fcc9-d6b5-4551-8846-94aa14b6af5d-combined-ca-bundle\") pod \"barbican-db-sync-xws82\" (UID: \"be07fcc9-d6b5-4551-8846-94aa14b6af5d\") " pod="openstack/barbican-db-sync-xws82" Mar 20 08:45:09 crc kubenswrapper[4903]: I0320 08:45:09.827752 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lrvk2\" (UniqueName: \"kubernetes.io/projected/e64f54bd-2813-41dd-86e6-9836da200d1c-kube-api-access-lrvk2\") pod \"neutron-db-sync-d8wd9\" (UID: \"e64f54bd-2813-41dd-86e6-9836da200d1c\") " pod="openstack/neutron-db-sync-d8wd9" Mar 20 08:45:09 crc kubenswrapper[4903]: I0320 08:45:09.840590 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7mxlf\" (UniqueName: \"kubernetes.io/projected/cddb5fee-92f5-463f-a746-8a58e0a05e4b-kube-api-access-7mxlf\") pod \"placement-db-sync-ntkfl\" (UID: \"cddb5fee-92f5-463f-a746-8a58e0a05e4b\") " pod="openstack/placement-db-sync-ntkfl" Mar 20 08:45:09 crc kubenswrapper[4903]: I0320 08:45:09.844677 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/cddb5fee-92f5-463f-a746-8a58e0a05e4b-config-data\") pod \"placement-db-sync-ntkfl\" (UID: \"cddb5fee-92f5-463f-a746-8a58e0a05e4b\") " pod="openstack/placement-db-sync-ntkfl" Mar 20 08:45:09 crc kubenswrapper[4903]: I0320 08:45:09.846223 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v9kmk\" (UniqueName: \"kubernetes.io/projected/b95e8341-9f65-461f-891b-ec6512be57f7-kube-api-access-v9kmk\") pod \"dnsmasq-dns-785d8bcb8c-nl9ms\" (UID: \"b95e8341-9f65-461f-891b-ec6512be57f7\") " pod="openstack/dnsmasq-dns-785d8bcb8c-nl9ms" Mar 20 08:45:09 crc kubenswrapper[4903]: I0320 08:45:09.850606 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mwscl\" (UniqueName: \"kubernetes.io/projected/be07fcc9-d6b5-4551-8846-94aa14b6af5d-kube-api-access-mwscl\") pod \"barbican-db-sync-xws82\" (UID: \"be07fcc9-d6b5-4551-8846-94aa14b6af5d\") " pod="openstack/barbican-db-sync-xws82" Mar 20 08:45:09 crc kubenswrapper[4903]: I0320 08:45:09.883111 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-ntkfl" Mar 20 08:45:09 crc kubenswrapper[4903]: I0320 08:45:09.993743 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-xws82" Mar 20 08:45:10 crc kubenswrapper[4903]: I0320 08:45:10.018247 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-d8wd9" Mar 20 08:45:10 crc kubenswrapper[4903]: I0320 08:45:10.126238 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-nl9ms" Mar 20 08:45:10 crc kubenswrapper[4903]: I0320 08:45:10.226435 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 08:45:10 crc kubenswrapper[4903]: I0320 08:45:10.228817 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 20 08:45:10 crc kubenswrapper[4903]: I0320 08:45:10.245237 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-4d2kn" Mar 20 08:45:10 crc kubenswrapper[4903]: I0320 08:45:10.245643 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Mar 20 08:45:10 crc kubenswrapper[4903]: I0320 08:45:10.245663 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 20 08:45:10 crc kubenswrapper[4903]: I0320 08:45:10.246827 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 20 08:45:10 crc kubenswrapper[4903]: I0320 08:45:10.272659 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 08:45:10 crc kubenswrapper[4903]: I0320 08:45:10.307652 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-hzcp9"] Mar 20 08:45:10 crc kubenswrapper[4903]: I0320 08:45:10.350917 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 08:45:10 crc kubenswrapper[4903]: I0320 08:45:10.354270 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 20 08:45:10 crc kubenswrapper[4903]: I0320 08:45:10.359173 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 20 08:45:10 crc kubenswrapper[4903]: I0320 08:45:10.359369 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 20 08:45:10 crc kubenswrapper[4903]: I0320 08:45:10.362443 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 08:45:10 crc kubenswrapper[4903]: I0320 08:45:10.445659 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/332e29b8-d365-4ff1-95df-53eaf60e156c-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"332e29b8-d365-4ff1-95df-53eaf60e156c\") " pod="openstack/glance-default-internal-api-0" Mar 20 08:45:10 crc kubenswrapper[4903]: I0320 08:45:10.445728 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/332e29b8-d365-4ff1-95df-53eaf60e156c-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"332e29b8-d365-4ff1-95df-53eaf60e156c\") " pod="openstack/glance-default-internal-api-0" Mar 20 08:45:10 crc kubenswrapper[4903]: I0320 08:45:10.445755 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mfgll\" (UniqueName: \"kubernetes.io/projected/332e29b8-d365-4ff1-95df-53eaf60e156c-kube-api-access-mfgll\") pod \"glance-default-internal-api-0\" (UID: \"332e29b8-d365-4ff1-95df-53eaf60e156c\") " pod="openstack/glance-default-internal-api-0" Mar 20 08:45:10 crc kubenswrapper[4903]: I0320 08:45:10.445774 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bbf1b572-28ed-4a97-9b60-6ddc143bae73-config-data\") pod \"glance-default-external-api-0\" (UID: \"bbf1b572-28ed-4a97-9b60-6ddc143bae73\") " pod="openstack/glance-default-external-api-0" Mar 20 08:45:10 crc kubenswrapper[4903]: I0320 08:45:10.445809 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/332e29b8-d365-4ff1-95df-53eaf60e156c-logs\") pod \"glance-default-internal-api-0\" (UID: \"332e29b8-d365-4ff1-95df-53eaf60e156c\") " pod="openstack/glance-default-internal-api-0" Mar 20 08:45:10 crc kubenswrapper[4903]: I0320 08:45:10.445834 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbf1b572-28ed-4a97-9b60-6ddc143bae73-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"bbf1b572-28ed-4a97-9b60-6ddc143bae73\") " pod="openstack/glance-default-external-api-0" Mar 20 08:45:10 crc kubenswrapper[4903]: I0320 08:45:10.445866 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/332e29b8-d365-4ff1-95df-53eaf60e156c-scripts\") pod \"glance-default-internal-api-0\" (UID: \"332e29b8-d365-4ff1-95df-53eaf60e156c\") " pod="openstack/glance-default-internal-api-0" Mar 20 08:45:10 crc kubenswrapper[4903]: I0320 08:45:10.445910 4903 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"bbf1b572-28ed-4a97-9b60-6ddc143bae73\") " pod="openstack/glance-default-external-api-0" Mar 20 08:45:10 crc kubenswrapper[4903]: I0320 08:45:10.445930 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bbf1b572-28ed-4a97-9b60-6ddc143bae73-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"bbf1b572-28ed-4a97-9b60-6ddc143bae73\") " pod="openstack/glance-default-external-api-0" Mar 20 08:45:10 crc kubenswrapper[4903]: I0320 08:45:10.445950 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/332e29b8-d365-4ff1-95df-53eaf60e156c-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"332e29b8-d365-4ff1-95df-53eaf60e156c\") " pod="openstack/glance-default-internal-api-0" Mar 20 08:45:10 crc kubenswrapper[4903]: I0320 08:45:10.445992 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bbf1b572-28ed-4a97-9b60-6ddc143bae73-scripts\") pod \"glance-default-external-api-0\" (UID: \"bbf1b572-28ed-4a97-9b60-6ddc143bae73\") " pod="openstack/glance-default-external-api-0" Mar 20 08:45:10 crc kubenswrapper[4903]: I0320 08:45:10.446018 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ch7z5\" (UniqueName: \"kubernetes.io/projected/bbf1b572-28ed-4a97-9b60-6ddc143bae73-kube-api-access-ch7z5\") pod \"glance-default-external-api-0\" (UID: \"bbf1b572-28ed-4a97-9b60-6ddc143bae73\") " pod="openstack/glance-default-external-api-0" Mar 20 08:45:10 crc kubenswrapper[4903]: I0320 08:45:10.446057 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/332e29b8-d365-4ff1-95df-53eaf60e156c-config-data\") pod \"glance-default-internal-api-0\" (UID: \"332e29b8-d365-4ff1-95df-53eaf60e156c\") " pod="openstack/glance-default-internal-api-0" Mar 20 08:45:10 crc kubenswrapper[4903]: I0320 08:45:10.446178 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bbf1b572-28ed-4a97-9b60-6ddc143bae73-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"bbf1b572-28ed-4a97-9b60-6ddc143bae73\") " pod="openstack/glance-default-external-api-0" Mar 20 08:45:10 crc kubenswrapper[4903]: I0320 08:45:10.446207 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"332e29b8-d365-4ff1-95df-53eaf60e156c\") " pod="openstack/glance-default-internal-api-0" Mar 20 08:45:10 crc kubenswrapper[4903]: I0320 08:45:10.446228 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bbf1b572-28ed-4a97-9b60-6ddc143bae73-logs\") pod \"glance-default-external-api-0\" (UID: \"bbf1b572-28ed-4a97-9b60-6ddc143bae73\") " pod="openstack/glance-default-external-api-0" Mar 20 08:45:10 crc kubenswrapper[4903]: I0320 
08:45:10.553123 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bbf1b572-28ed-4a97-9b60-6ddc143bae73-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"bbf1b572-28ed-4a97-9b60-6ddc143bae73\") " pod="openstack/glance-default-external-api-0" Mar 20 08:45:10 crc kubenswrapper[4903]: I0320 08:45:10.553181 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"332e29b8-d365-4ff1-95df-53eaf60e156c\") " pod="openstack/glance-default-internal-api-0" Mar 20 08:45:10 crc kubenswrapper[4903]: I0320 08:45:10.553210 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bbf1b572-28ed-4a97-9b60-6ddc143bae73-logs\") pod \"glance-default-external-api-0\" (UID: \"bbf1b572-28ed-4a97-9b60-6ddc143bae73\") " pod="openstack/glance-default-external-api-0" Mar 20 08:45:10 crc kubenswrapper[4903]: I0320 08:45:10.553269 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/332e29b8-d365-4ff1-95df-53eaf60e156c-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"332e29b8-d365-4ff1-95df-53eaf60e156c\") " pod="openstack/glance-default-internal-api-0" Mar 20 08:45:10 crc kubenswrapper[4903]: I0320 08:45:10.553311 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/332e29b8-d365-4ff1-95df-53eaf60e156c-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"332e29b8-d365-4ff1-95df-53eaf60e156c\") " pod="openstack/glance-default-internal-api-0" Mar 20 08:45:10 crc kubenswrapper[4903]: I0320 08:45:10.553337 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mfgll\" (UniqueName: \"kubernetes.io/projected/332e29b8-d365-4ff1-95df-53eaf60e156c-kube-api-access-mfgll\") pod \"glance-default-internal-api-0\" (UID: \"332e29b8-d365-4ff1-95df-53eaf60e156c\") " pod="openstack/glance-default-internal-api-0" Mar 20 08:45:10 crc kubenswrapper[4903]: I0320 08:45:10.553357 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bbf1b572-28ed-4a97-9b60-6ddc143bae73-config-data\") pod \"glance-default-external-api-0\" (UID: \"bbf1b572-28ed-4a97-9b60-6ddc143bae73\") " pod="openstack/glance-default-external-api-0" Mar 20 08:45:10 crc kubenswrapper[4903]: I0320 08:45:10.553428 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/332e29b8-d365-4ff1-95df-53eaf60e156c-logs\") pod \"glance-default-internal-api-0\" (UID: \"332e29b8-d365-4ff1-95df-53eaf60e156c\") " pod="openstack/glance-default-internal-api-0" Mar 20 08:45:10 crc kubenswrapper[4903]: I0320 08:45:10.553467 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbf1b572-28ed-4a97-9b60-6ddc143bae73-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"bbf1b572-28ed-4a97-9b60-6ddc143bae73\") " pod="openstack/glance-default-external-api-0" Mar 20 08:45:10 crc kubenswrapper[4903]: I0320 08:45:10.553498 4903 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/332e29b8-d365-4ff1-95df-53eaf60e156c-scripts\") pod \"glance-default-internal-api-0\" (UID: \"332e29b8-d365-4ff1-95df-53eaf60e156c\") " pod="openstack/glance-default-internal-api-0" Mar 20 08:45:10 crc kubenswrapper[4903]: I0320 08:45:10.553549 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"bbf1b572-28ed-4a97-9b60-6ddc143bae73\") " pod="openstack/glance-default-external-api-0" Mar 20 08:45:10 crc kubenswrapper[4903]: I0320 08:45:10.553566 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bbf1b572-28ed-4a97-9b60-6ddc143bae73-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"bbf1b572-28ed-4a97-9b60-6ddc143bae73\") " pod="openstack/glance-default-external-api-0" Mar 20 08:45:10 crc kubenswrapper[4903]: I0320 08:45:10.553585 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/332e29b8-d365-4ff1-95df-53eaf60e156c-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"332e29b8-d365-4ff1-95df-53eaf60e156c\") " pod="openstack/glance-default-internal-api-0" Mar 20 08:45:10 crc kubenswrapper[4903]: I0320 08:45:10.553696 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bbf1b572-28ed-4a97-9b60-6ddc143bae73-scripts\") pod \"glance-default-external-api-0\" (UID: \"bbf1b572-28ed-4a97-9b60-6ddc143bae73\") " pod="openstack/glance-default-external-api-0" Mar 20 08:45:10 crc kubenswrapper[4903]: I0320 08:45:10.553721 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ch7z5\" (UniqueName: \"kubernetes.io/projected/bbf1b572-28ed-4a97-9b60-6ddc143bae73-kube-api-access-ch7z5\") pod \"glance-default-external-api-0\" (UID: \"bbf1b572-28ed-4a97-9b60-6ddc143bae73\") " pod="openstack/glance-default-external-api-0" Mar 20 08:45:10 crc kubenswrapper[4903]: I0320 08:45:10.553742 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/332e29b8-d365-4ff1-95df-53eaf60e156c-config-data\") pod \"glance-default-internal-api-0\" (UID: \"332e29b8-d365-4ff1-95df-53eaf60e156c\") " pod="openstack/glance-default-internal-api-0" Mar 20 08:45:10 crc kubenswrapper[4903]: I0320 08:45:10.555442 4903 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"332e29b8-d365-4ff1-95df-53eaf60e156c\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/glance-default-internal-api-0" Mar 20 08:45:10 crc kubenswrapper[4903]: I0320 08:45:10.557418 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-f4rg2"] Mar 20 08:45:10 crc kubenswrapper[4903]: I0320 08:45:10.557589 4903 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"bbf1b572-28ed-4a97-9b60-6ddc143bae73\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/glance-default-external-api-0" Mar 20 08:45:10 crc kubenswrapper[4903]: I0320 
08:45:10.566644 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bbf1b572-28ed-4a97-9b60-6ddc143bae73-logs\") pod \"glance-default-external-api-0\" (UID: \"bbf1b572-28ed-4a97-9b60-6ddc143bae73\") " pod="openstack/glance-default-external-api-0" Mar 20 08:45:10 crc kubenswrapper[4903]: I0320 08:45:10.568902 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bbf1b572-28ed-4a97-9b60-6ddc143bae73-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"bbf1b572-28ed-4a97-9b60-6ddc143bae73\") " pod="openstack/glance-default-external-api-0" Mar 20 08:45:10 crc kubenswrapper[4903]: I0320 08:45:10.570766 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/332e29b8-d365-4ff1-95df-53eaf60e156c-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"332e29b8-d365-4ff1-95df-53eaf60e156c\") " pod="openstack/glance-default-internal-api-0" Mar 20 08:45:10 crc kubenswrapper[4903]: I0320 08:45:10.582449 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/332e29b8-d365-4ff1-95df-53eaf60e156c-logs\") pod \"glance-default-internal-api-0\" (UID: \"332e29b8-d365-4ff1-95df-53eaf60e156c\") " pod="openstack/glance-default-internal-api-0" Mar 20 08:45:10 crc kubenswrapper[4903]: I0320 08:45:10.592830 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bbf1b572-28ed-4a97-9b60-6ddc143bae73-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"bbf1b572-28ed-4a97-9b60-6ddc143bae73\") " pod="openstack/glance-default-external-api-0" Mar 20 08:45:10 crc kubenswrapper[4903]: I0320 08:45:10.599189 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bbf1b572-28ed-4a97-9b60-6ddc143bae73-scripts\") pod \"glance-default-external-api-0\" (UID: \"bbf1b572-28ed-4a97-9b60-6ddc143bae73\") " pod="openstack/glance-default-external-api-0" Mar 20 08:45:10 crc kubenswrapper[4903]: I0320 08:45:10.599907 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/332e29b8-d365-4ff1-95df-53eaf60e156c-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"332e29b8-d365-4ff1-95df-53eaf60e156c\") " pod="openstack/glance-default-internal-api-0" Mar 20 08:45:10 crc kubenswrapper[4903]: I0320 08:45:10.606590 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bbf1b572-28ed-4a97-9b60-6ddc143bae73-config-data\") pod \"glance-default-external-api-0\" (UID: \"bbf1b572-28ed-4a97-9b60-6ddc143bae73\") " pod="openstack/glance-default-external-api-0" Mar 20 08:45:10 crc kubenswrapper[4903]: I0320 08:45:10.615609 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbf1b572-28ed-4a97-9b60-6ddc143bae73-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"bbf1b572-28ed-4a97-9b60-6ddc143bae73\") " pod="openstack/glance-default-external-api-0" Mar 20 08:45:10 crc kubenswrapper[4903]: I0320 08:45:10.628483 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/332e29b8-d365-4ff1-95df-53eaf60e156c-config-data\") pod \"glance-default-internal-api-0\" (UID: \"332e29b8-d365-4ff1-95df-53eaf60e156c\") " pod="openstack/glance-default-internal-api-0" Mar 20 08:45:10 crc kubenswrapper[4903]: I0320 08:45:10.629739 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/332e29b8-d365-4ff1-95df-53eaf60e156c-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"332e29b8-d365-4ff1-95df-53eaf60e156c\") " pod="openstack/glance-default-internal-api-0" Mar 20 08:45:10 crc kubenswrapper[4903]: I0320 08:45:10.633933 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mfgll\" (UniqueName: \"kubernetes.io/projected/332e29b8-d365-4ff1-95df-53eaf60e156c-kube-api-access-mfgll\") pod \"glance-default-internal-api-0\" (UID: \"332e29b8-d365-4ff1-95df-53eaf60e156c\") " pod="openstack/glance-default-internal-api-0" Mar 20 08:45:10 crc kubenswrapper[4903]: I0320 08:45:10.650835 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/332e29b8-d365-4ff1-95df-53eaf60e156c-scripts\") pod \"glance-default-internal-api-0\" (UID: \"332e29b8-d365-4ff1-95df-53eaf60e156c\") " pod="openstack/glance-default-internal-api-0" Mar 20 08:45:10 crc kubenswrapper[4903]: I0320 08:45:10.658153 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ch7z5\" (UniqueName: \"kubernetes.io/projected/bbf1b572-28ed-4a97-9b60-6ddc143bae73-kube-api-access-ch7z5\") pod \"glance-default-external-api-0\" (UID: \"bbf1b572-28ed-4a97-9b60-6ddc143bae73\") " pod="openstack/glance-default-external-api-0" Mar 20 08:45:10 crc kubenswrapper[4903]: I0320 08:45:10.675987 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-gcq77"] Mar 20 08:45:10 crc kubenswrapper[4903]: I0320 08:45:10.677212 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"bbf1b572-28ed-4a97-9b60-6ddc143bae73\") " pod="openstack/glance-default-external-api-0" Mar 20 08:45:10 crc kubenswrapper[4903]: I0320 08:45:10.697234 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"332e29b8-d365-4ff1-95df-53eaf60e156c\") " pod="openstack/glance-default-internal-api-0" Mar 20 08:45:10 crc kubenswrapper[4903]: I0320 08:45:10.763234 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-hzcp9" event={"ID":"0b5652ae-2905-486a-897b-27c2b70bab5d","Type":"ContainerStarted","Data":"45d2c3adc326d7d8f94889db4b80476031f46f92611cb43a0f7124e4b312f5ff"} Mar 20 08:45:10 crc kubenswrapper[4903]: I0320 08:45:10.767345 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-ntkfl"] Mar 20 08:45:10 crc kubenswrapper[4903]: I0320 08:45:10.770130 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-f4rg2" event={"ID":"6e210f8c-e29d-442c-a5eb-ec6b639b0275","Type":"ContainerStarted","Data":"b87dcd8ea23a18ff428c9a97b4ca1d761b85d9e4ce0359216dcf159ca440b6e2"} Mar 20 08:45:10 crc kubenswrapper[4903]: I0320 08:45:10.771413 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-847c4cc679-gcq77" event={"ID":"d8c9a0bd-5bbd-4422-8162-b5f680ee1c7a","Type":"ContainerStarted","Data":"c0fe9fd6d3291c26f9f82aeebdb0a7a87d7a6dfa6d1f19913f6ae8c90ff8ba7c"} Mar 20 08:45:10 crc kubenswrapper[4903]: I0320 08:45:10.808968 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 08:45:10 crc kubenswrapper[4903]: I0320 08:45:10.822687 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 20 08:45:10 crc kubenswrapper[4903]: I0320 08:45:10.831016 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 20 08:45:10 crc kubenswrapper[4903]: I0320 08:45:10.939424 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-nl9ms"] Mar 20 08:45:10 crc kubenswrapper[4903]: I0320 08:45:10.953381 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-xws82"] Mar 20 08:45:11 crc kubenswrapper[4903]: I0320 08:45:11.119289 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-d8wd9"] Mar 20 08:45:11 crc kubenswrapper[4903]: I0320 08:45:11.486941 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 08:45:11 crc kubenswrapper[4903]: I0320 08:45:11.791004 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-d8wd9" event={"ID":"e64f54bd-2813-41dd-86e6-9836da200d1c","Type":"ContainerStarted","Data":"4f489dcee69a113289e6c636cdf34b9bb86247d0274a95e46fc1731cbaeacb45"} Mar 20 08:45:11 crc kubenswrapper[4903]: I0320 08:45:11.791092 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-d8wd9" event={"ID":"e64f54bd-2813-41dd-86e6-9836da200d1c","Type":"ContainerStarted","Data":"76a4774f95f98ed62297c376206d5783ba9e0c4149b8181875ce05fe96c28d2e"} Mar 20 08:45:11 crc kubenswrapper[4903]: I0320 08:45:11.797128 4903 generic.go:334] "Generic (PLEG): container finished" podID="d8c9a0bd-5bbd-4422-8162-b5f680ee1c7a" containerID="e88d0811b25d8a3f6f177d5c94c1870220acba5fdb8a6e3b9d20946d6ef5f1fd" exitCode=0 Mar 20 08:45:11 crc kubenswrapper[4903]: I0320 08:45:11.797288 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-847c4cc679-gcq77" event={"ID":"d8c9a0bd-5bbd-4422-8162-b5f680ee1c7a","Type":"ContainerDied","Data":"e88d0811b25d8a3f6f177d5c94c1870220acba5fdb8a6e3b9d20946d6ef5f1fd"} Mar 20 08:45:11 crc kubenswrapper[4903]: I0320 08:45:11.802825 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"bbf1b572-28ed-4a97-9b60-6ddc143bae73","Type":"ContainerStarted","Data":"ee062e9b8cbd7be779e17d127dddbf6d6113a4ee8fda5b7df1050043eea7c8eb"} Mar 20 08:45:11 crc kubenswrapper[4903]: I0320 08:45:11.810477 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-xws82" event={"ID":"be07fcc9-d6b5-4551-8846-94aa14b6af5d","Type":"ContainerStarted","Data":"5480dc9a3a875b7c03b240c859141e92173003eede89c918d80eb10ff67f9ffe"} Mar 20 08:45:11 crc kubenswrapper[4903]: I0320 08:45:11.814974 4903 generic.go:334] "Generic (PLEG): container finished" podID="b95e8341-9f65-461f-891b-ec6512be57f7" containerID="e0f512692c5d2627130fb72028dde3138ce12d203c6f16dd5bb9fe2db0ecf739" exitCode=0 Mar 20 08:45:11 crc kubenswrapper[4903]: I0320 08:45:11.815245 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-785d8bcb8c-nl9ms" event={"ID":"b95e8341-9f65-461f-891b-ec6512be57f7","Type":"ContainerDied","Data":"e0f512692c5d2627130fb72028dde3138ce12d203c6f16dd5bb9fe2db0ecf739"} Mar 20 08:45:11 crc kubenswrapper[4903]: I0320 08:45:11.815272 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-nl9ms" event={"ID":"b95e8341-9f65-461f-891b-ec6512be57f7","Type":"ContainerStarted","Data":"6fbdff3a1b304d2fc79262f059cb1b0d998e9205aa0ca07bd62ea57418425d9f"} Mar 20 08:45:11 crc kubenswrapper[4903]: I0320 08:45:11.816991 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-d8wd9" podStartSLOduration=2.816970502 podStartE2EDuration="2.816970502s" podCreationTimestamp="2026-03-20 08:45:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:45:11.812463483 +0000 UTC m=+1337.029363788" watchObservedRunningTime="2026-03-20 08:45:11.816970502 +0000 UTC m=+1337.033870817" Mar 20 08:45:11 crc kubenswrapper[4903]: I0320 08:45:11.819150 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-ntkfl" event={"ID":"cddb5fee-92f5-463f-a746-8a58e0a05e4b","Type":"ContainerStarted","Data":"093eca01661775c1cb2b425f19468c01508af52da1982590f7e538f20fb8093e"} Mar 20 08:45:11 crc kubenswrapper[4903]: I0320 08:45:11.823124 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-hzcp9" event={"ID":"0b5652ae-2905-486a-897b-27c2b70bab5d","Type":"ContainerStarted","Data":"f63032427faa40de3f4e75fa0bf189cc5949d6ee6ff43b3dc51e606412311d6f"} Mar 20 08:45:11 crc kubenswrapper[4903]: I0320 08:45:11.841528 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"003c0ace-6aef-4bc2-bc02-358cf140d4ce","Type":"ContainerStarted","Data":"657b5af52a5448870f3b62e12d359dc18bbcffdb6f93c897dcf113e70b47bccf"} Mar 20 08:45:11 crc kubenswrapper[4903]: I0320 08:45:11.852993 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-hzcp9" podStartSLOduration=2.852973485 podStartE2EDuration="2.852973485s" podCreationTimestamp="2026-03-20 08:45:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:45:11.846684383 +0000 UTC m=+1337.063584698" watchObservedRunningTime="2026-03-20 08:45:11.852973485 +0000 UTC m=+1337.069873800" Mar 20 08:45:12 crc kubenswrapper[4903]: I0320 08:45:12.248855 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 08:45:12 crc kubenswrapper[4903]: I0320 08:45:12.276933 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-gcq77" Mar 20 08:45:12 crc kubenswrapper[4903]: I0320 08:45:12.342432 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 08:45:12 crc kubenswrapper[4903]: I0320 08:45:12.359542 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 08:45:12 crc kubenswrapper[4903]: I0320 08:45:12.404854 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xbnh8\" (UniqueName: \"kubernetes.io/projected/d8c9a0bd-5bbd-4422-8162-b5f680ee1c7a-kube-api-access-xbnh8\") pod \"d8c9a0bd-5bbd-4422-8162-b5f680ee1c7a\" (UID: \"d8c9a0bd-5bbd-4422-8162-b5f680ee1c7a\") " Mar 20 08:45:12 crc kubenswrapper[4903]: I0320 08:45:12.404915 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d8c9a0bd-5bbd-4422-8162-b5f680ee1c7a-dns-swift-storage-0\") pod \"d8c9a0bd-5bbd-4422-8162-b5f680ee1c7a\" (UID: \"d8c9a0bd-5bbd-4422-8162-b5f680ee1c7a\") " Mar 20 08:45:12 crc kubenswrapper[4903]: I0320 08:45:12.405088 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d8c9a0bd-5bbd-4422-8162-b5f680ee1c7a-config\") pod \"d8c9a0bd-5bbd-4422-8162-b5f680ee1c7a\" (UID: \"d8c9a0bd-5bbd-4422-8162-b5f680ee1c7a\") " Mar 20 08:45:12 crc kubenswrapper[4903]: I0320 08:45:12.405115 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d8c9a0bd-5bbd-4422-8162-b5f680ee1c7a-ovsdbserver-nb\") pod \"d8c9a0bd-5bbd-4422-8162-b5f680ee1c7a\" (UID: \"d8c9a0bd-5bbd-4422-8162-b5f680ee1c7a\") " Mar 20 08:45:12 crc kubenswrapper[4903]: I0320 08:45:12.405152 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d8c9a0bd-5bbd-4422-8162-b5f680ee1c7a-ovsdbserver-sb\") pod \"d8c9a0bd-5bbd-4422-8162-b5f680ee1c7a\" (UID: \"d8c9a0bd-5bbd-4422-8162-b5f680ee1c7a\") " Mar 20 08:45:12 crc kubenswrapper[4903]: I0320 08:45:12.405193 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d8c9a0bd-5bbd-4422-8162-b5f680ee1c7a-dns-svc\") pod \"d8c9a0bd-5bbd-4422-8162-b5f680ee1c7a\" (UID: \"d8c9a0bd-5bbd-4422-8162-b5f680ee1c7a\") " Mar 20 08:45:12 crc kubenswrapper[4903]: I0320 08:45:12.452417 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8c9a0bd-5bbd-4422-8162-b5f680ee1c7a-kube-api-access-xbnh8" (OuterVolumeSpecName: "kube-api-access-xbnh8") pod "d8c9a0bd-5bbd-4422-8162-b5f680ee1c7a" (UID: "d8c9a0bd-5bbd-4422-8162-b5f680ee1c7a"). InnerVolumeSpecName "kube-api-access-xbnh8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:45:12 crc kubenswrapper[4903]: I0320 08:45:12.456717 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d8c9a0bd-5bbd-4422-8162-b5f680ee1c7a-config" (OuterVolumeSpecName: "config") pod "d8c9a0bd-5bbd-4422-8162-b5f680ee1c7a" (UID: "d8c9a0bd-5bbd-4422-8162-b5f680ee1c7a"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:45:12 crc kubenswrapper[4903]: I0320 08:45:12.483414 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d8c9a0bd-5bbd-4422-8162-b5f680ee1c7a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d8c9a0bd-5bbd-4422-8162-b5f680ee1c7a" (UID: "d8c9a0bd-5bbd-4422-8162-b5f680ee1c7a"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:45:12 crc kubenswrapper[4903]: I0320 08:45:12.485852 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d8c9a0bd-5bbd-4422-8162-b5f680ee1c7a-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "d8c9a0bd-5bbd-4422-8162-b5f680ee1c7a" (UID: "d8c9a0bd-5bbd-4422-8162-b5f680ee1c7a"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:45:12 crc kubenswrapper[4903]: I0320 08:45:12.486967 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d8c9a0bd-5bbd-4422-8162-b5f680ee1c7a-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "d8c9a0bd-5bbd-4422-8162-b5f680ee1c7a" (UID: "d8c9a0bd-5bbd-4422-8162-b5f680ee1c7a"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:45:12 crc kubenswrapper[4903]: I0320 08:45:12.503984 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d8c9a0bd-5bbd-4422-8162-b5f680ee1c7a-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "d8c9a0bd-5bbd-4422-8162-b5f680ee1c7a" (UID: "d8c9a0bd-5bbd-4422-8162-b5f680ee1c7a"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:45:12 crc kubenswrapper[4903]: I0320 08:45:12.511013 4903 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d8c9a0bd-5bbd-4422-8162-b5f680ee1c7a-config\") on node \"crc\" DevicePath \"\"" Mar 20 08:45:12 crc kubenswrapper[4903]: I0320 08:45:12.511152 4903 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d8c9a0bd-5bbd-4422-8162-b5f680ee1c7a-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 20 08:45:12 crc kubenswrapper[4903]: I0320 08:45:12.511165 4903 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d8c9a0bd-5bbd-4422-8162-b5f680ee1c7a-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 20 08:45:12 crc kubenswrapper[4903]: I0320 08:45:12.511174 4903 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d8c9a0bd-5bbd-4422-8162-b5f680ee1c7a-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 08:45:12 crc kubenswrapper[4903]: I0320 08:45:12.511187 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xbnh8\" (UniqueName: \"kubernetes.io/projected/d8c9a0bd-5bbd-4422-8162-b5f680ee1c7a-kube-api-access-xbnh8\") on node \"crc\" DevicePath \"\"" Mar 20 08:45:12 crc kubenswrapper[4903]: I0320 08:45:12.511214 4903 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d8c9a0bd-5bbd-4422-8162-b5f680ee1c7a-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 20 08:45:12 crc kubenswrapper[4903]: I0320 08:45:12.535806 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/glance-default-internal-api-0"] Mar 20 08:45:12 crc kubenswrapper[4903]: I0320 08:45:12.877005 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-847c4cc679-gcq77" event={"ID":"d8c9a0bd-5bbd-4422-8162-b5f680ee1c7a","Type":"ContainerDied","Data":"c0fe9fd6d3291c26f9f82aeebdb0a7a87d7a6dfa6d1f19913f6ae8c90ff8ba7c"} Mar 20 08:45:12 crc kubenswrapper[4903]: I0320 08:45:12.877361 4903 scope.go:117] "RemoveContainer" containerID="e88d0811b25d8a3f6f177d5c94c1870220acba5fdb8a6e3b9d20946d6ef5f1fd" Mar 20 08:45:12 crc kubenswrapper[4903]: I0320 08:45:12.877021 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-gcq77" Mar 20 08:45:12 crc kubenswrapper[4903]: I0320 08:45:12.908718 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"bbf1b572-28ed-4a97-9b60-6ddc143bae73","Type":"ContainerStarted","Data":"b87946e7b95c0cbb432479ee2394896543f018b5b04a522866d75dcac1db9682"} Mar 20 08:45:12 crc kubenswrapper[4903]: I0320 08:45:12.954077 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-gcq77"] Mar 20 08:45:12 crc kubenswrapper[4903]: I0320 08:45:12.964243 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-gcq77"] Mar 20 08:45:12 crc kubenswrapper[4903]: I0320 08:45:12.964957 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-nl9ms" event={"ID":"b95e8341-9f65-461f-891b-ec6512be57f7","Type":"ContainerStarted","Data":"49c1a0f1b9ba0addd74e68afa439b435d66004b7b86a4a212551513668453686"} Mar 20 08:45:12 crc kubenswrapper[4903]: I0320 08:45:12.965900 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-785d8bcb8c-nl9ms" Mar 20 08:45:12 crc kubenswrapper[4903]: I0320 08:45:12.966995 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"332e29b8-d365-4ff1-95df-53eaf60e156c","Type":"ContainerStarted","Data":"daa42e1853c6ef754996dc5659490c63f96f564441924aa0dc458dd5ec91a125"} Mar 20 08:45:12 crc kubenswrapper[4903]: I0320 08:45:12.989302 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-785d8bcb8c-nl9ms" podStartSLOduration=3.989282557 podStartE2EDuration="3.989282557s" podCreationTimestamp="2026-03-20 08:45:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:45:12.978999337 +0000 UTC m=+1338.195899652" watchObservedRunningTime="2026-03-20 08:45:12.989282557 +0000 UTC m=+1338.206182872" Mar 20 08:45:13 crc kubenswrapper[4903]: I0320 08:45:13.511974 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d8c9a0bd-5bbd-4422-8162-b5f680ee1c7a" path="/var/lib/kubelet/pods/d8c9a0bd-5bbd-4422-8162-b5f680ee1c7a/volumes" Mar 20 08:45:13 crc kubenswrapper[4903]: I0320 08:45:13.990276 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"332e29b8-d365-4ff1-95df-53eaf60e156c","Type":"ContainerStarted","Data":"6d611cc60838132e03f73df67cee352686be2318ac7608a108094029b5d324fa"} Mar 20 08:45:14 crc kubenswrapper[4903]: I0320 08:45:14.003331 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"bbf1b572-28ed-4a97-9b60-6ddc143bae73","Type":"ContainerStarted","Data":"7edb85794d7de0f5024eb593cc68a0a59b22185405b47af008fe95e88836b855"} Mar 20 08:45:14 crc kubenswrapper[4903]: I0320 08:45:14.003415 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="bbf1b572-28ed-4a97-9b60-6ddc143bae73" containerName="glance-log" containerID="cri-o://b87946e7b95c0cbb432479ee2394896543f018b5b04a522866d75dcac1db9682" gracePeriod=30 Mar 20 08:45:14 crc kubenswrapper[4903]: I0320 08:45:14.004257 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="bbf1b572-28ed-4a97-9b60-6ddc143bae73" containerName="glance-httpd" containerID="cri-o://7edb85794d7de0f5024eb593cc68a0a59b22185405b47af008fe95e88836b855" gracePeriod=30 Mar 20 08:45:14 crc kubenswrapper[4903]: I0320 08:45:14.038752 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=5.038724972 podStartE2EDuration="5.038724972s" podCreationTimestamp="2026-03-20 08:45:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:45:14.023626776 +0000 UTC m=+1339.240527091" watchObservedRunningTime="2026-03-20 08:45:14.038724972 +0000 UTC m=+1339.255625287" Mar 20 08:45:15 crc kubenswrapper[4903]: I0320 08:45:15.017229 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"332e29b8-d365-4ff1-95df-53eaf60e156c","Type":"ContainerStarted","Data":"85fb607cf7600a38f9e42b0306e55aeba29589e2f87349dc457bfdd31da6c05c"} Mar 20 08:45:15 crc kubenswrapper[4903]: I0320 08:45:15.017824 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="332e29b8-d365-4ff1-95df-53eaf60e156c" containerName="glance-httpd" containerID="cri-o://85fb607cf7600a38f9e42b0306e55aeba29589e2f87349dc457bfdd31da6c05c" gracePeriod=30 Mar 20 08:45:15 crc kubenswrapper[4903]: I0320 08:45:15.017785 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="332e29b8-d365-4ff1-95df-53eaf60e156c" containerName="glance-log" containerID="cri-o://6d611cc60838132e03f73df67cee352686be2318ac7608a108094029b5d324fa" gracePeriod=30 Mar 20 08:45:15 crc kubenswrapper[4903]: I0320 08:45:15.023099 4903 generic.go:334] "Generic (PLEG): container finished" podID="bbf1b572-28ed-4a97-9b60-6ddc143bae73" containerID="7edb85794d7de0f5024eb593cc68a0a59b22185405b47af008fe95e88836b855" exitCode=0 Mar 20 08:45:15 crc kubenswrapper[4903]: I0320 08:45:15.023124 4903 generic.go:334] "Generic (PLEG): container finished" podID="bbf1b572-28ed-4a97-9b60-6ddc143bae73" containerID="b87946e7b95c0cbb432479ee2394896543f018b5b04a522866d75dcac1db9682" exitCode=143 Mar 20 08:45:15 crc kubenswrapper[4903]: I0320 08:45:15.023165 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"bbf1b572-28ed-4a97-9b60-6ddc143bae73","Type":"ContainerDied","Data":"7edb85794d7de0f5024eb593cc68a0a59b22185405b47af008fe95e88836b855"} Mar 20 08:45:15 crc kubenswrapper[4903]: I0320 08:45:15.023195 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"bbf1b572-28ed-4a97-9b60-6ddc143bae73","Type":"ContainerDied","Data":"b87946e7b95c0cbb432479ee2394896543f018b5b04a522866d75dcac1db9682"} Mar 20 08:45:15 crc kubenswrapper[4903]: I0320 08:45:15.038999 4903 generic.go:334] "Generic (PLEG): container finished" podID="0b5652ae-2905-486a-897b-27c2b70bab5d" containerID="f63032427faa40de3f4e75fa0bf189cc5949d6ee6ff43b3dc51e606412311d6f" exitCode=0 Mar 20 08:45:15 crc kubenswrapper[4903]: I0320 08:45:15.040712 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-hzcp9" event={"ID":"0b5652ae-2905-486a-897b-27c2b70bab5d","Type":"ContainerDied","Data":"f63032427faa40de3f4e75fa0bf189cc5949d6ee6ff43b3dc51e606412311d6f"} Mar 20 08:45:15 crc kubenswrapper[4903]: I0320 08:45:15.097843 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=6.097816641 podStartE2EDuration="6.097816641s" podCreationTimestamp="2026-03-20 08:45:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:45:15.042307755 +0000 UTC m=+1340.259208070" watchObservedRunningTime="2026-03-20 08:45:15.097816641 +0000 UTC m=+1340.314716956" Mar 20 08:45:16 crc kubenswrapper[4903]: I0320 08:45:16.058179 4903 generic.go:334] "Generic (PLEG): container finished" podID="332e29b8-d365-4ff1-95df-53eaf60e156c" containerID="85fb607cf7600a38f9e42b0306e55aeba29589e2f87349dc457bfdd31da6c05c" exitCode=0 Mar 20 08:45:16 crc kubenswrapper[4903]: I0320 08:45:16.058215 4903 generic.go:334] "Generic (PLEG): container finished" podID="332e29b8-d365-4ff1-95df-53eaf60e156c" containerID="6d611cc60838132e03f73df67cee352686be2318ac7608a108094029b5d324fa" exitCode=143 Mar 20 08:45:16 crc kubenswrapper[4903]: I0320 08:45:16.058371 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"332e29b8-d365-4ff1-95df-53eaf60e156c","Type":"ContainerDied","Data":"85fb607cf7600a38f9e42b0306e55aeba29589e2f87349dc457bfdd31da6c05c"} Mar 20 08:45:16 crc kubenswrapper[4903]: I0320 08:45:16.058399 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"332e29b8-d365-4ff1-95df-53eaf60e156c","Type":"ContainerDied","Data":"6d611cc60838132e03f73df67cee352686be2318ac7608a108094029b5d324fa"} Mar 20 08:45:18 crc kubenswrapper[4903]: I0320 08:45:18.727649 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 20 08:45:18 crc kubenswrapper[4903]: I0320 08:45:18.825275 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"bbf1b572-28ed-4a97-9b60-6ddc143bae73\" (UID: \"bbf1b572-28ed-4a97-9b60-6ddc143bae73\") " Mar 20 08:45:18 crc kubenswrapper[4903]: I0320 08:45:18.825348 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bbf1b572-28ed-4a97-9b60-6ddc143bae73-httpd-run\") pod \"bbf1b572-28ed-4a97-9b60-6ddc143bae73\" (UID: \"bbf1b572-28ed-4a97-9b60-6ddc143bae73\") " Mar 20 08:45:18 crc kubenswrapper[4903]: I0320 08:45:18.825465 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bbf1b572-28ed-4a97-9b60-6ddc143bae73-public-tls-certs\") pod \"bbf1b572-28ed-4a97-9b60-6ddc143bae73\" (UID: \"bbf1b572-28ed-4a97-9b60-6ddc143bae73\") " Mar 20 08:45:18 crc kubenswrapper[4903]: I0320 08:45:18.825696 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ch7z5\" (UniqueName: \"kubernetes.io/projected/bbf1b572-28ed-4a97-9b60-6ddc143bae73-kube-api-access-ch7z5\") pod \"bbf1b572-28ed-4a97-9b60-6ddc143bae73\" (UID: \"bbf1b572-28ed-4a97-9b60-6ddc143bae73\") " Mar 20 08:45:18 crc kubenswrapper[4903]: I0320 08:45:18.825843 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bbf1b572-28ed-4a97-9b60-6ddc143bae73-logs\") pod \"bbf1b572-28ed-4a97-9b60-6ddc143bae73\" (UID: \"bbf1b572-28ed-4a97-9b60-6ddc143bae73\") " Mar 20 08:45:18 crc kubenswrapper[4903]: I0320 08:45:18.825882 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bbf1b572-28ed-4a97-9b60-6ddc143bae73-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "bbf1b572-28ed-4a97-9b60-6ddc143bae73" (UID: "bbf1b572-28ed-4a97-9b60-6ddc143bae73"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 08:45:18 crc kubenswrapper[4903]: I0320 08:45:18.826980 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bbf1b572-28ed-4a97-9b60-6ddc143bae73-logs" (OuterVolumeSpecName: "logs") pod "bbf1b572-28ed-4a97-9b60-6ddc143bae73" (UID: "bbf1b572-28ed-4a97-9b60-6ddc143bae73"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 08:45:18 crc kubenswrapper[4903]: I0320 08:45:18.825879 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbf1b572-28ed-4a97-9b60-6ddc143bae73-combined-ca-bundle\") pod \"bbf1b572-28ed-4a97-9b60-6ddc143bae73\" (UID: \"bbf1b572-28ed-4a97-9b60-6ddc143bae73\") " Mar 20 08:45:18 crc kubenswrapper[4903]: I0320 08:45:18.827164 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bbf1b572-28ed-4a97-9b60-6ddc143bae73-scripts\") pod \"bbf1b572-28ed-4a97-9b60-6ddc143bae73\" (UID: \"bbf1b572-28ed-4a97-9b60-6ddc143bae73\") " Mar 20 08:45:18 crc kubenswrapper[4903]: I0320 08:45:18.827214 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bbf1b572-28ed-4a97-9b60-6ddc143bae73-config-data\") pod \"bbf1b572-28ed-4a97-9b60-6ddc143bae73\" (UID: \"bbf1b572-28ed-4a97-9b60-6ddc143bae73\") " Mar 20 08:45:18 crc kubenswrapper[4903]: I0320 08:45:18.827955 4903 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bbf1b572-28ed-4a97-9b60-6ddc143bae73-logs\") on node \"crc\" DevicePath \"\"" Mar 20 08:45:18 crc kubenswrapper[4903]: I0320 08:45:18.827976 4903 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bbf1b572-28ed-4a97-9b60-6ddc143bae73-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 20 08:45:18 crc kubenswrapper[4903]: I0320 08:45:18.833132 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bbf1b572-28ed-4a97-9b60-6ddc143bae73-kube-api-access-ch7z5" (OuterVolumeSpecName: "kube-api-access-ch7z5") pod "bbf1b572-28ed-4a97-9b60-6ddc143bae73" (UID: "bbf1b572-28ed-4a97-9b60-6ddc143bae73"). InnerVolumeSpecName "kube-api-access-ch7z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:45:18 crc kubenswrapper[4903]: I0320 08:45:18.846886 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "glance") pod "bbf1b572-28ed-4a97-9b60-6ddc143bae73" (UID: "bbf1b572-28ed-4a97-9b60-6ddc143bae73"). InnerVolumeSpecName "local-storage08-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 20 08:45:18 crc kubenswrapper[4903]: I0320 08:45:18.847124 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bbf1b572-28ed-4a97-9b60-6ddc143bae73-scripts" (OuterVolumeSpecName: "scripts") pod "bbf1b572-28ed-4a97-9b60-6ddc143bae73" (UID: "bbf1b572-28ed-4a97-9b60-6ddc143bae73"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:45:18 crc kubenswrapper[4903]: I0320 08:45:18.869126 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bbf1b572-28ed-4a97-9b60-6ddc143bae73-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bbf1b572-28ed-4a97-9b60-6ddc143bae73" (UID: "bbf1b572-28ed-4a97-9b60-6ddc143bae73"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:45:18 crc kubenswrapper[4903]: I0320 08:45:18.885330 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bbf1b572-28ed-4a97-9b60-6ddc143bae73-config-data" (OuterVolumeSpecName: "config-data") pod "bbf1b572-28ed-4a97-9b60-6ddc143bae73" (UID: "bbf1b572-28ed-4a97-9b60-6ddc143bae73"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:45:18 crc kubenswrapper[4903]: I0320 08:45:18.913946 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bbf1b572-28ed-4a97-9b60-6ddc143bae73-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "bbf1b572-28ed-4a97-9b60-6ddc143bae73" (UID: "bbf1b572-28ed-4a97-9b60-6ddc143bae73"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:45:18 crc kubenswrapper[4903]: I0320 08:45:18.930657 4903 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Mar 20 08:45:18 crc kubenswrapper[4903]: I0320 08:45:18.930723 4903 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bbf1b572-28ed-4a97-9b60-6ddc143bae73-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 08:45:18 crc kubenswrapper[4903]: I0320 08:45:18.930747 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ch7z5\" (UniqueName: \"kubernetes.io/projected/bbf1b572-28ed-4a97-9b60-6ddc143bae73-kube-api-access-ch7z5\") on node \"crc\" DevicePath \"\"" Mar 20 08:45:18 crc kubenswrapper[4903]: I0320 08:45:18.930761 4903 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbf1b572-28ed-4a97-9b60-6ddc143bae73-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 08:45:18 crc kubenswrapper[4903]: I0320 08:45:18.930773 4903 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bbf1b572-28ed-4a97-9b60-6ddc143bae73-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 08:45:18 crc kubenswrapper[4903]: I0320 08:45:18.930784 4903 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bbf1b572-28ed-4a97-9b60-6ddc143bae73-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 08:45:18 crc kubenswrapper[4903]: I0320 08:45:18.956588 4903 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Mar 20 08:45:19 crc kubenswrapper[4903]: I0320 08:45:19.033539 4903 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Mar 20 08:45:19 crc kubenswrapper[4903]: I0320 08:45:19.087727 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"bbf1b572-28ed-4a97-9b60-6ddc143bae73","Type":"ContainerDied","Data":"ee062e9b8cbd7be779e17d127dddbf6d6113a4ee8fda5b7df1050043eea7c8eb"} Mar 20 08:45:19 crc kubenswrapper[4903]: I0320 08:45:19.087801 4903 scope.go:117] "RemoveContainer" containerID="7edb85794d7de0f5024eb593cc68a0a59b22185405b47af008fe95e88836b855" Mar 20 08:45:19 crc kubenswrapper[4903]: I0320 08:45:19.087800 4903 util.go:48] 
"No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 20 08:45:19 crc kubenswrapper[4903]: I0320 08:45:19.124421 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 08:45:19 crc kubenswrapper[4903]: I0320 08:45:19.135878 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 08:45:19 crc kubenswrapper[4903]: I0320 08:45:19.155719 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 08:45:19 crc kubenswrapper[4903]: E0320 08:45:19.156235 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bbf1b572-28ed-4a97-9b60-6ddc143bae73" containerName="glance-httpd" Mar 20 08:45:19 crc kubenswrapper[4903]: I0320 08:45:19.156258 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="bbf1b572-28ed-4a97-9b60-6ddc143bae73" containerName="glance-httpd" Mar 20 08:45:19 crc kubenswrapper[4903]: E0320 08:45:19.156286 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bbf1b572-28ed-4a97-9b60-6ddc143bae73" containerName="glance-log" Mar 20 08:45:19 crc kubenswrapper[4903]: I0320 08:45:19.156294 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="bbf1b572-28ed-4a97-9b60-6ddc143bae73" containerName="glance-log" Mar 20 08:45:19 crc kubenswrapper[4903]: E0320 08:45:19.156321 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8c9a0bd-5bbd-4422-8162-b5f680ee1c7a" containerName="init" Mar 20 08:45:19 crc kubenswrapper[4903]: I0320 08:45:19.156328 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8c9a0bd-5bbd-4422-8162-b5f680ee1c7a" containerName="init" Mar 20 08:45:19 crc kubenswrapper[4903]: I0320 08:45:19.156553 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="bbf1b572-28ed-4a97-9b60-6ddc143bae73" containerName="glance-httpd" Mar 20 08:45:19 crc kubenswrapper[4903]: I0320 08:45:19.156576 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8c9a0bd-5bbd-4422-8162-b5f680ee1c7a" containerName="init" Mar 20 08:45:19 crc kubenswrapper[4903]: I0320 08:45:19.156587 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="bbf1b572-28ed-4a97-9b60-6ddc143bae73" containerName="glance-log" Mar 20 08:45:19 crc kubenswrapper[4903]: I0320 08:45:19.157689 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 20 08:45:19 crc kubenswrapper[4903]: I0320 08:45:19.162541 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 20 08:45:19 crc kubenswrapper[4903]: I0320 08:45:19.162767 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 20 08:45:19 crc kubenswrapper[4903]: I0320 08:45:19.171811 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 08:45:19 crc kubenswrapper[4903]: I0320 08:45:19.237072 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e761f2dc-041e-4811-802b-2a8e8c376381-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"e761f2dc-041e-4811-802b-2a8e8c376381\") " pod="openstack/glance-default-external-api-0" Mar 20 08:45:19 crc kubenswrapper[4903]: I0320 08:45:19.237605 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e761f2dc-041e-4811-802b-2a8e8c376381-config-data\") pod \"glance-default-external-api-0\" (UID: \"e761f2dc-041e-4811-802b-2a8e8c376381\") " pod="openstack/glance-default-external-api-0" Mar 20 08:45:19 crc kubenswrapper[4903]: I0320 08:45:19.237711 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e761f2dc-041e-4811-802b-2a8e8c376381-logs\") pod \"glance-default-external-api-0\" (UID: \"e761f2dc-041e-4811-802b-2a8e8c376381\") " pod="openstack/glance-default-external-api-0" Mar 20 08:45:19 crc kubenswrapper[4903]: I0320 08:45:19.237750 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e761f2dc-041e-4811-802b-2a8e8c376381-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"e761f2dc-041e-4811-802b-2a8e8c376381\") " pod="openstack/glance-default-external-api-0" Mar 20 08:45:19 crc kubenswrapper[4903]: I0320 08:45:19.237973 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"e761f2dc-041e-4811-802b-2a8e8c376381\") " pod="openstack/glance-default-external-api-0" Mar 20 08:45:19 crc kubenswrapper[4903]: I0320 08:45:19.238023 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4fktf\" (UniqueName: \"kubernetes.io/projected/e761f2dc-041e-4811-802b-2a8e8c376381-kube-api-access-4fktf\") pod \"glance-default-external-api-0\" (UID: \"e761f2dc-041e-4811-802b-2a8e8c376381\") " pod="openstack/glance-default-external-api-0" Mar 20 08:45:19 crc kubenswrapper[4903]: I0320 08:45:19.238090 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e761f2dc-041e-4811-802b-2a8e8c376381-scripts\") pod \"glance-default-external-api-0\" (UID: \"e761f2dc-041e-4811-802b-2a8e8c376381\") " pod="openstack/glance-default-external-api-0" Mar 20 08:45:19 crc kubenswrapper[4903]: I0320 08:45:19.238111 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e761f2dc-041e-4811-802b-2a8e8c376381-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"e761f2dc-041e-4811-802b-2a8e8c376381\") " pod="openstack/glance-default-external-api-0" Mar 20 08:45:19 crc kubenswrapper[4903]: I0320 08:45:19.341518 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"e761f2dc-041e-4811-802b-2a8e8c376381\") " pod="openstack/glance-default-external-api-0" Mar 20 08:45:19 crc kubenswrapper[4903]: I0320 08:45:19.341595 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4fktf\" (UniqueName: \"kubernetes.io/projected/e761f2dc-041e-4811-802b-2a8e8c376381-kube-api-access-4fktf\") pod \"glance-default-external-api-0\" (UID: \"e761f2dc-041e-4811-802b-2a8e8c376381\") " pod="openstack/glance-default-external-api-0" Mar 20 08:45:19 crc kubenswrapper[4903]: I0320 08:45:19.341635 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e761f2dc-041e-4811-802b-2a8e8c376381-scripts\") pod \"glance-default-external-api-0\" (UID: \"e761f2dc-041e-4811-802b-2a8e8c376381\") " pod="openstack/glance-default-external-api-0" Mar 20 08:45:19 crc kubenswrapper[4903]: I0320 08:45:19.341660 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e761f2dc-041e-4811-802b-2a8e8c376381-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"e761f2dc-041e-4811-802b-2a8e8c376381\") " pod="openstack/glance-default-external-api-0" Mar 20 08:45:19 crc kubenswrapper[4903]: I0320 08:45:19.341695 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e761f2dc-041e-4811-802b-2a8e8c376381-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"e761f2dc-041e-4811-802b-2a8e8c376381\") " pod="openstack/glance-default-external-api-0" Mar 20 08:45:19 crc kubenswrapper[4903]: I0320 08:45:19.341722 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e761f2dc-041e-4811-802b-2a8e8c376381-config-data\") pod \"glance-default-external-api-0\" (UID: \"e761f2dc-041e-4811-802b-2a8e8c376381\") " pod="openstack/glance-default-external-api-0" Mar 20 08:45:19 crc kubenswrapper[4903]: I0320 08:45:19.341775 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e761f2dc-041e-4811-802b-2a8e8c376381-logs\") pod \"glance-default-external-api-0\" (UID: \"e761f2dc-041e-4811-802b-2a8e8c376381\") " pod="openstack/glance-default-external-api-0" Mar 20 08:45:19 crc kubenswrapper[4903]: I0320 08:45:19.341806 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e761f2dc-041e-4811-802b-2a8e8c376381-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"e761f2dc-041e-4811-802b-2a8e8c376381\") " pod="openstack/glance-default-external-api-0" Mar 20 08:45:19 crc kubenswrapper[4903]: I0320 08:45:19.342064 4903 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"e761f2dc-041e-4811-802b-2a8e8c376381\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/glance-default-external-api-0" Mar 20 08:45:19 crc kubenswrapper[4903]: I0320 08:45:19.342992 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e761f2dc-041e-4811-802b-2a8e8c376381-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"e761f2dc-041e-4811-802b-2a8e8c376381\") " pod="openstack/glance-default-external-api-0" Mar 20 08:45:19 crc kubenswrapper[4903]: I0320 08:45:19.343211 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e761f2dc-041e-4811-802b-2a8e8c376381-logs\") pod \"glance-default-external-api-0\" (UID: \"e761f2dc-041e-4811-802b-2a8e8c376381\") " pod="openstack/glance-default-external-api-0" Mar 20 08:45:19 crc kubenswrapper[4903]: I0320 08:45:19.348075 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e761f2dc-041e-4811-802b-2a8e8c376381-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"e761f2dc-041e-4811-802b-2a8e8c376381\") " pod="openstack/glance-default-external-api-0" Mar 20 08:45:19 crc kubenswrapper[4903]: I0320 08:45:19.350153 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e761f2dc-041e-4811-802b-2a8e8c376381-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"e761f2dc-041e-4811-802b-2a8e8c376381\") " pod="openstack/glance-default-external-api-0" Mar 20 08:45:19 crc kubenswrapper[4903]: I0320 08:45:19.350362 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e761f2dc-041e-4811-802b-2a8e8c376381-config-data\") pod \"glance-default-external-api-0\" (UID: \"e761f2dc-041e-4811-802b-2a8e8c376381\") " pod="openstack/glance-default-external-api-0" Mar 20 08:45:19 crc kubenswrapper[4903]: I0320 08:45:19.352774 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e761f2dc-041e-4811-802b-2a8e8c376381-scripts\") pod \"glance-default-external-api-0\" (UID: \"e761f2dc-041e-4811-802b-2a8e8c376381\") " pod="openstack/glance-default-external-api-0" Mar 20 08:45:19 crc kubenswrapper[4903]: I0320 08:45:19.369817 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4fktf\" (UniqueName: \"kubernetes.io/projected/e761f2dc-041e-4811-802b-2a8e8c376381-kube-api-access-4fktf\") pod \"glance-default-external-api-0\" (UID: \"e761f2dc-041e-4811-802b-2a8e8c376381\") " pod="openstack/glance-default-external-api-0" Mar 20 08:45:19 crc kubenswrapper[4903]: I0320 08:45:19.374553 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"e761f2dc-041e-4811-802b-2a8e8c376381\") " pod="openstack/glance-default-external-api-0" Mar 20 08:45:19 crc kubenswrapper[4903]: I0320 08:45:19.488891 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 20 08:45:19 crc kubenswrapper[4903]: I0320 08:45:19.506345 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bbf1b572-28ed-4a97-9b60-6ddc143bae73" path="/var/lib/kubelet/pods/bbf1b572-28ed-4a97-9b60-6ddc143bae73/volumes" Mar 20 08:45:20 crc kubenswrapper[4903]: I0320 08:45:20.128217 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-785d8bcb8c-nl9ms" Mar 20 08:45:20 crc kubenswrapper[4903]: I0320 08:45:20.194840 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-mk5xs"] Mar 20 08:45:20 crc kubenswrapper[4903]: I0320 08:45:20.196346 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-74f6bcbc87-mk5xs" podUID="e1a5fcc4-d8ea-4cb8-9169-3e428787ab1d" containerName="dnsmasq-dns" containerID="cri-o://e79c33bcfd08b17843d391b6564c49280f4384378deac752dca59ae3cdecb9a5" gracePeriod=10 Mar 20 08:45:20 crc kubenswrapper[4903]: I0320 08:45:20.833679 4903 patch_prober.go:28] interesting pod/machine-config-daemon-2ndsj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 08:45:20 crc kubenswrapper[4903]: I0320 08:45:20.834185 4903 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2ndsj" podUID="0e67af70-4211-4077-8f2b-0a00b8069e5a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 08:45:21 crc kubenswrapper[4903]: I0320 08:45:21.115904 4903 generic.go:334] "Generic (PLEG): container finished" podID="e1a5fcc4-d8ea-4cb8-9169-3e428787ab1d" containerID="e79c33bcfd08b17843d391b6564c49280f4384378deac752dca59ae3cdecb9a5" exitCode=0 Mar 20 08:45:21 crc kubenswrapper[4903]: I0320 08:45:21.116054 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-mk5xs" event={"ID":"e1a5fcc4-d8ea-4cb8-9169-3e428787ab1d","Type":"ContainerDied","Data":"e79c33bcfd08b17843d391b6564c49280f4384378deac752dca59ae3cdecb9a5"} Mar 20 08:45:21 crc kubenswrapper[4903]: I0320 08:45:21.694219 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-hzcp9" Mar 20 08:45:21 crc kubenswrapper[4903]: I0320 08:45:21.703108 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 20 08:45:21 crc kubenswrapper[4903]: I0320 08:45:21.832748 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/332e29b8-d365-4ff1-95df-53eaf60e156c-internal-tls-certs\") pod \"332e29b8-d365-4ff1-95df-53eaf60e156c\" (UID: \"332e29b8-d365-4ff1-95df-53eaf60e156c\") " Mar 20 08:45:21 crc kubenswrapper[4903]: I0320 08:45:21.832801 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0b5652ae-2905-486a-897b-27c2b70bab5d-credential-keys\") pod \"0b5652ae-2905-486a-897b-27c2b70bab5d\" (UID: \"0b5652ae-2905-486a-897b-27c2b70bab5d\") " Mar 20 08:45:21 crc kubenswrapper[4903]: I0320 08:45:21.832842 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"332e29b8-d365-4ff1-95df-53eaf60e156c\" (UID: \"332e29b8-d365-4ff1-95df-53eaf60e156c\") " Mar 20 08:45:21 crc kubenswrapper[4903]: I0320 08:45:21.832879 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b5652ae-2905-486a-897b-27c2b70bab5d-combined-ca-bundle\") pod \"0b5652ae-2905-486a-897b-27c2b70bab5d\" (UID: \"0b5652ae-2905-486a-897b-27c2b70bab5d\") " Mar 20 08:45:21 crc kubenswrapper[4903]: I0320 08:45:21.832921 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/332e29b8-d365-4ff1-95df-53eaf60e156c-scripts\") pod \"332e29b8-d365-4ff1-95df-53eaf60e156c\" (UID: \"332e29b8-d365-4ff1-95df-53eaf60e156c\") " Mar 20 08:45:21 crc kubenswrapper[4903]: I0320 08:45:21.832955 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/332e29b8-d365-4ff1-95df-53eaf60e156c-config-data\") pod \"332e29b8-d365-4ff1-95df-53eaf60e156c\" (UID: \"332e29b8-d365-4ff1-95df-53eaf60e156c\") " Mar 20 08:45:21 crc kubenswrapper[4903]: I0320 08:45:21.832987 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/332e29b8-d365-4ff1-95df-53eaf60e156c-logs\") pod \"332e29b8-d365-4ff1-95df-53eaf60e156c\" (UID: \"332e29b8-d365-4ff1-95df-53eaf60e156c\") " Mar 20 08:45:21 crc kubenswrapper[4903]: I0320 08:45:21.833054 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mfgll\" (UniqueName: \"kubernetes.io/projected/332e29b8-d365-4ff1-95df-53eaf60e156c-kube-api-access-mfgll\") pod \"332e29b8-d365-4ff1-95df-53eaf60e156c\" (UID: \"332e29b8-d365-4ff1-95df-53eaf60e156c\") " Mar 20 08:45:21 crc kubenswrapper[4903]: I0320 08:45:21.833924 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/332e29b8-d365-4ff1-95df-53eaf60e156c-logs" (OuterVolumeSpecName: "logs") pod "332e29b8-d365-4ff1-95df-53eaf60e156c" (UID: "332e29b8-d365-4ff1-95df-53eaf60e156c"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 08:45:21 crc kubenswrapper[4903]: I0320 08:45:21.834286 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b5652ae-2905-486a-897b-27c2b70bab5d-config-data\") pod \"0b5652ae-2905-486a-897b-27c2b70bab5d\" (UID: \"0b5652ae-2905-486a-897b-27c2b70bab5d\") " Mar 20 08:45:21 crc kubenswrapper[4903]: I0320 08:45:21.834324 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0b5652ae-2905-486a-897b-27c2b70bab5d-scripts\") pod \"0b5652ae-2905-486a-897b-27c2b70bab5d\" (UID: \"0b5652ae-2905-486a-897b-27c2b70bab5d\") " Mar 20 08:45:21 crc kubenswrapper[4903]: I0320 08:45:21.834378 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cx76d\" (UniqueName: \"kubernetes.io/projected/0b5652ae-2905-486a-897b-27c2b70bab5d-kube-api-access-cx76d\") pod \"0b5652ae-2905-486a-897b-27c2b70bab5d\" (UID: \"0b5652ae-2905-486a-897b-27c2b70bab5d\") " Mar 20 08:45:21 crc kubenswrapper[4903]: I0320 08:45:21.834407 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0b5652ae-2905-486a-897b-27c2b70bab5d-fernet-keys\") pod \"0b5652ae-2905-486a-897b-27c2b70bab5d\" (UID: \"0b5652ae-2905-486a-897b-27c2b70bab5d\") " Mar 20 08:45:21 crc kubenswrapper[4903]: I0320 08:45:21.834508 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/332e29b8-d365-4ff1-95df-53eaf60e156c-combined-ca-bundle\") pod \"332e29b8-d365-4ff1-95df-53eaf60e156c\" (UID: \"332e29b8-d365-4ff1-95df-53eaf60e156c\") " Mar 20 08:45:21 crc kubenswrapper[4903]: I0320 08:45:21.834540 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/332e29b8-d365-4ff1-95df-53eaf60e156c-httpd-run\") pod \"332e29b8-d365-4ff1-95df-53eaf60e156c\" (UID: \"332e29b8-d365-4ff1-95df-53eaf60e156c\") " Mar 20 08:45:21 crc kubenswrapper[4903]: I0320 08:45:21.835016 4903 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/332e29b8-d365-4ff1-95df-53eaf60e156c-logs\") on node \"crc\" DevicePath \"\"" Mar 20 08:45:21 crc kubenswrapper[4903]: I0320 08:45:21.835310 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/332e29b8-d365-4ff1-95df-53eaf60e156c-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "332e29b8-d365-4ff1-95df-53eaf60e156c" (UID: "332e29b8-d365-4ff1-95df-53eaf60e156c"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 08:45:21 crc kubenswrapper[4903]: I0320 08:45:21.840544 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b5652ae-2905-486a-897b-27c2b70bab5d-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "0b5652ae-2905-486a-897b-27c2b70bab5d" (UID: "0b5652ae-2905-486a-897b-27c2b70bab5d"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:45:21 crc kubenswrapper[4903]: I0320 08:45:21.840610 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/332e29b8-d365-4ff1-95df-53eaf60e156c-kube-api-access-mfgll" (OuterVolumeSpecName: "kube-api-access-mfgll") pod "332e29b8-d365-4ff1-95df-53eaf60e156c" (UID: "332e29b8-d365-4ff1-95df-53eaf60e156c"). InnerVolumeSpecName "kube-api-access-mfgll". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:45:21 crc kubenswrapper[4903]: I0320 08:45:21.840571 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/332e29b8-d365-4ff1-95df-53eaf60e156c-scripts" (OuterVolumeSpecName: "scripts") pod "332e29b8-d365-4ff1-95df-53eaf60e156c" (UID: "332e29b8-d365-4ff1-95df-53eaf60e156c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:45:21 crc kubenswrapper[4903]: I0320 08:45:21.841629 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "glance") pod "332e29b8-d365-4ff1-95df-53eaf60e156c" (UID: "332e29b8-d365-4ff1-95df-53eaf60e156c"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 20 08:45:21 crc kubenswrapper[4903]: I0320 08:45:21.846091 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b5652ae-2905-486a-897b-27c2b70bab5d-scripts" (OuterVolumeSpecName: "scripts") pod "0b5652ae-2905-486a-897b-27c2b70bab5d" (UID: "0b5652ae-2905-486a-897b-27c2b70bab5d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:45:21 crc kubenswrapper[4903]: I0320 08:45:21.848742 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b5652ae-2905-486a-897b-27c2b70bab5d-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "0b5652ae-2905-486a-897b-27c2b70bab5d" (UID: "0b5652ae-2905-486a-897b-27c2b70bab5d"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:45:21 crc kubenswrapper[4903]: I0320 08:45:21.858271 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b5652ae-2905-486a-897b-27c2b70bab5d-kube-api-access-cx76d" (OuterVolumeSpecName: "kube-api-access-cx76d") pod "0b5652ae-2905-486a-897b-27c2b70bab5d" (UID: "0b5652ae-2905-486a-897b-27c2b70bab5d"). InnerVolumeSpecName "kube-api-access-cx76d". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:45:21 crc kubenswrapper[4903]: I0320 08:45:21.870431 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b5652ae-2905-486a-897b-27c2b70bab5d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0b5652ae-2905-486a-897b-27c2b70bab5d" (UID: "0b5652ae-2905-486a-897b-27c2b70bab5d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:45:21 crc kubenswrapper[4903]: I0320 08:45:21.877534 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/332e29b8-d365-4ff1-95df-53eaf60e156c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "332e29b8-d365-4ff1-95df-53eaf60e156c" (UID: "332e29b8-d365-4ff1-95df-53eaf60e156c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:45:21 crc kubenswrapper[4903]: I0320 08:45:21.892585 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/332e29b8-d365-4ff1-95df-53eaf60e156c-config-data" (OuterVolumeSpecName: "config-data") pod "332e29b8-d365-4ff1-95df-53eaf60e156c" (UID: "332e29b8-d365-4ff1-95df-53eaf60e156c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:45:21 crc kubenswrapper[4903]: I0320 08:45:21.898023 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b5652ae-2905-486a-897b-27c2b70bab5d-config-data" (OuterVolumeSpecName: "config-data") pod "0b5652ae-2905-486a-897b-27c2b70bab5d" (UID: "0b5652ae-2905-486a-897b-27c2b70bab5d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:45:21 crc kubenswrapper[4903]: I0320 08:45:21.898691 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/332e29b8-d365-4ff1-95df-53eaf60e156c-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "332e29b8-d365-4ff1-95df-53eaf60e156c" (UID: "332e29b8-d365-4ff1-95df-53eaf60e156c"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:45:21 crc kubenswrapper[4903]: I0320 08:45:21.936764 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mfgll\" (UniqueName: \"kubernetes.io/projected/332e29b8-d365-4ff1-95df-53eaf60e156c-kube-api-access-mfgll\") on node \"crc\" DevicePath \"\"" Mar 20 08:45:21 crc kubenswrapper[4903]: I0320 08:45:21.936811 4903 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b5652ae-2905-486a-897b-27c2b70bab5d-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 08:45:21 crc kubenswrapper[4903]: I0320 08:45:21.936822 4903 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0b5652ae-2905-486a-897b-27c2b70bab5d-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 08:45:21 crc kubenswrapper[4903]: I0320 08:45:21.936831 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cx76d\" (UniqueName: \"kubernetes.io/projected/0b5652ae-2905-486a-897b-27c2b70bab5d-kube-api-access-cx76d\") on node \"crc\" DevicePath \"\"" Mar 20 08:45:21 crc kubenswrapper[4903]: I0320 08:45:21.936840 4903 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0b5652ae-2905-486a-897b-27c2b70bab5d-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 20 08:45:21 crc kubenswrapper[4903]: I0320 08:45:21.936848 4903 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/332e29b8-d365-4ff1-95df-53eaf60e156c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 08:45:21 crc kubenswrapper[4903]: I0320 08:45:21.936856 4903 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/332e29b8-d365-4ff1-95df-53eaf60e156c-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 20 08:45:21 crc kubenswrapper[4903]: I0320 08:45:21.936865 4903 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/332e29b8-d365-4ff1-95df-53eaf60e156c-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 08:45:21 crc kubenswrapper[4903]: I0320 08:45:21.936874 
4903 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0b5652ae-2905-486a-897b-27c2b70bab5d-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 20 08:45:21 crc kubenswrapper[4903]: I0320 08:45:21.936914 4903 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Mar 20 08:45:21 crc kubenswrapper[4903]: I0320 08:45:21.936922 4903 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b5652ae-2905-486a-897b-27c2b70bab5d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 08:45:21 crc kubenswrapper[4903]: I0320 08:45:21.936931 4903 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/332e29b8-d365-4ff1-95df-53eaf60e156c-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 08:45:21 crc kubenswrapper[4903]: I0320 08:45:21.936939 4903 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/332e29b8-d365-4ff1-95df-53eaf60e156c-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 08:45:21 crc kubenswrapper[4903]: I0320 08:45:21.954518 4903 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Mar 20 08:45:22 crc kubenswrapper[4903]: I0320 08:45:22.038887 4903 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Mar 20 08:45:22 crc kubenswrapper[4903]: I0320 08:45:22.127424 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-hzcp9" event={"ID":"0b5652ae-2905-486a-897b-27c2b70bab5d","Type":"ContainerDied","Data":"45d2c3adc326d7d8f94889db4b80476031f46f92611cb43a0f7124e4b312f5ff"} Mar 20 08:45:22 crc kubenswrapper[4903]: I0320 08:45:22.127751 4903 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="45d2c3adc326d7d8f94889db4b80476031f46f92611cb43a0f7124e4b312f5ff" Mar 20 08:45:22 crc kubenswrapper[4903]: I0320 08:45:22.127817 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-hzcp9" Mar 20 08:45:22 crc kubenswrapper[4903]: I0320 08:45:22.130413 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"332e29b8-d365-4ff1-95df-53eaf60e156c","Type":"ContainerDied","Data":"daa42e1853c6ef754996dc5659490c63f96f564441924aa0dc458dd5ec91a125"} Mar 20 08:45:22 crc kubenswrapper[4903]: I0320 08:45:22.130500 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 20 08:45:22 crc kubenswrapper[4903]: I0320 08:45:22.167403 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 08:45:22 crc kubenswrapper[4903]: I0320 08:45:22.178213 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 08:45:22 crc kubenswrapper[4903]: I0320 08:45:22.206631 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 08:45:22 crc kubenswrapper[4903]: E0320 08:45:22.207091 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="332e29b8-d365-4ff1-95df-53eaf60e156c" containerName="glance-httpd" Mar 20 08:45:22 crc kubenswrapper[4903]: I0320 08:45:22.207105 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="332e29b8-d365-4ff1-95df-53eaf60e156c" containerName="glance-httpd" Mar 20 08:45:22 crc kubenswrapper[4903]: E0320 08:45:22.207123 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="332e29b8-d365-4ff1-95df-53eaf60e156c" containerName="glance-log" Mar 20 08:45:22 crc kubenswrapper[4903]: I0320 08:45:22.207130 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="332e29b8-d365-4ff1-95df-53eaf60e156c" containerName="glance-log" Mar 20 08:45:22 crc kubenswrapper[4903]: E0320 08:45:22.207155 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b5652ae-2905-486a-897b-27c2b70bab5d" containerName="keystone-bootstrap" Mar 20 08:45:22 crc kubenswrapper[4903]: I0320 08:45:22.207161 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b5652ae-2905-486a-897b-27c2b70bab5d" containerName="keystone-bootstrap" Mar 20 08:45:22 crc kubenswrapper[4903]: I0320 08:45:22.207335 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b5652ae-2905-486a-897b-27c2b70bab5d" containerName="keystone-bootstrap" Mar 20 08:45:22 crc kubenswrapper[4903]: I0320 08:45:22.207349 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="332e29b8-d365-4ff1-95df-53eaf60e156c" containerName="glance-httpd" Mar 20 08:45:22 crc kubenswrapper[4903]: I0320 08:45:22.207358 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="332e29b8-d365-4ff1-95df-53eaf60e156c" containerName="glance-log" Mar 20 08:45:22 crc kubenswrapper[4903]: I0320 08:45:22.208280 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 20 08:45:22 crc kubenswrapper[4903]: I0320 08:45:22.211137 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 20 08:45:22 crc kubenswrapper[4903]: I0320 08:45:22.211791 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 20 08:45:22 crc kubenswrapper[4903]: I0320 08:45:22.246274 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 08:45:22 crc kubenswrapper[4903]: I0320 08:45:22.342929 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"83dc9a0f-80b3-4df8-9b1b-f233484cb285\") " pod="openstack/glance-default-internal-api-0" Mar 20 08:45:22 crc kubenswrapper[4903]: I0320 08:45:22.343007 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/83dc9a0f-80b3-4df8-9b1b-f233484cb285-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"83dc9a0f-80b3-4df8-9b1b-f233484cb285\") " pod="openstack/glance-default-internal-api-0" Mar 20 08:45:22 crc kubenswrapper[4903]: I0320 08:45:22.343053 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/83dc9a0f-80b3-4df8-9b1b-f233484cb285-config-data\") pod \"glance-default-internal-api-0\" (UID: \"83dc9a0f-80b3-4df8-9b1b-f233484cb285\") " pod="openstack/glance-default-internal-api-0" Mar 20 08:45:22 crc kubenswrapper[4903]: I0320 08:45:22.343220 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83dc9a0f-80b3-4df8-9b1b-f233484cb285-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"83dc9a0f-80b3-4df8-9b1b-f233484cb285\") " pod="openstack/glance-default-internal-api-0" Mar 20 08:45:22 crc kubenswrapper[4903]: I0320 08:45:22.343288 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/83dc9a0f-80b3-4df8-9b1b-f233484cb285-scripts\") pod \"glance-default-internal-api-0\" (UID: \"83dc9a0f-80b3-4df8-9b1b-f233484cb285\") " pod="openstack/glance-default-internal-api-0" Mar 20 08:45:22 crc kubenswrapper[4903]: I0320 08:45:22.343343 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/83dc9a0f-80b3-4df8-9b1b-f233484cb285-logs\") pod \"glance-default-internal-api-0\" (UID: \"83dc9a0f-80b3-4df8-9b1b-f233484cb285\") " pod="openstack/glance-default-internal-api-0" Mar 20 08:45:22 crc kubenswrapper[4903]: I0320 08:45:22.343460 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/83dc9a0f-80b3-4df8-9b1b-f233484cb285-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"83dc9a0f-80b3-4df8-9b1b-f233484cb285\") " pod="openstack/glance-default-internal-api-0" Mar 20 08:45:22 crc kubenswrapper[4903]: I0320 08:45:22.343581 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-5bpm8\" (UniqueName: \"kubernetes.io/projected/83dc9a0f-80b3-4df8-9b1b-f233484cb285-kube-api-access-5bpm8\") pod \"glance-default-internal-api-0\" (UID: \"83dc9a0f-80b3-4df8-9b1b-f233484cb285\") " pod="openstack/glance-default-internal-api-0" Mar 20 08:45:22 crc kubenswrapper[4903]: I0320 08:45:22.446082 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5bpm8\" (UniqueName: \"kubernetes.io/projected/83dc9a0f-80b3-4df8-9b1b-f233484cb285-kube-api-access-5bpm8\") pod \"glance-default-internal-api-0\" (UID: \"83dc9a0f-80b3-4df8-9b1b-f233484cb285\") " pod="openstack/glance-default-internal-api-0" Mar 20 08:45:22 crc kubenswrapper[4903]: I0320 08:45:22.446210 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"83dc9a0f-80b3-4df8-9b1b-f233484cb285\") " pod="openstack/glance-default-internal-api-0" Mar 20 08:45:22 crc kubenswrapper[4903]: I0320 08:45:22.446258 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/83dc9a0f-80b3-4df8-9b1b-f233484cb285-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"83dc9a0f-80b3-4df8-9b1b-f233484cb285\") " pod="openstack/glance-default-internal-api-0" Mar 20 08:45:22 crc kubenswrapper[4903]: I0320 08:45:22.446286 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/83dc9a0f-80b3-4df8-9b1b-f233484cb285-config-data\") pod \"glance-default-internal-api-0\" (UID: \"83dc9a0f-80b3-4df8-9b1b-f233484cb285\") " pod="openstack/glance-default-internal-api-0" Mar 20 08:45:22 crc kubenswrapper[4903]: I0320 08:45:22.446335 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83dc9a0f-80b3-4df8-9b1b-f233484cb285-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"83dc9a0f-80b3-4df8-9b1b-f233484cb285\") " pod="openstack/glance-default-internal-api-0" Mar 20 08:45:22 crc kubenswrapper[4903]: I0320 08:45:22.446366 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/83dc9a0f-80b3-4df8-9b1b-f233484cb285-scripts\") pod \"glance-default-internal-api-0\" (UID: \"83dc9a0f-80b3-4df8-9b1b-f233484cb285\") " pod="openstack/glance-default-internal-api-0" Mar 20 08:45:22 crc kubenswrapper[4903]: I0320 08:45:22.446393 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/83dc9a0f-80b3-4df8-9b1b-f233484cb285-logs\") pod \"glance-default-internal-api-0\" (UID: \"83dc9a0f-80b3-4df8-9b1b-f233484cb285\") " pod="openstack/glance-default-internal-api-0" Mar 20 08:45:22 crc kubenswrapper[4903]: I0320 08:45:22.446436 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/83dc9a0f-80b3-4df8-9b1b-f233484cb285-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"83dc9a0f-80b3-4df8-9b1b-f233484cb285\") " pod="openstack/glance-default-internal-api-0" Mar 20 08:45:22 crc kubenswrapper[4903]: I0320 08:45:22.446869 4903 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"83dc9a0f-80b3-4df8-9b1b-f233484cb285\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/glance-default-internal-api-0" Mar 20 08:45:22 crc kubenswrapper[4903]: I0320 08:45:22.447802 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/83dc9a0f-80b3-4df8-9b1b-f233484cb285-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"83dc9a0f-80b3-4df8-9b1b-f233484cb285\") " pod="openstack/glance-default-internal-api-0" Mar 20 08:45:22 crc kubenswrapper[4903]: I0320 08:45:22.449076 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/83dc9a0f-80b3-4df8-9b1b-f233484cb285-logs\") pod \"glance-default-internal-api-0\" (UID: \"83dc9a0f-80b3-4df8-9b1b-f233484cb285\") " pod="openstack/glance-default-internal-api-0" Mar 20 08:45:22 crc kubenswrapper[4903]: I0320 08:45:22.452984 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/83dc9a0f-80b3-4df8-9b1b-f233484cb285-scripts\") pod \"glance-default-internal-api-0\" (UID: \"83dc9a0f-80b3-4df8-9b1b-f233484cb285\") " pod="openstack/glance-default-internal-api-0" Mar 20 08:45:22 crc kubenswrapper[4903]: I0320 08:45:22.455790 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/83dc9a0f-80b3-4df8-9b1b-f233484cb285-config-data\") pod \"glance-default-internal-api-0\" (UID: \"83dc9a0f-80b3-4df8-9b1b-f233484cb285\") " pod="openstack/glance-default-internal-api-0" Mar 20 08:45:22 crc kubenswrapper[4903]: I0320 08:45:22.455916 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83dc9a0f-80b3-4df8-9b1b-f233484cb285-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"83dc9a0f-80b3-4df8-9b1b-f233484cb285\") " pod="openstack/glance-default-internal-api-0" Mar 20 08:45:22 crc kubenswrapper[4903]: I0320 08:45:22.456334 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/83dc9a0f-80b3-4df8-9b1b-f233484cb285-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"83dc9a0f-80b3-4df8-9b1b-f233484cb285\") " pod="openstack/glance-default-internal-api-0" Mar 20 08:45:22 crc kubenswrapper[4903]: I0320 08:45:22.469160 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5bpm8\" (UniqueName: \"kubernetes.io/projected/83dc9a0f-80b3-4df8-9b1b-f233484cb285-kube-api-access-5bpm8\") pod \"glance-default-internal-api-0\" (UID: \"83dc9a0f-80b3-4df8-9b1b-f233484cb285\") " pod="openstack/glance-default-internal-api-0" Mar 20 08:45:22 crc kubenswrapper[4903]: I0320 08:45:22.483140 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"83dc9a0f-80b3-4df8-9b1b-f233484cb285\") " pod="openstack/glance-default-internal-api-0" Mar 20 08:45:22 crc kubenswrapper[4903]: I0320 08:45:22.544285 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 20 08:45:22 crc kubenswrapper[4903]: I0320 08:45:22.803996 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-hzcp9"] Mar 20 08:45:22 crc kubenswrapper[4903]: I0320 08:45:22.811888 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-hzcp9"] Mar 20 08:45:22 crc kubenswrapper[4903]: I0320 08:45:22.886943 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-n6bt8"] Mar 20 08:45:22 crc kubenswrapper[4903]: I0320 08:45:22.888618 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-n6bt8" Mar 20 08:45:22 crc kubenswrapper[4903]: I0320 08:45:22.893108 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-rzqbd" Mar 20 08:45:22 crc kubenswrapper[4903]: I0320 08:45:22.893369 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 20 08:45:22 crc kubenswrapper[4903]: I0320 08:45:22.893499 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 20 08:45:22 crc kubenswrapper[4903]: I0320 08:45:22.894493 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 20 08:45:22 crc kubenswrapper[4903]: I0320 08:45:22.895096 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 20 08:45:22 crc kubenswrapper[4903]: I0320 08:45:22.901605 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-n6bt8"] Mar 20 08:45:22 crc kubenswrapper[4903]: I0320 08:45:22.961903 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/032ee1ce-8cf1-4acd-a741-dc32f104065f-config-data\") pod \"keystone-bootstrap-n6bt8\" (UID: \"032ee1ce-8cf1-4acd-a741-dc32f104065f\") " pod="openstack/keystone-bootstrap-n6bt8" Mar 20 08:45:22 crc kubenswrapper[4903]: I0320 08:45:22.962044 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vzcbt\" (UniqueName: \"kubernetes.io/projected/032ee1ce-8cf1-4acd-a741-dc32f104065f-kube-api-access-vzcbt\") pod \"keystone-bootstrap-n6bt8\" (UID: \"032ee1ce-8cf1-4acd-a741-dc32f104065f\") " pod="openstack/keystone-bootstrap-n6bt8" Mar 20 08:45:22 crc kubenswrapper[4903]: I0320 08:45:22.962089 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/032ee1ce-8cf1-4acd-a741-dc32f104065f-fernet-keys\") pod \"keystone-bootstrap-n6bt8\" (UID: \"032ee1ce-8cf1-4acd-a741-dc32f104065f\") " pod="openstack/keystone-bootstrap-n6bt8" Mar 20 08:45:22 crc kubenswrapper[4903]: I0320 08:45:22.962147 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/032ee1ce-8cf1-4acd-a741-dc32f104065f-combined-ca-bundle\") pod \"keystone-bootstrap-n6bt8\" (UID: \"032ee1ce-8cf1-4acd-a741-dc32f104065f\") " pod="openstack/keystone-bootstrap-n6bt8" Mar 20 08:45:22 crc kubenswrapper[4903]: I0320 08:45:22.962308 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/032ee1ce-8cf1-4acd-a741-dc32f104065f-scripts\") pod \"keystone-bootstrap-n6bt8\" (UID: \"032ee1ce-8cf1-4acd-a741-dc32f104065f\") " pod="openstack/keystone-bootstrap-n6bt8" Mar 20 08:45:22 crc kubenswrapper[4903]: I0320 08:45:22.962371 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/032ee1ce-8cf1-4acd-a741-dc32f104065f-credential-keys\") pod \"keystone-bootstrap-n6bt8\" (UID: \"032ee1ce-8cf1-4acd-a741-dc32f104065f\") " pod="openstack/keystone-bootstrap-n6bt8" Mar 20 08:45:23 crc kubenswrapper[4903]: I0320 08:45:23.065377 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/032ee1ce-8cf1-4acd-a741-dc32f104065f-scripts\") pod \"keystone-bootstrap-n6bt8\" (UID: \"032ee1ce-8cf1-4acd-a741-dc32f104065f\") " pod="openstack/keystone-bootstrap-n6bt8" Mar 20 08:45:23 crc kubenswrapper[4903]: I0320 08:45:23.065489 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/032ee1ce-8cf1-4acd-a741-dc32f104065f-credential-keys\") pod \"keystone-bootstrap-n6bt8\" (UID: \"032ee1ce-8cf1-4acd-a741-dc32f104065f\") " pod="openstack/keystone-bootstrap-n6bt8" Mar 20 08:45:23 crc kubenswrapper[4903]: I0320 08:45:23.065624 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/032ee1ce-8cf1-4acd-a741-dc32f104065f-config-data\") pod \"keystone-bootstrap-n6bt8\" (UID: \"032ee1ce-8cf1-4acd-a741-dc32f104065f\") " pod="openstack/keystone-bootstrap-n6bt8" Mar 20 08:45:23 crc kubenswrapper[4903]: I0320 08:45:23.065656 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vzcbt\" (UniqueName: \"kubernetes.io/projected/032ee1ce-8cf1-4acd-a741-dc32f104065f-kube-api-access-vzcbt\") pod \"keystone-bootstrap-n6bt8\" (UID: \"032ee1ce-8cf1-4acd-a741-dc32f104065f\") " pod="openstack/keystone-bootstrap-n6bt8" Mar 20 08:45:23 crc kubenswrapper[4903]: I0320 08:45:23.065696 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/032ee1ce-8cf1-4acd-a741-dc32f104065f-fernet-keys\") pod \"keystone-bootstrap-n6bt8\" (UID: \"032ee1ce-8cf1-4acd-a741-dc32f104065f\") " pod="openstack/keystone-bootstrap-n6bt8" Mar 20 08:45:23 crc kubenswrapper[4903]: I0320 08:45:23.065731 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/032ee1ce-8cf1-4acd-a741-dc32f104065f-combined-ca-bundle\") pod \"keystone-bootstrap-n6bt8\" (UID: \"032ee1ce-8cf1-4acd-a741-dc32f104065f\") " pod="openstack/keystone-bootstrap-n6bt8" Mar 20 08:45:23 crc kubenswrapper[4903]: I0320 08:45:23.073568 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/032ee1ce-8cf1-4acd-a741-dc32f104065f-credential-keys\") pod \"keystone-bootstrap-n6bt8\" (UID: \"032ee1ce-8cf1-4acd-a741-dc32f104065f\") " pod="openstack/keystone-bootstrap-n6bt8" Mar 20 08:45:23 crc kubenswrapper[4903]: I0320 08:45:23.073899 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/032ee1ce-8cf1-4acd-a741-dc32f104065f-combined-ca-bundle\") pod \"keystone-bootstrap-n6bt8\" (UID: 
\"032ee1ce-8cf1-4acd-a741-dc32f104065f\") " pod="openstack/keystone-bootstrap-n6bt8" Mar 20 08:45:23 crc kubenswrapper[4903]: I0320 08:45:23.074809 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/032ee1ce-8cf1-4acd-a741-dc32f104065f-scripts\") pod \"keystone-bootstrap-n6bt8\" (UID: \"032ee1ce-8cf1-4acd-a741-dc32f104065f\") " pod="openstack/keystone-bootstrap-n6bt8" Mar 20 08:45:23 crc kubenswrapper[4903]: I0320 08:45:23.075557 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/032ee1ce-8cf1-4acd-a741-dc32f104065f-fernet-keys\") pod \"keystone-bootstrap-n6bt8\" (UID: \"032ee1ce-8cf1-4acd-a741-dc32f104065f\") " pod="openstack/keystone-bootstrap-n6bt8" Mar 20 08:45:23 crc kubenswrapper[4903]: I0320 08:45:23.081483 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/032ee1ce-8cf1-4acd-a741-dc32f104065f-config-data\") pod \"keystone-bootstrap-n6bt8\" (UID: \"032ee1ce-8cf1-4acd-a741-dc32f104065f\") " pod="openstack/keystone-bootstrap-n6bt8" Mar 20 08:45:23 crc kubenswrapper[4903]: I0320 08:45:23.086567 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vzcbt\" (UniqueName: \"kubernetes.io/projected/032ee1ce-8cf1-4acd-a741-dc32f104065f-kube-api-access-vzcbt\") pod \"keystone-bootstrap-n6bt8\" (UID: \"032ee1ce-8cf1-4acd-a741-dc32f104065f\") " pod="openstack/keystone-bootstrap-n6bt8" Mar 20 08:45:23 crc kubenswrapper[4903]: I0320 08:45:23.210649 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-n6bt8" Mar 20 08:45:23 crc kubenswrapper[4903]: I0320 08:45:23.507438 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b5652ae-2905-486a-897b-27c2b70bab5d" path="/var/lib/kubelet/pods/0b5652ae-2905-486a-897b-27c2b70bab5d/volumes" Mar 20 08:45:23 crc kubenswrapper[4903]: I0320 08:45:23.508341 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="332e29b8-d365-4ff1-95df-53eaf60e156c" path="/var/lib/kubelet/pods/332e29b8-d365-4ff1-95df-53eaf60e156c/volumes" Mar 20 08:45:28 crc kubenswrapper[4903]: I0320 08:45:28.135377 4903 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-74f6bcbc87-mk5xs" podUID="e1a5fcc4-d8ea-4cb8-9169-3e428787ab1d" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.131:5353: i/o timeout" Mar 20 08:45:30 crc kubenswrapper[4903]: I0320 08:45:30.243722 4903 generic.go:334] "Generic (PLEG): container finished" podID="e64f54bd-2813-41dd-86e6-9836da200d1c" containerID="4f489dcee69a113289e6c636cdf34b9bb86247d0274a95e46fc1731cbaeacb45" exitCode=0 Mar 20 08:45:30 crc kubenswrapper[4903]: I0320 08:45:30.243815 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-d8wd9" event={"ID":"e64f54bd-2813-41dd-86e6-9836da200d1c","Type":"ContainerDied","Data":"4f489dcee69a113289e6c636cdf34b9bb86247d0274a95e46fc1731cbaeacb45"} Mar 20 08:45:31 crc kubenswrapper[4903]: E0320 08:45:31.111405 4903 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified" Mar 20 08:45:31 crc kubenswrapper[4903]: E0320 08:45:31.111686 4903 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:ceilometer-central-agent,Image:quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n5cfh75h7fh9bh5b5h668h576h5b4h68h68bh666h9bh4h588h567h94h66h5cdh655hc7h76h79h5bbh54fh5f5h77h5ch5c4h54dh595h589h65cq,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:ceilometer-central-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-djzj6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/python3 /var/lib/openstack/bin/centralhealth.py],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(003c0ace-6aef-4bc2-bc02-358cf140d4ce): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 08:45:31 crc kubenswrapper[4903]: I0320 08:45:31.117251 4903 scope.go:117] "RemoveContainer" containerID="b87946e7b95c0cbb432479ee2394896543f018b5b04a522866d75dcac1db9682" Mar 20 08:45:31 crc kubenswrapper[4903]: I0320 08:45:31.196350 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74f6bcbc87-mk5xs" Mar 20 08:45:31 crc kubenswrapper[4903]: I0320 08:45:31.282690 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-mk5xs" event={"ID":"e1a5fcc4-d8ea-4cb8-9169-3e428787ab1d","Type":"ContainerDied","Data":"0e25dcbb5560c8452824d28bcdb1fadb982de060c5ab358d882ebe5bb0ebfe4b"} Mar 20 08:45:31 crc kubenswrapper[4903]: I0320 08:45:31.282867 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-74f6bcbc87-mk5xs" Mar 20 08:45:31 crc kubenswrapper[4903]: I0320 08:45:31.333470 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e1a5fcc4-d8ea-4cb8-9169-3e428787ab1d-dns-svc\") pod \"e1a5fcc4-d8ea-4cb8-9169-3e428787ab1d\" (UID: \"e1a5fcc4-d8ea-4cb8-9169-3e428787ab1d\") " Mar 20 08:45:31 crc kubenswrapper[4903]: I0320 08:45:31.333594 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e1a5fcc4-d8ea-4cb8-9169-3e428787ab1d-dns-swift-storage-0\") pod \"e1a5fcc4-d8ea-4cb8-9169-3e428787ab1d\" (UID: \"e1a5fcc4-d8ea-4cb8-9169-3e428787ab1d\") " Mar 20 08:45:31 crc kubenswrapper[4903]: I0320 08:45:31.333665 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e1a5fcc4-d8ea-4cb8-9169-3e428787ab1d-ovsdbserver-sb\") pod \"e1a5fcc4-d8ea-4cb8-9169-3e428787ab1d\" (UID: \"e1a5fcc4-d8ea-4cb8-9169-3e428787ab1d\") " Mar 20 08:45:31 crc kubenswrapper[4903]: I0320 08:45:31.333695 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7x6xd\" (UniqueName: \"kubernetes.io/projected/e1a5fcc4-d8ea-4cb8-9169-3e428787ab1d-kube-api-access-7x6xd\") pod \"e1a5fcc4-d8ea-4cb8-9169-3e428787ab1d\" (UID: \"e1a5fcc4-d8ea-4cb8-9169-3e428787ab1d\") " Mar 20 08:45:31 crc kubenswrapper[4903]: I0320 08:45:31.333721 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e1a5fcc4-d8ea-4cb8-9169-3e428787ab1d-config\") pod \"e1a5fcc4-d8ea-4cb8-9169-3e428787ab1d\" (UID: \"e1a5fcc4-d8ea-4cb8-9169-3e428787ab1d\") " Mar 20 08:45:31 crc kubenswrapper[4903]: I0320 08:45:31.333751 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e1a5fcc4-d8ea-4cb8-9169-3e428787ab1d-ovsdbserver-nb\") pod \"e1a5fcc4-d8ea-4cb8-9169-3e428787ab1d\" (UID: \"e1a5fcc4-d8ea-4cb8-9169-3e428787ab1d\") " Mar 20 08:45:31 crc kubenswrapper[4903]: I0320 08:45:31.345683 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1a5fcc4-d8ea-4cb8-9169-3e428787ab1d-kube-api-access-7x6xd" (OuterVolumeSpecName: "kube-api-access-7x6xd") pod "e1a5fcc4-d8ea-4cb8-9169-3e428787ab1d" (UID: "e1a5fcc4-d8ea-4cb8-9169-3e428787ab1d"). InnerVolumeSpecName "kube-api-access-7x6xd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:45:31 crc kubenswrapper[4903]: I0320 08:45:31.399194 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e1a5fcc4-d8ea-4cb8-9169-3e428787ab1d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e1a5fcc4-d8ea-4cb8-9169-3e428787ab1d" (UID: "e1a5fcc4-d8ea-4cb8-9169-3e428787ab1d"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:45:31 crc kubenswrapper[4903]: I0320 08:45:31.402994 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e1a5fcc4-d8ea-4cb8-9169-3e428787ab1d-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "e1a5fcc4-d8ea-4cb8-9169-3e428787ab1d" (UID: "e1a5fcc4-d8ea-4cb8-9169-3e428787ab1d"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:45:31 crc kubenswrapper[4903]: I0320 08:45:31.406889 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e1a5fcc4-d8ea-4cb8-9169-3e428787ab1d-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "e1a5fcc4-d8ea-4cb8-9169-3e428787ab1d" (UID: "e1a5fcc4-d8ea-4cb8-9169-3e428787ab1d"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:45:31 crc kubenswrapper[4903]: I0320 08:45:31.413421 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e1a5fcc4-d8ea-4cb8-9169-3e428787ab1d-config" (OuterVolumeSpecName: "config") pod "e1a5fcc4-d8ea-4cb8-9169-3e428787ab1d" (UID: "e1a5fcc4-d8ea-4cb8-9169-3e428787ab1d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:45:31 crc kubenswrapper[4903]: I0320 08:45:31.422714 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e1a5fcc4-d8ea-4cb8-9169-3e428787ab1d-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "e1a5fcc4-d8ea-4cb8-9169-3e428787ab1d" (UID: "e1a5fcc4-d8ea-4cb8-9169-3e428787ab1d"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:45:31 crc kubenswrapper[4903]: I0320 08:45:31.436153 4903 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e1a5fcc4-d8ea-4cb8-9169-3e428787ab1d-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 20 08:45:31 crc kubenswrapper[4903]: I0320 08:45:31.436179 4903 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e1a5fcc4-d8ea-4cb8-9169-3e428787ab1d-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 20 08:45:31 crc kubenswrapper[4903]: I0320 08:45:31.436192 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7x6xd\" (UniqueName: \"kubernetes.io/projected/e1a5fcc4-d8ea-4cb8-9169-3e428787ab1d-kube-api-access-7x6xd\") on node \"crc\" DevicePath \"\"" Mar 20 08:45:31 crc kubenswrapper[4903]: I0320 08:45:31.436201 4903 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e1a5fcc4-d8ea-4cb8-9169-3e428787ab1d-config\") on node \"crc\" DevicePath \"\"" Mar 20 08:45:31 crc kubenswrapper[4903]: I0320 08:45:31.436209 4903 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e1a5fcc4-d8ea-4cb8-9169-3e428787ab1d-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 20 08:45:31 crc kubenswrapper[4903]: I0320 08:45:31.436218 4903 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e1a5fcc4-d8ea-4cb8-9169-3e428787ab1d-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 08:45:31 crc kubenswrapper[4903]: I0320 08:45:31.611925 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-mk5xs"] Mar 20 08:45:31 crc kubenswrapper[4903]: I0320 08:45:31.624068 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-mk5xs"] Mar 20 08:45:32 crc kubenswrapper[4903]: I0320 08:45:32.387986 4903 scope.go:117] "RemoveContainer" containerID="85fb607cf7600a38f9e42b0306e55aeba29589e2f87349dc457bfdd31da6c05c" Mar 20 08:45:32 crc kubenswrapper[4903]: E0320 08:45:32.433156 4903 
log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified" Mar 20 08:45:32 crc kubenswrapper[4903]: E0320 08:45:32.434061 4903 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-kcqt7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-f4rg2_openstack(6e210f8c-e29d-442c-a5eb-ec6b639b0275): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 08:45:32 crc kubenswrapper[4903]: E0320 08:45:32.435199 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-f4rg2" podUID="6e210f8c-e29d-442c-a5eb-ec6b639b0275" Mar 20 08:45:32 crc kubenswrapper[4903]: I0320 08:45:32.590597 4903 scope.go:117] "RemoveContainer" containerID="6d611cc60838132e03f73df67cee352686be2318ac7608a108094029b5d324fa" Mar 20 08:45:32 crc kubenswrapper[4903]: I0320 08:45:32.706247 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-d8wd9" Mar 20 08:45:32 crc kubenswrapper[4903]: I0320 08:45:32.723465 4903 scope.go:117] "RemoveContainer" containerID="e79c33bcfd08b17843d391b6564c49280f4384378deac752dca59ae3cdecb9a5" Mar 20 08:45:32 crc kubenswrapper[4903]: I0320 08:45:32.754202 4903 scope.go:117] "RemoveContainer" containerID="5a912e439af80db8d0d67d630a1b39b6180efc8e1a2e0ef701900df05688fdaf" Mar 20 08:45:32 crc kubenswrapper[4903]: I0320 08:45:32.771271 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e64f54bd-2813-41dd-86e6-9836da200d1c-combined-ca-bundle\") pod \"e64f54bd-2813-41dd-86e6-9836da200d1c\" (UID: \"e64f54bd-2813-41dd-86e6-9836da200d1c\") " Mar 20 08:45:32 crc kubenswrapper[4903]: I0320 08:45:32.771336 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e64f54bd-2813-41dd-86e6-9836da200d1c-config\") pod \"e64f54bd-2813-41dd-86e6-9836da200d1c\" (UID: \"e64f54bd-2813-41dd-86e6-9836da200d1c\") " Mar 20 08:45:32 crc kubenswrapper[4903]: I0320 08:45:32.771399 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lrvk2\" (UniqueName: \"kubernetes.io/projected/e64f54bd-2813-41dd-86e6-9836da200d1c-kube-api-access-lrvk2\") pod \"e64f54bd-2813-41dd-86e6-9836da200d1c\" (UID: \"e64f54bd-2813-41dd-86e6-9836da200d1c\") " Mar 20 08:45:32 crc kubenswrapper[4903]: I0320 08:45:32.784851 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e64f54bd-2813-41dd-86e6-9836da200d1c-kube-api-access-lrvk2" (OuterVolumeSpecName: "kube-api-access-lrvk2") pod "e64f54bd-2813-41dd-86e6-9836da200d1c" (UID: "e64f54bd-2813-41dd-86e6-9836da200d1c"). InnerVolumeSpecName "kube-api-access-lrvk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:45:32 crc kubenswrapper[4903]: I0320 08:45:32.811579 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e64f54bd-2813-41dd-86e6-9836da200d1c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e64f54bd-2813-41dd-86e6-9836da200d1c" (UID: "e64f54bd-2813-41dd-86e6-9836da200d1c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:45:32 crc kubenswrapper[4903]: I0320 08:45:32.811601 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e64f54bd-2813-41dd-86e6-9836da200d1c-config" (OuterVolumeSpecName: "config") pod "e64f54bd-2813-41dd-86e6-9836da200d1c" (UID: "e64f54bd-2813-41dd-86e6-9836da200d1c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:45:32 crc kubenswrapper[4903]: I0320 08:45:32.873874 4903 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e64f54bd-2813-41dd-86e6-9836da200d1c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 08:45:32 crc kubenswrapper[4903]: I0320 08:45:32.873913 4903 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/e64f54bd-2813-41dd-86e6-9836da200d1c-config\") on node \"crc\" DevicePath \"\"" Mar 20 08:45:32 crc kubenswrapper[4903]: I0320 08:45:32.873923 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lrvk2\" (UniqueName: \"kubernetes.io/projected/e64f54bd-2813-41dd-86e6-9836da200d1c-kube-api-access-lrvk2\") on node \"crc\" DevicePath \"\"" Mar 20 08:45:32 crc kubenswrapper[4903]: I0320 08:45:32.981884 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-n6bt8"] Mar 20 08:45:33 crc kubenswrapper[4903]: I0320 08:45:33.075152 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 08:45:33 crc kubenswrapper[4903]: I0320 08:45:33.136558 4903 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-74f6bcbc87-mk5xs" podUID="e1a5fcc4-d8ea-4cb8-9169-3e428787ab1d" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.131:5353: i/o timeout" Mar 20 08:45:33 crc kubenswrapper[4903]: I0320 08:45:33.199785 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 08:45:33 crc kubenswrapper[4903]: W0320 08:45:33.213012 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod032ee1ce_8cf1_4acd_a741_dc32f104065f.slice/crio-293ecc876fecda6beec6d7096fda21147233b6bbcbd2feba8749b8e4f2501e14 WatchSource:0}: Error finding container 293ecc876fecda6beec6d7096fda21147233b6bbcbd2feba8749b8e4f2501e14: Status 404 returned error can't find the container with id 293ecc876fecda6beec6d7096fda21147233b6bbcbd2feba8749b8e4f2501e14 Mar 20 08:45:33 crc kubenswrapper[4903]: W0320 08:45:33.214539 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod83dc9a0f_80b3_4df8_9b1b_f233484cb285.slice/crio-a06414ce03a090d2aa987403746bd9ea9df4b4e399d708dd419cf86b6257a50e WatchSource:0}: Error finding container a06414ce03a090d2aa987403746bd9ea9df4b4e399d708dd419cf86b6257a50e: Status 404 returned error can't find the container with id a06414ce03a090d2aa987403746bd9ea9df4b4e399d708dd419cf86b6257a50e Mar 20 08:45:33 crc kubenswrapper[4903]: W0320 08:45:33.216298 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode761f2dc_041e_4811_802b_2a8e8c376381.slice/crio-a533a7a79244057873761921a865688d4492459313f2c7f8338ba32d85542bff WatchSource:0}: Error finding container a533a7a79244057873761921a865688d4492459313f2c7f8338ba32d85542bff: Status 404 returned error can't find the container with id a533a7a79244057873761921a865688d4492459313f2c7f8338ba32d85542bff Mar 20 08:45:33 crc kubenswrapper[4903]: I0320 08:45:33.326098 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"e761f2dc-041e-4811-802b-2a8e8c376381","Type":"ContainerStarted","Data":"a533a7a79244057873761921a865688d4492459313f2c7f8338ba32d85542bff"} Mar 20 08:45:33 crc kubenswrapper[4903]: I0320 08:45:33.329015 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-d8wd9" event={"ID":"e64f54bd-2813-41dd-86e6-9836da200d1c","Type":"ContainerDied","Data":"76a4774f95f98ed62297c376206d5783ba9e0c4149b8181875ce05fe96c28d2e"} Mar 20 08:45:33 crc kubenswrapper[4903]: I0320 08:45:33.329061 4903 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="76a4774f95f98ed62297c376206d5783ba9e0c4149b8181875ce05fe96c28d2e" Mar 20 08:45:33 crc kubenswrapper[4903]: I0320 08:45:33.329126 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-d8wd9" Mar 20 08:45:33 crc kubenswrapper[4903]: I0320 08:45:33.347400 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"83dc9a0f-80b3-4df8-9b1b-f233484cb285","Type":"ContainerStarted","Data":"a06414ce03a090d2aa987403746bd9ea9df4b4e399d708dd419cf86b6257a50e"} Mar 20 08:45:33 crc kubenswrapper[4903]: I0320 08:45:33.351761 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-xws82" event={"ID":"be07fcc9-d6b5-4551-8846-94aa14b6af5d","Type":"ContainerStarted","Data":"0269026cf7982aeeca2edca95914706d78713f2ca9351fcff90c8fced514cb93"} Mar 20 08:45:33 crc kubenswrapper[4903]: I0320 08:45:33.357879 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-ntkfl" event={"ID":"cddb5fee-92f5-463f-a746-8a58e0a05e4b","Type":"ContainerStarted","Data":"4436cc84a4eb5d16d50594541e6aa5ecc7f4b291c369bb3dffbc3aee6052c237"} Mar 20 08:45:33 crc kubenswrapper[4903]: I0320 08:45:33.364591 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-n6bt8" event={"ID":"032ee1ce-8cf1-4acd-a741-dc32f104065f","Type":"ContainerStarted","Data":"293ecc876fecda6beec6d7096fda21147233b6bbcbd2feba8749b8e4f2501e14"} Mar 20 08:45:33 crc kubenswrapper[4903]: E0320 08:45:33.381602 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-f4rg2" podUID="6e210f8c-e29d-442c-a5eb-ec6b639b0275" Mar 20 08:45:33 crc kubenswrapper[4903]: I0320 08:45:33.389423 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-xws82" podStartSLOduration=2.985435277 podStartE2EDuration="24.389400365s" podCreationTimestamp="2026-03-20 08:45:09 +0000 UTC" firstStartedPulling="2026-03-20 08:45:10.956558131 +0000 UTC m=+1336.173458446" lastFinishedPulling="2026-03-20 08:45:32.360523219 +0000 UTC m=+1357.577423534" observedRunningTime="2026-03-20 08:45:33.375628961 +0000 UTC m=+1358.592529296" watchObservedRunningTime="2026-03-20 08:45:33.389400365 +0000 UTC m=+1358.606300700" Mar 20 08:45:33 crc kubenswrapper[4903]: I0320 08:45:33.427606 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-ntkfl" podStartSLOduration=4.106624812 podStartE2EDuration="24.427583871s" podCreationTimestamp="2026-03-20 08:45:09 +0000 UTC" firstStartedPulling="2026-03-20 08:45:10.771246488 +0000 UTC m=+1335.988146803" lastFinishedPulling="2026-03-20 08:45:31.092205547 +0000 UTC 
m=+1356.309105862" observedRunningTime="2026-03-20 08:45:33.424735972 +0000 UTC m=+1358.641636287" watchObservedRunningTime="2026-03-20 08:45:33.427583871 +0000 UTC m=+1358.644484186" Mar 20 08:45:33 crc kubenswrapper[4903]: I0320 08:45:33.505001 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e1a5fcc4-d8ea-4cb8-9169-3e428787ab1d" path="/var/lib/kubelet/pods/e1a5fcc4-d8ea-4cb8-9169-3e428787ab1d/volumes" Mar 20 08:45:34 crc kubenswrapper[4903]: I0320 08:45:34.097233 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-nx66r"] Mar 20 08:45:34 crc kubenswrapper[4903]: E0320 08:45:34.097798 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1a5fcc4-d8ea-4cb8-9169-3e428787ab1d" containerName="init" Mar 20 08:45:34 crc kubenswrapper[4903]: I0320 08:45:34.097815 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1a5fcc4-d8ea-4cb8-9169-3e428787ab1d" containerName="init" Mar 20 08:45:34 crc kubenswrapper[4903]: E0320 08:45:34.097831 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1a5fcc4-d8ea-4cb8-9169-3e428787ab1d" containerName="dnsmasq-dns" Mar 20 08:45:34 crc kubenswrapper[4903]: I0320 08:45:34.097838 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1a5fcc4-d8ea-4cb8-9169-3e428787ab1d" containerName="dnsmasq-dns" Mar 20 08:45:34 crc kubenswrapper[4903]: E0320 08:45:34.097861 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e64f54bd-2813-41dd-86e6-9836da200d1c" containerName="neutron-db-sync" Mar 20 08:45:34 crc kubenswrapper[4903]: I0320 08:45:34.097867 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="e64f54bd-2813-41dd-86e6-9836da200d1c" containerName="neutron-db-sync" Mar 20 08:45:34 crc kubenswrapper[4903]: I0320 08:45:34.098086 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="e64f54bd-2813-41dd-86e6-9836da200d1c" containerName="neutron-db-sync" Mar 20 08:45:34 crc kubenswrapper[4903]: I0320 08:45:34.098115 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1a5fcc4-d8ea-4cb8-9169-3e428787ab1d" containerName="dnsmasq-dns" Mar 20 08:45:34 crc kubenswrapper[4903]: I0320 08:45:34.098929 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-nx66r" Mar 20 08:45:34 crc kubenswrapper[4903]: I0320 08:45:34.117519 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-nx66r"] Mar 20 08:45:34 crc kubenswrapper[4903]: I0320 08:45:34.207970 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-7bdb6dfbd4-xpx45"] Mar 20 08:45:34 crc kubenswrapper[4903]: I0320 08:45:34.214997 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/38b394a1-0e10-4f80-a51b-dcba38ab7cdb-ovsdbserver-nb\") pod \"dnsmasq-dns-55f844cf75-nx66r\" (UID: \"38b394a1-0e10-4f80-a51b-dcba38ab7cdb\") " pod="openstack/dnsmasq-dns-55f844cf75-nx66r" Mar 20 08:45:34 crc kubenswrapper[4903]: I0320 08:45:34.215065 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/38b394a1-0e10-4f80-a51b-dcba38ab7cdb-ovsdbserver-sb\") pod \"dnsmasq-dns-55f844cf75-nx66r\" (UID: \"38b394a1-0e10-4f80-a51b-dcba38ab7cdb\") " pod="openstack/dnsmasq-dns-55f844cf75-nx66r" Mar 20 08:45:34 crc kubenswrapper[4903]: I0320 08:45:34.215107 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/38b394a1-0e10-4f80-a51b-dcba38ab7cdb-dns-svc\") pod \"dnsmasq-dns-55f844cf75-nx66r\" (UID: \"38b394a1-0e10-4f80-a51b-dcba38ab7cdb\") " pod="openstack/dnsmasq-dns-55f844cf75-nx66r" Mar 20 08:45:34 crc kubenswrapper[4903]: I0320 08:45:34.215133 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/38b394a1-0e10-4f80-a51b-dcba38ab7cdb-config\") pod \"dnsmasq-dns-55f844cf75-nx66r\" (UID: \"38b394a1-0e10-4f80-a51b-dcba38ab7cdb\") " pod="openstack/dnsmasq-dns-55f844cf75-nx66r" Mar 20 08:45:34 crc kubenswrapper[4903]: I0320 08:45:34.215178 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jzpcp\" (UniqueName: \"kubernetes.io/projected/38b394a1-0e10-4f80-a51b-dcba38ab7cdb-kube-api-access-jzpcp\") pod \"dnsmasq-dns-55f844cf75-nx66r\" (UID: \"38b394a1-0e10-4f80-a51b-dcba38ab7cdb\") " pod="openstack/dnsmasq-dns-55f844cf75-nx66r" Mar 20 08:45:34 crc kubenswrapper[4903]: I0320 08:45:34.215210 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/38b394a1-0e10-4f80-a51b-dcba38ab7cdb-dns-swift-storage-0\") pod \"dnsmasq-dns-55f844cf75-nx66r\" (UID: \"38b394a1-0e10-4f80-a51b-dcba38ab7cdb\") " pod="openstack/dnsmasq-dns-55f844cf75-nx66r" Mar 20 08:45:34 crc kubenswrapper[4903]: I0320 08:45:34.215468 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-7bdb6dfbd4-xpx45" Mar 20 08:45:34 crc kubenswrapper[4903]: I0320 08:45:34.222761 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Mar 20 08:45:34 crc kubenswrapper[4903]: I0320 08:45:34.222936 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Mar 20 08:45:34 crc kubenswrapper[4903]: I0320 08:45:34.223066 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Mar 20 08:45:34 crc kubenswrapper[4903]: I0320 08:45:34.223590 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-qpklm" Mar 20 08:45:34 crc kubenswrapper[4903]: I0320 08:45:34.248287 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7bdb6dfbd4-xpx45"] Mar 20 08:45:34 crc kubenswrapper[4903]: I0320 08:45:34.316358 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/38b394a1-0e10-4f80-a51b-dcba38ab7cdb-ovsdbserver-nb\") pod \"dnsmasq-dns-55f844cf75-nx66r\" (UID: \"38b394a1-0e10-4f80-a51b-dcba38ab7cdb\") " pod="openstack/dnsmasq-dns-55f844cf75-nx66r" Mar 20 08:45:34 crc kubenswrapper[4903]: I0320 08:45:34.316406 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/38b394a1-0e10-4f80-a51b-dcba38ab7cdb-ovsdbserver-sb\") pod \"dnsmasq-dns-55f844cf75-nx66r\" (UID: \"38b394a1-0e10-4f80-a51b-dcba38ab7cdb\") " pod="openstack/dnsmasq-dns-55f844cf75-nx66r" Mar 20 08:45:34 crc kubenswrapper[4903]: I0320 08:45:34.316443 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6453cb9a-76a4-412f-9cb8-964c20a217ca-combined-ca-bundle\") pod \"neutron-7bdb6dfbd4-xpx45\" (UID: \"6453cb9a-76a4-412f-9cb8-964c20a217ca\") " pod="openstack/neutron-7bdb6dfbd4-xpx45" Mar 20 08:45:34 crc kubenswrapper[4903]: I0320 08:45:34.316473 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/38b394a1-0e10-4f80-a51b-dcba38ab7cdb-dns-svc\") pod \"dnsmasq-dns-55f844cf75-nx66r\" (UID: \"38b394a1-0e10-4f80-a51b-dcba38ab7cdb\") " pod="openstack/dnsmasq-dns-55f844cf75-nx66r" Mar 20 08:45:34 crc kubenswrapper[4903]: I0320 08:45:34.316491 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vqlb6\" (UniqueName: \"kubernetes.io/projected/6453cb9a-76a4-412f-9cb8-964c20a217ca-kube-api-access-vqlb6\") pod \"neutron-7bdb6dfbd4-xpx45\" (UID: \"6453cb9a-76a4-412f-9cb8-964c20a217ca\") " pod="openstack/neutron-7bdb6dfbd4-xpx45" Mar 20 08:45:34 crc kubenswrapper[4903]: I0320 08:45:34.316511 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/38b394a1-0e10-4f80-a51b-dcba38ab7cdb-config\") pod \"dnsmasq-dns-55f844cf75-nx66r\" (UID: \"38b394a1-0e10-4f80-a51b-dcba38ab7cdb\") " pod="openstack/dnsmasq-dns-55f844cf75-nx66r" Mar 20 08:45:34 crc kubenswrapper[4903]: I0320 08:45:34.316530 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6453cb9a-76a4-412f-9cb8-964c20a217ca-ovndb-tls-certs\") pod \"neutron-7bdb6dfbd4-xpx45\" (UID: 
\"6453cb9a-76a4-412f-9cb8-964c20a217ca\") " pod="openstack/neutron-7bdb6dfbd4-xpx45" Mar 20 08:45:34 crc kubenswrapper[4903]: I0320 08:45:34.316553 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/6453cb9a-76a4-412f-9cb8-964c20a217ca-httpd-config\") pod \"neutron-7bdb6dfbd4-xpx45\" (UID: \"6453cb9a-76a4-412f-9cb8-964c20a217ca\") " pod="openstack/neutron-7bdb6dfbd4-xpx45" Mar 20 08:45:34 crc kubenswrapper[4903]: I0320 08:45:34.316577 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jzpcp\" (UniqueName: \"kubernetes.io/projected/38b394a1-0e10-4f80-a51b-dcba38ab7cdb-kube-api-access-jzpcp\") pod \"dnsmasq-dns-55f844cf75-nx66r\" (UID: \"38b394a1-0e10-4f80-a51b-dcba38ab7cdb\") " pod="openstack/dnsmasq-dns-55f844cf75-nx66r" Mar 20 08:45:34 crc kubenswrapper[4903]: I0320 08:45:34.316608 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/6453cb9a-76a4-412f-9cb8-964c20a217ca-config\") pod \"neutron-7bdb6dfbd4-xpx45\" (UID: \"6453cb9a-76a4-412f-9cb8-964c20a217ca\") " pod="openstack/neutron-7bdb6dfbd4-xpx45" Mar 20 08:45:34 crc kubenswrapper[4903]: I0320 08:45:34.316626 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/38b394a1-0e10-4f80-a51b-dcba38ab7cdb-dns-swift-storage-0\") pod \"dnsmasq-dns-55f844cf75-nx66r\" (UID: \"38b394a1-0e10-4f80-a51b-dcba38ab7cdb\") " pod="openstack/dnsmasq-dns-55f844cf75-nx66r" Mar 20 08:45:34 crc kubenswrapper[4903]: I0320 08:45:34.317575 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/38b394a1-0e10-4f80-a51b-dcba38ab7cdb-dns-swift-storage-0\") pod \"dnsmasq-dns-55f844cf75-nx66r\" (UID: \"38b394a1-0e10-4f80-a51b-dcba38ab7cdb\") " pod="openstack/dnsmasq-dns-55f844cf75-nx66r" Mar 20 08:45:34 crc kubenswrapper[4903]: I0320 08:45:34.318130 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/38b394a1-0e10-4f80-a51b-dcba38ab7cdb-ovsdbserver-nb\") pod \"dnsmasq-dns-55f844cf75-nx66r\" (UID: \"38b394a1-0e10-4f80-a51b-dcba38ab7cdb\") " pod="openstack/dnsmasq-dns-55f844cf75-nx66r" Mar 20 08:45:34 crc kubenswrapper[4903]: I0320 08:45:34.318610 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/38b394a1-0e10-4f80-a51b-dcba38ab7cdb-ovsdbserver-sb\") pod \"dnsmasq-dns-55f844cf75-nx66r\" (UID: \"38b394a1-0e10-4f80-a51b-dcba38ab7cdb\") " pod="openstack/dnsmasq-dns-55f844cf75-nx66r" Mar 20 08:45:34 crc kubenswrapper[4903]: I0320 08:45:34.319121 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/38b394a1-0e10-4f80-a51b-dcba38ab7cdb-dns-svc\") pod \"dnsmasq-dns-55f844cf75-nx66r\" (UID: \"38b394a1-0e10-4f80-a51b-dcba38ab7cdb\") " pod="openstack/dnsmasq-dns-55f844cf75-nx66r" Mar 20 08:45:34 crc kubenswrapper[4903]: I0320 08:45:34.319607 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/38b394a1-0e10-4f80-a51b-dcba38ab7cdb-config\") pod \"dnsmasq-dns-55f844cf75-nx66r\" (UID: \"38b394a1-0e10-4f80-a51b-dcba38ab7cdb\") " pod="openstack/dnsmasq-dns-55f844cf75-nx66r" Mar 20 
08:45:34 crc kubenswrapper[4903]: I0320 08:45:34.340740 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jzpcp\" (UniqueName: \"kubernetes.io/projected/38b394a1-0e10-4f80-a51b-dcba38ab7cdb-kube-api-access-jzpcp\") pod \"dnsmasq-dns-55f844cf75-nx66r\" (UID: \"38b394a1-0e10-4f80-a51b-dcba38ab7cdb\") " pod="openstack/dnsmasq-dns-55f844cf75-nx66r" Mar 20 08:45:34 crc kubenswrapper[4903]: I0320 08:45:34.400822 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"003c0ace-6aef-4bc2-bc02-358cf140d4ce","Type":"ContainerStarted","Data":"5f6acda290853c0679341a80d768ac082e84e70b79924dbd01bab6e923434f1e"} Mar 20 08:45:34 crc kubenswrapper[4903]: I0320 08:45:34.404079 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"83dc9a0f-80b3-4df8-9b1b-f233484cb285","Type":"ContainerStarted","Data":"3b5d10559c5d44263b5a6e122434a5c7503ceaee8c0fcc56d0ab6b2eb14c36dd"} Mar 20 08:45:34 crc kubenswrapper[4903]: I0320 08:45:34.406771 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-n6bt8" event={"ID":"032ee1ce-8cf1-4acd-a741-dc32f104065f","Type":"ContainerStarted","Data":"9c75fa540c0ceff28035719c4c9810d7dcb95337b1bd588ad42d6c195849ec20"} Mar 20 08:45:34 crc kubenswrapper[4903]: I0320 08:45:34.418260 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6453cb9a-76a4-412f-9cb8-964c20a217ca-combined-ca-bundle\") pod \"neutron-7bdb6dfbd4-xpx45\" (UID: \"6453cb9a-76a4-412f-9cb8-964c20a217ca\") " pod="openstack/neutron-7bdb6dfbd4-xpx45" Mar 20 08:45:34 crc kubenswrapper[4903]: I0320 08:45:34.418328 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vqlb6\" (UniqueName: \"kubernetes.io/projected/6453cb9a-76a4-412f-9cb8-964c20a217ca-kube-api-access-vqlb6\") pod \"neutron-7bdb6dfbd4-xpx45\" (UID: \"6453cb9a-76a4-412f-9cb8-964c20a217ca\") " pod="openstack/neutron-7bdb6dfbd4-xpx45" Mar 20 08:45:34 crc kubenswrapper[4903]: I0320 08:45:34.418363 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6453cb9a-76a4-412f-9cb8-964c20a217ca-ovndb-tls-certs\") pod \"neutron-7bdb6dfbd4-xpx45\" (UID: \"6453cb9a-76a4-412f-9cb8-964c20a217ca\") " pod="openstack/neutron-7bdb6dfbd4-xpx45" Mar 20 08:45:34 crc kubenswrapper[4903]: I0320 08:45:34.418393 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/6453cb9a-76a4-412f-9cb8-964c20a217ca-httpd-config\") pod \"neutron-7bdb6dfbd4-xpx45\" (UID: \"6453cb9a-76a4-412f-9cb8-964c20a217ca\") " pod="openstack/neutron-7bdb6dfbd4-xpx45" Mar 20 08:45:34 crc kubenswrapper[4903]: I0320 08:45:34.418442 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/6453cb9a-76a4-412f-9cb8-964c20a217ca-config\") pod \"neutron-7bdb6dfbd4-xpx45\" (UID: \"6453cb9a-76a4-412f-9cb8-964c20a217ca\") " pod="openstack/neutron-7bdb6dfbd4-xpx45" Mar 20 08:45:34 crc kubenswrapper[4903]: I0320 08:45:34.433185 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6453cb9a-76a4-412f-9cb8-964c20a217ca-combined-ca-bundle\") pod \"neutron-7bdb6dfbd4-xpx45\" (UID: \"6453cb9a-76a4-412f-9cb8-964c20a217ca\") " 
pod="openstack/neutron-7bdb6dfbd4-xpx45" Mar 20 08:45:34 crc kubenswrapper[4903]: I0320 08:45:34.433802 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6453cb9a-76a4-412f-9cb8-964c20a217ca-ovndb-tls-certs\") pod \"neutron-7bdb6dfbd4-xpx45\" (UID: \"6453cb9a-76a4-412f-9cb8-964c20a217ca\") " pod="openstack/neutron-7bdb6dfbd4-xpx45" Mar 20 08:45:34 crc kubenswrapper[4903]: I0320 08:45:34.440261 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/6453cb9a-76a4-412f-9cb8-964c20a217ca-httpd-config\") pod \"neutron-7bdb6dfbd4-xpx45\" (UID: \"6453cb9a-76a4-412f-9cb8-964c20a217ca\") " pod="openstack/neutron-7bdb6dfbd4-xpx45" Mar 20 08:45:34 crc kubenswrapper[4903]: I0320 08:45:34.441382 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-nx66r" Mar 20 08:45:34 crc kubenswrapper[4903]: I0320 08:45:34.460240 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vqlb6\" (UniqueName: \"kubernetes.io/projected/6453cb9a-76a4-412f-9cb8-964c20a217ca-kube-api-access-vqlb6\") pod \"neutron-7bdb6dfbd4-xpx45\" (UID: \"6453cb9a-76a4-412f-9cb8-964c20a217ca\") " pod="openstack/neutron-7bdb6dfbd4-xpx45" Mar 20 08:45:34 crc kubenswrapper[4903]: I0320 08:45:34.461120 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/6453cb9a-76a4-412f-9cb8-964c20a217ca-config\") pod \"neutron-7bdb6dfbd4-xpx45\" (UID: \"6453cb9a-76a4-412f-9cb8-964c20a217ca\") " pod="openstack/neutron-7bdb6dfbd4-xpx45" Mar 20 08:45:34 crc kubenswrapper[4903]: I0320 08:45:34.484048 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e761f2dc-041e-4811-802b-2a8e8c376381","Type":"ContainerStarted","Data":"952661f87146616b7a9a4cefe62281de6533fdfc897d5538236d336178c5722c"} Mar 20 08:45:34 crc kubenswrapper[4903]: I0320 08:45:34.505205 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-n6bt8" podStartSLOduration=12.505148958 podStartE2EDuration="12.505148958s" podCreationTimestamp="2026-03-20 08:45:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:45:34.432447105 +0000 UTC m=+1359.649347420" watchObservedRunningTime="2026-03-20 08:45:34.505148958 +0000 UTC m=+1359.722049283" Mar 20 08:45:34 crc kubenswrapper[4903]: I0320 08:45:34.549647 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-7bdb6dfbd4-xpx45" Mar 20 08:45:35 crc kubenswrapper[4903]: I0320 08:45:35.022462 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-nx66r"] Mar 20 08:45:35 crc kubenswrapper[4903]: W0320 08:45:35.028824 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod38b394a1_0e10_4f80_a51b_dcba38ab7cdb.slice/crio-d82aacd5bcece2a02e21d5cfcd0c1519d6019a244bbe5ab61474031050c6d5af WatchSource:0}: Error finding container d82aacd5bcece2a02e21d5cfcd0c1519d6019a244bbe5ab61474031050c6d5af: Status 404 returned error can't find the container with id d82aacd5bcece2a02e21d5cfcd0c1519d6019a244bbe5ab61474031050c6d5af Mar 20 08:45:35 crc kubenswrapper[4903]: I0320 08:45:35.230063 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7bdb6dfbd4-xpx45"] Mar 20 08:45:35 crc kubenswrapper[4903]: W0320 08:45:35.246435 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6453cb9a_76a4_412f_9cb8_964c20a217ca.slice/crio-f2fb1e2ccded77e9a488b0947185a52d6b9748c3171c7dff351f3f511ca16cf7 WatchSource:0}: Error finding container f2fb1e2ccded77e9a488b0947185a52d6b9748c3171c7dff351f3f511ca16cf7: Status 404 returned error can't find the container with id f2fb1e2ccded77e9a488b0947185a52d6b9748c3171c7dff351f3f511ca16cf7 Mar 20 08:45:35 crc kubenswrapper[4903]: I0320 08:45:35.508453 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"83dc9a0f-80b3-4df8-9b1b-f233484cb285","Type":"ContainerStarted","Data":"40b4b5aa3e9d277b5d7629a27ef27b096fa2b070096c494efac9d95b45d321d0"} Mar 20 08:45:35 crc kubenswrapper[4903]: I0320 08:45:35.508698 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7bdb6dfbd4-xpx45" event={"ID":"6453cb9a-76a4-412f-9cb8-964c20a217ca","Type":"ContainerStarted","Data":"f2fb1e2ccded77e9a488b0947185a52d6b9748c3171c7dff351f3f511ca16cf7"} Mar 20 08:45:35 crc kubenswrapper[4903]: I0320 08:45:35.508711 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e761f2dc-041e-4811-802b-2a8e8c376381","Type":"ContainerStarted","Data":"6ca326a90fd9f5fc37a08f1a4417347cc25993cffc8749772e41d2e19d02fca5"} Mar 20 08:45:35 crc kubenswrapper[4903]: I0320 08:45:35.519816 4903 generic.go:334] "Generic (PLEG): container finished" podID="38b394a1-0e10-4f80-a51b-dcba38ab7cdb" containerID="d6185e72ddaacddd9e4bdc3bc95448c9abb3f17971001b90c2dcd197ffc3c768" exitCode=0 Mar 20 08:45:35 crc kubenswrapper[4903]: I0320 08:45:35.523311 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-nx66r" event={"ID":"38b394a1-0e10-4f80-a51b-dcba38ab7cdb","Type":"ContainerDied","Data":"d6185e72ddaacddd9e4bdc3bc95448c9abb3f17971001b90c2dcd197ffc3c768"} Mar 20 08:45:35 crc kubenswrapper[4903]: I0320 08:45:35.523476 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-nx66r" event={"ID":"38b394a1-0e10-4f80-a51b-dcba38ab7cdb","Type":"ContainerStarted","Data":"d82aacd5bcece2a02e21d5cfcd0c1519d6019a244bbe5ab61474031050c6d5af"} Mar 20 08:45:35 crc kubenswrapper[4903]: I0320 08:45:35.662502 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=13.662481138 podStartE2EDuration="13.662481138s" 
podCreationTimestamp="2026-03-20 08:45:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:45:35.652138228 +0000 UTC m=+1360.869038543" watchObservedRunningTime="2026-03-20 08:45:35.662481138 +0000 UTC m=+1360.879381453" Mar 20 08:45:35 crc kubenswrapper[4903]: I0320 08:45:35.708805 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=16.708784361 podStartE2EDuration="16.708784361s" podCreationTimestamp="2026-03-20 08:45:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:45:35.707857859 +0000 UTC m=+1360.924758174" watchObservedRunningTime="2026-03-20 08:45:35.708784361 +0000 UTC m=+1360.925684676" Mar 20 08:45:36 crc kubenswrapper[4903]: I0320 08:45:36.557226 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7bdb6dfbd4-xpx45" event={"ID":"6453cb9a-76a4-412f-9cb8-964c20a217ca","Type":"ContainerStarted","Data":"48910b95b55fa683cebc5b748d90b941ea8034a5f459c44d84750b8def8b501e"} Mar 20 08:45:36 crc kubenswrapper[4903]: I0320 08:45:36.558787 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7bdb6dfbd4-xpx45" event={"ID":"6453cb9a-76a4-412f-9cb8-964c20a217ca","Type":"ContainerStarted","Data":"c8a35f3396f6369e4c3eda1b193db1137374b7218f722935ca859014ff6167e1"} Mar 20 08:45:36 crc kubenswrapper[4903]: I0320 08:45:36.559013 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-7bdb6dfbd4-xpx45" Mar 20 08:45:36 crc kubenswrapper[4903]: I0320 08:45:36.561652 4903 generic.go:334] "Generic (PLEG): container finished" podID="cddb5fee-92f5-463f-a746-8a58e0a05e4b" containerID="4436cc84a4eb5d16d50594541e6aa5ecc7f4b291c369bb3dffbc3aee6052c237" exitCode=0 Mar 20 08:45:36 crc kubenswrapper[4903]: I0320 08:45:36.561696 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-ntkfl" event={"ID":"cddb5fee-92f5-463f-a746-8a58e0a05e4b","Type":"ContainerDied","Data":"4436cc84a4eb5d16d50594541e6aa5ecc7f4b291c369bb3dffbc3aee6052c237"} Mar 20 08:45:36 crc kubenswrapper[4903]: I0320 08:45:36.564845 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-nx66r" event={"ID":"38b394a1-0e10-4f80-a51b-dcba38ab7cdb","Type":"ContainerStarted","Data":"87812fff190c2b49f33c39720b0c3df9d23cac42e6cb81120b89310adfbe3696"} Mar 20 08:45:36 crc kubenswrapper[4903]: I0320 08:45:36.565662 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-55f844cf75-nx66r" Mar 20 08:45:36 crc kubenswrapper[4903]: I0320 08:45:36.587510 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-7bdb6dfbd4-xpx45" podStartSLOduration=2.587488287 podStartE2EDuration="2.587488287s" podCreationTimestamp="2026-03-20 08:45:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:45:36.581447131 +0000 UTC m=+1361.798347446" watchObservedRunningTime="2026-03-20 08:45:36.587488287 +0000 UTC m=+1361.804388602" Mar 20 08:45:36 crc kubenswrapper[4903]: I0320 08:45:36.671885 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-55f844cf75-nx66r" podStartSLOduration=2.671859852 
podStartE2EDuration="2.671859852s" podCreationTimestamp="2026-03-20 08:45:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:45:36.650505635 +0000 UTC m=+1361.867405950" watchObservedRunningTime="2026-03-20 08:45:36.671859852 +0000 UTC m=+1361.888760167" Mar 20 08:45:37 crc kubenswrapper[4903]: I0320 08:45:37.161970 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-7b9df974b5-rb8w6"] Mar 20 08:45:37 crc kubenswrapper[4903]: I0320 08:45:37.167343 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7b9df974b5-rb8w6" Mar 20 08:45:37 crc kubenswrapper[4903]: I0320 08:45:37.173552 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Mar 20 08:45:37 crc kubenswrapper[4903]: I0320 08:45:37.173657 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Mar 20 08:45:37 crc kubenswrapper[4903]: I0320 08:45:37.192878 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7b9df974b5-rb8w6"] Mar 20 08:45:37 crc kubenswrapper[4903]: I0320 08:45:37.202466 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e64d519-16d2-48d3-8683-9da61bd19e2d-internal-tls-certs\") pod \"neutron-7b9df974b5-rb8w6\" (UID: \"3e64d519-16d2-48d3-8683-9da61bd19e2d\") " pod="openstack/neutron-7b9df974b5-rb8w6" Mar 20 08:45:37 crc kubenswrapper[4903]: I0320 08:45:37.202611 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xgvxj\" (UniqueName: \"kubernetes.io/projected/3e64d519-16d2-48d3-8683-9da61bd19e2d-kube-api-access-xgvxj\") pod \"neutron-7b9df974b5-rb8w6\" (UID: \"3e64d519-16d2-48d3-8683-9da61bd19e2d\") " pod="openstack/neutron-7b9df974b5-rb8w6" Mar 20 08:45:37 crc kubenswrapper[4903]: I0320 08:45:37.202739 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e64d519-16d2-48d3-8683-9da61bd19e2d-ovndb-tls-certs\") pod \"neutron-7b9df974b5-rb8w6\" (UID: \"3e64d519-16d2-48d3-8683-9da61bd19e2d\") " pod="openstack/neutron-7b9df974b5-rb8w6" Mar 20 08:45:37 crc kubenswrapper[4903]: I0320 08:45:37.202834 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e64d519-16d2-48d3-8683-9da61bd19e2d-combined-ca-bundle\") pod \"neutron-7b9df974b5-rb8w6\" (UID: \"3e64d519-16d2-48d3-8683-9da61bd19e2d\") " pod="openstack/neutron-7b9df974b5-rb8w6" Mar 20 08:45:37 crc kubenswrapper[4903]: I0320 08:45:37.202923 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/3e64d519-16d2-48d3-8683-9da61bd19e2d-config\") pod \"neutron-7b9df974b5-rb8w6\" (UID: \"3e64d519-16d2-48d3-8683-9da61bd19e2d\") " pod="openstack/neutron-7b9df974b5-rb8w6" Mar 20 08:45:37 crc kubenswrapper[4903]: I0320 08:45:37.203074 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/3e64d519-16d2-48d3-8683-9da61bd19e2d-httpd-config\") pod \"neutron-7b9df974b5-rb8w6\" (UID: \"3e64d519-16d2-48d3-8683-9da61bd19e2d\") " 
pod="openstack/neutron-7b9df974b5-rb8w6" Mar 20 08:45:37 crc kubenswrapper[4903]: I0320 08:45:37.203178 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e64d519-16d2-48d3-8683-9da61bd19e2d-public-tls-certs\") pod \"neutron-7b9df974b5-rb8w6\" (UID: \"3e64d519-16d2-48d3-8683-9da61bd19e2d\") " pod="openstack/neutron-7b9df974b5-rb8w6" Mar 20 08:45:37 crc kubenswrapper[4903]: I0320 08:45:37.304514 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/3e64d519-16d2-48d3-8683-9da61bd19e2d-httpd-config\") pod \"neutron-7b9df974b5-rb8w6\" (UID: \"3e64d519-16d2-48d3-8683-9da61bd19e2d\") " pod="openstack/neutron-7b9df974b5-rb8w6" Mar 20 08:45:37 crc kubenswrapper[4903]: I0320 08:45:37.304594 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e64d519-16d2-48d3-8683-9da61bd19e2d-public-tls-certs\") pod \"neutron-7b9df974b5-rb8w6\" (UID: \"3e64d519-16d2-48d3-8683-9da61bd19e2d\") " pod="openstack/neutron-7b9df974b5-rb8w6" Mar 20 08:45:37 crc kubenswrapper[4903]: I0320 08:45:37.304639 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e64d519-16d2-48d3-8683-9da61bd19e2d-internal-tls-certs\") pod \"neutron-7b9df974b5-rb8w6\" (UID: \"3e64d519-16d2-48d3-8683-9da61bd19e2d\") " pod="openstack/neutron-7b9df974b5-rb8w6" Mar 20 08:45:37 crc kubenswrapper[4903]: I0320 08:45:37.304662 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xgvxj\" (UniqueName: \"kubernetes.io/projected/3e64d519-16d2-48d3-8683-9da61bd19e2d-kube-api-access-xgvxj\") pod \"neutron-7b9df974b5-rb8w6\" (UID: \"3e64d519-16d2-48d3-8683-9da61bd19e2d\") " pod="openstack/neutron-7b9df974b5-rb8w6" Mar 20 08:45:37 crc kubenswrapper[4903]: I0320 08:45:37.304713 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e64d519-16d2-48d3-8683-9da61bd19e2d-ovndb-tls-certs\") pod \"neutron-7b9df974b5-rb8w6\" (UID: \"3e64d519-16d2-48d3-8683-9da61bd19e2d\") " pod="openstack/neutron-7b9df974b5-rb8w6" Mar 20 08:45:37 crc kubenswrapper[4903]: I0320 08:45:37.304738 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e64d519-16d2-48d3-8683-9da61bd19e2d-combined-ca-bundle\") pod \"neutron-7b9df974b5-rb8w6\" (UID: \"3e64d519-16d2-48d3-8683-9da61bd19e2d\") " pod="openstack/neutron-7b9df974b5-rb8w6" Mar 20 08:45:37 crc kubenswrapper[4903]: I0320 08:45:37.304765 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/3e64d519-16d2-48d3-8683-9da61bd19e2d-config\") pod \"neutron-7b9df974b5-rb8w6\" (UID: \"3e64d519-16d2-48d3-8683-9da61bd19e2d\") " pod="openstack/neutron-7b9df974b5-rb8w6" Mar 20 08:45:37 crc kubenswrapper[4903]: I0320 08:45:37.312266 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/3e64d519-16d2-48d3-8683-9da61bd19e2d-httpd-config\") pod \"neutron-7b9df974b5-rb8w6\" (UID: \"3e64d519-16d2-48d3-8683-9da61bd19e2d\") " pod="openstack/neutron-7b9df974b5-rb8w6" Mar 20 08:45:37 crc kubenswrapper[4903]: I0320 08:45:37.312544 4903 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e64d519-16d2-48d3-8683-9da61bd19e2d-public-tls-certs\") pod \"neutron-7b9df974b5-rb8w6\" (UID: \"3e64d519-16d2-48d3-8683-9da61bd19e2d\") " pod="openstack/neutron-7b9df974b5-rb8w6" Mar 20 08:45:37 crc kubenswrapper[4903]: I0320 08:45:37.320221 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e64d519-16d2-48d3-8683-9da61bd19e2d-ovndb-tls-certs\") pod \"neutron-7b9df974b5-rb8w6\" (UID: \"3e64d519-16d2-48d3-8683-9da61bd19e2d\") " pod="openstack/neutron-7b9df974b5-rb8w6" Mar 20 08:45:37 crc kubenswrapper[4903]: I0320 08:45:37.322940 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/3e64d519-16d2-48d3-8683-9da61bd19e2d-config\") pod \"neutron-7b9df974b5-rb8w6\" (UID: \"3e64d519-16d2-48d3-8683-9da61bd19e2d\") " pod="openstack/neutron-7b9df974b5-rb8w6" Mar 20 08:45:37 crc kubenswrapper[4903]: I0320 08:45:37.323581 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e64d519-16d2-48d3-8683-9da61bd19e2d-combined-ca-bundle\") pod \"neutron-7b9df974b5-rb8w6\" (UID: \"3e64d519-16d2-48d3-8683-9da61bd19e2d\") " pod="openstack/neutron-7b9df974b5-rb8w6" Mar 20 08:45:37 crc kubenswrapper[4903]: I0320 08:45:37.328902 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e64d519-16d2-48d3-8683-9da61bd19e2d-internal-tls-certs\") pod \"neutron-7b9df974b5-rb8w6\" (UID: \"3e64d519-16d2-48d3-8683-9da61bd19e2d\") " pod="openstack/neutron-7b9df974b5-rb8w6" Mar 20 08:45:37 crc kubenswrapper[4903]: I0320 08:45:37.333700 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xgvxj\" (UniqueName: \"kubernetes.io/projected/3e64d519-16d2-48d3-8683-9da61bd19e2d-kube-api-access-xgvxj\") pod \"neutron-7b9df974b5-rb8w6\" (UID: \"3e64d519-16d2-48d3-8683-9da61bd19e2d\") " pod="openstack/neutron-7b9df974b5-rb8w6" Mar 20 08:45:37 crc kubenswrapper[4903]: I0320 08:45:37.546001 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-7b9df974b5-rb8w6" Mar 20 08:45:37 crc kubenswrapper[4903]: I0320 08:45:37.577601 4903 generic.go:334] "Generic (PLEG): container finished" podID="be07fcc9-d6b5-4551-8846-94aa14b6af5d" containerID="0269026cf7982aeeca2edca95914706d78713f2ca9351fcff90c8fced514cb93" exitCode=0 Mar 20 08:45:37 crc kubenswrapper[4903]: I0320 08:45:37.577670 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-xws82" event={"ID":"be07fcc9-d6b5-4551-8846-94aa14b6af5d","Type":"ContainerDied","Data":"0269026cf7982aeeca2edca95914706d78713f2ca9351fcff90c8fced514cb93"} Mar 20 08:45:37 crc kubenswrapper[4903]: I0320 08:45:37.580779 4903 generic.go:334] "Generic (PLEG): container finished" podID="032ee1ce-8cf1-4acd-a741-dc32f104065f" containerID="9c75fa540c0ceff28035719c4c9810d7dcb95337b1bd588ad42d6c195849ec20" exitCode=0 Mar 20 08:45:37 crc kubenswrapper[4903]: I0320 08:45:37.583022 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-n6bt8" event={"ID":"032ee1ce-8cf1-4acd-a741-dc32f104065f","Type":"ContainerDied","Data":"9c75fa540c0ceff28035719c4c9810d7dcb95337b1bd588ad42d6c195849ec20"} Mar 20 08:45:38 crc kubenswrapper[4903]: I0320 08:45:38.020348 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-ntkfl" Mar 20 08:45:38 crc kubenswrapper[4903]: I0320 08:45:38.135876 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cddb5fee-92f5-463f-a746-8a58e0a05e4b-config-data\") pod \"cddb5fee-92f5-463f-a746-8a58e0a05e4b\" (UID: \"cddb5fee-92f5-463f-a746-8a58e0a05e4b\") " Mar 20 08:45:38 crc kubenswrapper[4903]: I0320 08:45:38.136581 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cddb5fee-92f5-463f-a746-8a58e0a05e4b-combined-ca-bundle\") pod \"cddb5fee-92f5-463f-a746-8a58e0a05e4b\" (UID: \"cddb5fee-92f5-463f-a746-8a58e0a05e4b\") " Mar 20 08:45:38 crc kubenswrapper[4903]: I0320 08:45:38.136661 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cddb5fee-92f5-463f-a746-8a58e0a05e4b-scripts\") pod \"cddb5fee-92f5-463f-a746-8a58e0a05e4b\" (UID: \"cddb5fee-92f5-463f-a746-8a58e0a05e4b\") " Mar 20 08:45:38 crc kubenswrapper[4903]: I0320 08:45:38.136799 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cddb5fee-92f5-463f-a746-8a58e0a05e4b-logs\") pod \"cddb5fee-92f5-463f-a746-8a58e0a05e4b\" (UID: \"cddb5fee-92f5-463f-a746-8a58e0a05e4b\") " Mar 20 08:45:38 crc kubenswrapper[4903]: I0320 08:45:38.137060 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7mxlf\" (UniqueName: \"kubernetes.io/projected/cddb5fee-92f5-463f-a746-8a58e0a05e4b-kube-api-access-7mxlf\") pod \"cddb5fee-92f5-463f-a746-8a58e0a05e4b\" (UID: \"cddb5fee-92f5-463f-a746-8a58e0a05e4b\") " Mar 20 08:45:38 crc kubenswrapper[4903]: I0320 08:45:38.137375 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cddb5fee-92f5-463f-a746-8a58e0a05e4b-logs" (OuterVolumeSpecName: "logs") pod "cddb5fee-92f5-463f-a746-8a58e0a05e4b" (UID: "cddb5fee-92f5-463f-a746-8a58e0a05e4b"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 08:45:38 crc kubenswrapper[4903]: I0320 08:45:38.137693 4903 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cddb5fee-92f5-463f-a746-8a58e0a05e4b-logs\") on node \"crc\" DevicePath \"\"" Mar 20 08:45:38 crc kubenswrapper[4903]: I0320 08:45:38.144229 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cddb5fee-92f5-463f-a746-8a58e0a05e4b-kube-api-access-7mxlf" (OuterVolumeSpecName: "kube-api-access-7mxlf") pod "cddb5fee-92f5-463f-a746-8a58e0a05e4b" (UID: "cddb5fee-92f5-463f-a746-8a58e0a05e4b"). InnerVolumeSpecName "kube-api-access-7mxlf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:45:38 crc kubenswrapper[4903]: I0320 08:45:38.150691 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cddb5fee-92f5-463f-a746-8a58e0a05e4b-scripts" (OuterVolumeSpecName: "scripts") pod "cddb5fee-92f5-463f-a746-8a58e0a05e4b" (UID: "cddb5fee-92f5-463f-a746-8a58e0a05e4b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:45:38 crc kubenswrapper[4903]: E0320 08:45:38.173972 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cddb5fee-92f5-463f-a746-8a58e0a05e4b-combined-ca-bundle podName:cddb5fee-92f5-463f-a746-8a58e0a05e4b nodeName:}" failed. No retries permitted until 2026-03-20 08:45:38.673921542 +0000 UTC m=+1363.890821857 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/cddb5fee-92f5-463f-a746-8a58e0a05e4b-combined-ca-bundle") pod "cddb5fee-92f5-463f-a746-8a58e0a05e4b" (UID: "cddb5fee-92f5-463f-a746-8a58e0a05e4b") : error deleting /var/lib/kubelet/pods/cddb5fee-92f5-463f-a746-8a58e0a05e4b/volume-subpaths: remove /var/lib/kubelet/pods/cddb5fee-92f5-463f-a746-8a58e0a05e4b/volume-subpaths: no such file or directory Mar 20 08:45:38 crc kubenswrapper[4903]: I0320 08:45:38.176986 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cddb5fee-92f5-463f-a746-8a58e0a05e4b-config-data" (OuterVolumeSpecName: "config-data") pod "cddb5fee-92f5-463f-a746-8a58e0a05e4b" (UID: "cddb5fee-92f5-463f-a746-8a58e0a05e4b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:45:38 crc kubenswrapper[4903]: I0320 08:45:38.177357 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7b9df974b5-rb8w6"] Mar 20 08:45:38 crc kubenswrapper[4903]: I0320 08:45:38.239310 4903 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cddb5fee-92f5-463f-a746-8a58e0a05e4b-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 08:45:38 crc kubenswrapper[4903]: I0320 08:45:38.239346 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7mxlf\" (UniqueName: \"kubernetes.io/projected/cddb5fee-92f5-463f-a746-8a58e0a05e4b-kube-api-access-7mxlf\") on node \"crc\" DevicePath \"\"" Mar 20 08:45:38 crc kubenswrapper[4903]: I0320 08:45:38.239357 4903 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cddb5fee-92f5-463f-a746-8a58e0a05e4b-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 08:45:38 crc kubenswrapper[4903]: I0320 08:45:38.599576 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7b9df974b5-rb8w6" event={"ID":"3e64d519-16d2-48d3-8683-9da61bd19e2d","Type":"ContainerStarted","Data":"34d8004fc6e7c7aa2bef610216ba05203ea0e25d6015c94bd65e30c74e0bc95e"} Mar 20 08:45:38 crc kubenswrapper[4903]: I0320 08:45:38.625272 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-ntkfl" event={"ID":"cddb5fee-92f5-463f-a746-8a58e0a05e4b","Type":"ContainerDied","Data":"093eca01661775c1cb2b425f19468c01508af52da1982590f7e538f20fb8093e"} Mar 20 08:45:38 crc kubenswrapper[4903]: I0320 08:45:38.625358 4903 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="093eca01661775c1cb2b425f19468c01508af52da1982590f7e538f20fb8093e" Mar 20 08:45:38 crc kubenswrapper[4903]: I0320 08:45:38.625486 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-ntkfl" Mar 20 08:45:38 crc kubenswrapper[4903]: I0320 08:45:38.751521 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cddb5fee-92f5-463f-a746-8a58e0a05e4b-combined-ca-bundle\") pod \"cddb5fee-92f5-463f-a746-8a58e0a05e4b\" (UID: \"cddb5fee-92f5-463f-a746-8a58e0a05e4b\") " Mar 20 08:45:38 crc kubenswrapper[4903]: I0320 08:45:38.781213 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cddb5fee-92f5-463f-a746-8a58e0a05e4b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cddb5fee-92f5-463f-a746-8a58e0a05e4b" (UID: "cddb5fee-92f5-463f-a746-8a58e0a05e4b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:45:38 crc kubenswrapper[4903]: I0320 08:45:38.846485 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-78d46df76d-rh79h"] Mar 20 08:45:38 crc kubenswrapper[4903]: E0320 08:45:38.847199 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cddb5fee-92f5-463f-a746-8a58e0a05e4b" containerName="placement-db-sync" Mar 20 08:45:38 crc kubenswrapper[4903]: I0320 08:45:38.847227 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="cddb5fee-92f5-463f-a746-8a58e0a05e4b" containerName="placement-db-sync" Mar 20 08:45:38 crc kubenswrapper[4903]: I0320 08:45:38.847487 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="cddb5fee-92f5-463f-a746-8a58e0a05e4b" containerName="placement-db-sync" Mar 20 08:45:38 crc kubenswrapper[4903]: I0320 08:45:38.848907 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-78d46df76d-rh79h" Mar 20 08:45:38 crc kubenswrapper[4903]: I0320 08:45:38.853718 4903 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cddb5fee-92f5-463f-a746-8a58e0a05e4b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 08:45:38 crc kubenswrapper[4903]: I0320 08:45:38.853965 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Mar 20 08:45:38 crc kubenswrapper[4903]: I0320 08:45:38.854252 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Mar 20 08:45:38 crc kubenswrapper[4903]: I0320 08:45:38.866462 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-78d46df76d-rh79h"] Mar 20 08:45:38 crc kubenswrapper[4903]: I0320 08:45:38.959322 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/edfd7894-cb6d-43bb-87ab-289c00d2a8f7-logs\") pod \"placement-78d46df76d-rh79h\" (UID: \"edfd7894-cb6d-43bb-87ab-289c00d2a8f7\") " pod="openstack/placement-78d46df76d-rh79h" Mar 20 08:45:38 crc kubenswrapper[4903]: I0320 08:45:38.959391 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/edfd7894-cb6d-43bb-87ab-289c00d2a8f7-public-tls-certs\") pod \"placement-78d46df76d-rh79h\" (UID: \"edfd7894-cb6d-43bb-87ab-289c00d2a8f7\") " pod="openstack/placement-78d46df76d-rh79h" Mar 20 08:45:38 crc kubenswrapper[4903]: I0320 08:45:38.959476 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/edfd7894-cb6d-43bb-87ab-289c00d2a8f7-scripts\") pod \"placement-78d46df76d-rh79h\" (UID: \"edfd7894-cb6d-43bb-87ab-289c00d2a8f7\") " pod="openstack/placement-78d46df76d-rh79h" Mar 20 08:45:38 crc kubenswrapper[4903]: I0320 08:45:38.959509 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/edfd7894-cb6d-43bb-87ab-289c00d2a8f7-combined-ca-bundle\") pod \"placement-78d46df76d-rh79h\" (UID: \"edfd7894-cb6d-43bb-87ab-289c00d2a8f7\") " pod="openstack/placement-78d46df76d-rh79h" Mar 20 08:45:38 crc kubenswrapper[4903]: I0320 08:45:38.960091 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c4q9k\" 
(UniqueName: \"kubernetes.io/projected/edfd7894-cb6d-43bb-87ab-289c00d2a8f7-kube-api-access-c4q9k\") pod \"placement-78d46df76d-rh79h\" (UID: \"edfd7894-cb6d-43bb-87ab-289c00d2a8f7\") " pod="openstack/placement-78d46df76d-rh79h" Mar 20 08:45:38 crc kubenswrapper[4903]: I0320 08:45:38.961471 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/edfd7894-cb6d-43bb-87ab-289c00d2a8f7-config-data\") pod \"placement-78d46df76d-rh79h\" (UID: \"edfd7894-cb6d-43bb-87ab-289c00d2a8f7\") " pod="openstack/placement-78d46df76d-rh79h" Mar 20 08:45:38 crc kubenswrapper[4903]: I0320 08:45:38.961570 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/edfd7894-cb6d-43bb-87ab-289c00d2a8f7-internal-tls-certs\") pod \"placement-78d46df76d-rh79h\" (UID: \"edfd7894-cb6d-43bb-87ab-289c00d2a8f7\") " pod="openstack/placement-78d46df76d-rh79h" Mar 20 08:45:39 crc kubenswrapper[4903]: I0320 08:45:39.065621 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/edfd7894-cb6d-43bb-87ab-289c00d2a8f7-logs\") pod \"placement-78d46df76d-rh79h\" (UID: \"edfd7894-cb6d-43bb-87ab-289c00d2a8f7\") " pod="openstack/placement-78d46df76d-rh79h" Mar 20 08:45:39 crc kubenswrapper[4903]: I0320 08:45:39.066193 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/edfd7894-cb6d-43bb-87ab-289c00d2a8f7-public-tls-certs\") pod \"placement-78d46df76d-rh79h\" (UID: \"edfd7894-cb6d-43bb-87ab-289c00d2a8f7\") " pod="openstack/placement-78d46df76d-rh79h" Mar 20 08:45:39 crc kubenswrapper[4903]: I0320 08:45:39.066250 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/edfd7894-cb6d-43bb-87ab-289c00d2a8f7-scripts\") pod \"placement-78d46df76d-rh79h\" (UID: \"edfd7894-cb6d-43bb-87ab-289c00d2a8f7\") " pod="openstack/placement-78d46df76d-rh79h" Mar 20 08:45:39 crc kubenswrapper[4903]: I0320 08:45:39.066271 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/edfd7894-cb6d-43bb-87ab-289c00d2a8f7-combined-ca-bundle\") pod \"placement-78d46df76d-rh79h\" (UID: \"edfd7894-cb6d-43bb-87ab-289c00d2a8f7\") " pod="openstack/placement-78d46df76d-rh79h" Mar 20 08:45:39 crc kubenswrapper[4903]: I0320 08:45:39.066334 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c4q9k\" (UniqueName: \"kubernetes.io/projected/edfd7894-cb6d-43bb-87ab-289c00d2a8f7-kube-api-access-c4q9k\") pod \"placement-78d46df76d-rh79h\" (UID: \"edfd7894-cb6d-43bb-87ab-289c00d2a8f7\") " pod="openstack/placement-78d46df76d-rh79h" Mar 20 08:45:39 crc kubenswrapper[4903]: I0320 08:45:39.066398 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/edfd7894-cb6d-43bb-87ab-289c00d2a8f7-config-data\") pod \"placement-78d46df76d-rh79h\" (UID: \"edfd7894-cb6d-43bb-87ab-289c00d2a8f7\") " pod="openstack/placement-78d46df76d-rh79h" Mar 20 08:45:39 crc kubenswrapper[4903]: I0320 08:45:39.066419 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/edfd7894-cb6d-43bb-87ab-289c00d2a8f7-internal-tls-certs\") pod \"placement-78d46df76d-rh79h\" (UID: \"edfd7894-cb6d-43bb-87ab-289c00d2a8f7\") " pod="openstack/placement-78d46df76d-rh79h" Mar 20 08:45:39 crc kubenswrapper[4903]: I0320 08:45:39.066120 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/edfd7894-cb6d-43bb-87ab-289c00d2a8f7-logs\") pod \"placement-78d46df76d-rh79h\" (UID: \"edfd7894-cb6d-43bb-87ab-289c00d2a8f7\") " pod="openstack/placement-78d46df76d-rh79h" Mar 20 08:45:39 crc kubenswrapper[4903]: I0320 08:45:39.072641 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/edfd7894-cb6d-43bb-87ab-289c00d2a8f7-scripts\") pod \"placement-78d46df76d-rh79h\" (UID: \"edfd7894-cb6d-43bb-87ab-289c00d2a8f7\") " pod="openstack/placement-78d46df76d-rh79h" Mar 20 08:45:39 crc kubenswrapper[4903]: I0320 08:45:39.072803 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/edfd7894-cb6d-43bb-87ab-289c00d2a8f7-config-data\") pod \"placement-78d46df76d-rh79h\" (UID: \"edfd7894-cb6d-43bb-87ab-289c00d2a8f7\") " pod="openstack/placement-78d46df76d-rh79h" Mar 20 08:45:39 crc kubenswrapper[4903]: I0320 08:45:39.083347 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/edfd7894-cb6d-43bb-87ab-289c00d2a8f7-internal-tls-certs\") pod \"placement-78d46df76d-rh79h\" (UID: \"edfd7894-cb6d-43bb-87ab-289c00d2a8f7\") " pod="openstack/placement-78d46df76d-rh79h" Mar 20 08:45:39 crc kubenswrapper[4903]: I0320 08:45:39.083565 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/edfd7894-cb6d-43bb-87ab-289c00d2a8f7-public-tls-certs\") pod \"placement-78d46df76d-rh79h\" (UID: \"edfd7894-cb6d-43bb-87ab-289c00d2a8f7\") " pod="openstack/placement-78d46df76d-rh79h" Mar 20 08:45:39 crc kubenswrapper[4903]: I0320 08:45:39.085671 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c4q9k\" (UniqueName: \"kubernetes.io/projected/edfd7894-cb6d-43bb-87ab-289c00d2a8f7-kube-api-access-c4q9k\") pod \"placement-78d46df76d-rh79h\" (UID: \"edfd7894-cb6d-43bb-87ab-289c00d2a8f7\") " pod="openstack/placement-78d46df76d-rh79h" Mar 20 08:45:39 crc kubenswrapper[4903]: I0320 08:45:39.086393 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/edfd7894-cb6d-43bb-87ab-289c00d2a8f7-combined-ca-bundle\") pod \"placement-78d46df76d-rh79h\" (UID: \"edfd7894-cb6d-43bb-87ab-289c00d2a8f7\") " pod="openstack/placement-78d46df76d-rh79h" Mar 20 08:45:39 crc kubenswrapper[4903]: I0320 08:45:39.240896 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-78d46df76d-rh79h" Mar 20 08:45:39 crc kubenswrapper[4903]: I0320 08:45:39.490356 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 20 08:45:39 crc kubenswrapper[4903]: I0320 08:45:39.490413 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 20 08:45:39 crc kubenswrapper[4903]: I0320 08:45:39.530095 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 20 08:45:39 crc kubenswrapper[4903]: I0320 08:45:39.540104 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 20 08:45:39 crc kubenswrapper[4903]: I0320 08:45:39.633626 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 20 08:45:39 crc kubenswrapper[4903]: I0320 08:45:39.633931 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 20 08:45:41 crc kubenswrapper[4903]: I0320 08:45:41.049570 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-n6bt8" Mar 20 08:45:41 crc kubenswrapper[4903]: I0320 08:45:41.087729 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-xws82" Mar 20 08:45:41 crc kubenswrapper[4903]: I0320 08:45:41.109004 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/be07fcc9-d6b5-4551-8846-94aa14b6af5d-db-sync-config-data\") pod \"be07fcc9-d6b5-4551-8846-94aa14b6af5d\" (UID: \"be07fcc9-d6b5-4551-8846-94aa14b6af5d\") " Mar 20 08:45:41 crc kubenswrapper[4903]: I0320 08:45:41.109099 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/032ee1ce-8cf1-4acd-a741-dc32f104065f-fernet-keys\") pod \"032ee1ce-8cf1-4acd-a741-dc32f104065f\" (UID: \"032ee1ce-8cf1-4acd-a741-dc32f104065f\") " Mar 20 08:45:41 crc kubenswrapper[4903]: I0320 08:45:41.109139 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/032ee1ce-8cf1-4acd-a741-dc32f104065f-scripts\") pod \"032ee1ce-8cf1-4acd-a741-dc32f104065f\" (UID: \"032ee1ce-8cf1-4acd-a741-dc32f104065f\") " Mar 20 08:45:41 crc kubenswrapper[4903]: I0320 08:45:41.109184 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/032ee1ce-8cf1-4acd-a741-dc32f104065f-credential-keys\") pod \"032ee1ce-8cf1-4acd-a741-dc32f104065f\" (UID: \"032ee1ce-8cf1-4acd-a741-dc32f104065f\") " Mar 20 08:45:41 crc kubenswrapper[4903]: I0320 08:45:41.109214 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mwscl\" (UniqueName: \"kubernetes.io/projected/be07fcc9-d6b5-4551-8846-94aa14b6af5d-kube-api-access-mwscl\") pod \"be07fcc9-d6b5-4551-8846-94aa14b6af5d\" (UID: \"be07fcc9-d6b5-4551-8846-94aa14b6af5d\") " Mar 20 08:45:41 crc kubenswrapper[4903]: I0320 08:45:41.109232 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/032ee1ce-8cf1-4acd-a741-dc32f104065f-config-data\") pod 
\"032ee1ce-8cf1-4acd-a741-dc32f104065f\" (UID: \"032ee1ce-8cf1-4acd-a741-dc32f104065f\") " Mar 20 08:45:41 crc kubenswrapper[4903]: I0320 08:45:41.109253 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be07fcc9-d6b5-4551-8846-94aa14b6af5d-combined-ca-bundle\") pod \"be07fcc9-d6b5-4551-8846-94aa14b6af5d\" (UID: \"be07fcc9-d6b5-4551-8846-94aa14b6af5d\") " Mar 20 08:45:41 crc kubenswrapper[4903]: I0320 08:45:41.109278 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/032ee1ce-8cf1-4acd-a741-dc32f104065f-combined-ca-bundle\") pod \"032ee1ce-8cf1-4acd-a741-dc32f104065f\" (UID: \"032ee1ce-8cf1-4acd-a741-dc32f104065f\") " Mar 20 08:45:41 crc kubenswrapper[4903]: I0320 08:45:41.109293 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vzcbt\" (UniqueName: \"kubernetes.io/projected/032ee1ce-8cf1-4acd-a741-dc32f104065f-kube-api-access-vzcbt\") pod \"032ee1ce-8cf1-4acd-a741-dc32f104065f\" (UID: \"032ee1ce-8cf1-4acd-a741-dc32f104065f\") " Mar 20 08:45:41 crc kubenswrapper[4903]: I0320 08:45:41.117812 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/032ee1ce-8cf1-4acd-a741-dc32f104065f-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "032ee1ce-8cf1-4acd-a741-dc32f104065f" (UID: "032ee1ce-8cf1-4acd-a741-dc32f104065f"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:45:41 crc kubenswrapper[4903]: I0320 08:45:41.118541 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/032ee1ce-8cf1-4acd-a741-dc32f104065f-scripts" (OuterVolumeSpecName: "scripts") pod "032ee1ce-8cf1-4acd-a741-dc32f104065f" (UID: "032ee1ce-8cf1-4acd-a741-dc32f104065f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:45:41 crc kubenswrapper[4903]: I0320 08:45:41.119635 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be07fcc9-d6b5-4551-8846-94aa14b6af5d-kube-api-access-mwscl" (OuterVolumeSpecName: "kube-api-access-mwscl") pod "be07fcc9-d6b5-4551-8846-94aa14b6af5d" (UID: "be07fcc9-d6b5-4551-8846-94aa14b6af5d"). InnerVolumeSpecName "kube-api-access-mwscl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:45:41 crc kubenswrapper[4903]: I0320 08:45:41.124473 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be07fcc9-d6b5-4551-8846-94aa14b6af5d-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "be07fcc9-d6b5-4551-8846-94aa14b6af5d" (UID: "be07fcc9-d6b5-4551-8846-94aa14b6af5d"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:45:41 crc kubenswrapper[4903]: I0320 08:45:41.125000 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/032ee1ce-8cf1-4acd-a741-dc32f104065f-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "032ee1ce-8cf1-4acd-a741-dc32f104065f" (UID: "032ee1ce-8cf1-4acd-a741-dc32f104065f"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:45:41 crc kubenswrapper[4903]: I0320 08:45:41.132341 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/032ee1ce-8cf1-4acd-a741-dc32f104065f-kube-api-access-vzcbt" (OuterVolumeSpecName: "kube-api-access-vzcbt") pod "032ee1ce-8cf1-4acd-a741-dc32f104065f" (UID: "032ee1ce-8cf1-4acd-a741-dc32f104065f"). InnerVolumeSpecName "kube-api-access-vzcbt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:45:41 crc kubenswrapper[4903]: I0320 08:45:41.173376 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/032ee1ce-8cf1-4acd-a741-dc32f104065f-config-data" (OuterVolumeSpecName: "config-data") pod "032ee1ce-8cf1-4acd-a741-dc32f104065f" (UID: "032ee1ce-8cf1-4acd-a741-dc32f104065f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:45:41 crc kubenswrapper[4903]: I0320 08:45:41.175785 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/032ee1ce-8cf1-4acd-a741-dc32f104065f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "032ee1ce-8cf1-4acd-a741-dc32f104065f" (UID: "032ee1ce-8cf1-4acd-a741-dc32f104065f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:45:41 crc kubenswrapper[4903]: I0320 08:45:41.176218 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be07fcc9-d6b5-4551-8846-94aa14b6af5d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "be07fcc9-d6b5-4551-8846-94aa14b6af5d" (UID: "be07fcc9-d6b5-4551-8846-94aa14b6af5d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:45:41 crc kubenswrapper[4903]: I0320 08:45:41.210354 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mwscl\" (UniqueName: \"kubernetes.io/projected/be07fcc9-d6b5-4551-8846-94aa14b6af5d-kube-api-access-mwscl\") on node \"crc\" DevicePath \"\"" Mar 20 08:45:41 crc kubenswrapper[4903]: I0320 08:45:41.210387 4903 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/032ee1ce-8cf1-4acd-a741-dc32f104065f-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 08:45:41 crc kubenswrapper[4903]: I0320 08:45:41.210398 4903 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be07fcc9-d6b5-4551-8846-94aa14b6af5d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 08:45:41 crc kubenswrapper[4903]: I0320 08:45:41.210406 4903 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/032ee1ce-8cf1-4acd-a741-dc32f104065f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 08:45:41 crc kubenswrapper[4903]: I0320 08:45:41.210417 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vzcbt\" (UniqueName: \"kubernetes.io/projected/032ee1ce-8cf1-4acd-a741-dc32f104065f-kube-api-access-vzcbt\") on node \"crc\" DevicePath \"\"" Mar 20 08:45:41 crc kubenswrapper[4903]: I0320 08:45:41.210425 4903 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/be07fcc9-d6b5-4551-8846-94aa14b6af5d-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 08:45:41 crc kubenswrapper[4903]: I0320 08:45:41.210432 4903 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/032ee1ce-8cf1-4acd-a741-dc32f104065f-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 20 08:45:41 crc kubenswrapper[4903]: I0320 08:45:41.210440 4903 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/032ee1ce-8cf1-4acd-a741-dc32f104065f-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 08:45:41 crc kubenswrapper[4903]: I0320 08:45:41.210448 4903 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/032ee1ce-8cf1-4acd-a741-dc32f104065f-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 20 08:45:41 crc kubenswrapper[4903]: I0320 08:45:41.334416 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-78d46df76d-rh79h"] Mar 20 08:45:41 crc kubenswrapper[4903]: I0320 08:45:41.650584 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-n6bt8" event={"ID":"032ee1ce-8cf1-4acd-a741-dc32f104065f","Type":"ContainerDied","Data":"293ecc876fecda6beec6d7096fda21147233b6bbcbd2feba8749b8e4f2501e14"} Mar 20 08:45:41 crc kubenswrapper[4903]: I0320 08:45:41.650955 4903 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="293ecc876fecda6beec6d7096fda21147233b6bbcbd2feba8749b8e4f2501e14" Mar 20 08:45:41 crc kubenswrapper[4903]: I0320 08:45:41.650861 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-n6bt8" Mar 20 08:45:41 crc kubenswrapper[4903]: I0320 08:45:41.654309 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"003c0ace-6aef-4bc2-bc02-358cf140d4ce","Type":"ContainerStarted","Data":"f62844ed56c7e0157412ee30b02d4935401b05855b172e4f150a32074e1b5271"} Mar 20 08:45:41 crc kubenswrapper[4903]: I0320 08:45:41.656377 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-78d46df76d-rh79h" event={"ID":"edfd7894-cb6d-43bb-87ab-289c00d2a8f7","Type":"ContainerStarted","Data":"cb4f4de67e2cde7b1eced841f79e2032becde10069a1a66d2e124e4fe96dfea5"} Mar 20 08:45:41 crc kubenswrapper[4903]: I0320 08:45:41.656550 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-78d46df76d-rh79h" event={"ID":"edfd7894-cb6d-43bb-87ab-289c00d2a8f7","Type":"ContainerStarted","Data":"0353a6ac7bf0dbb9ec86bef39422ab09771546f9a25a1b0e7d76b4e27cd00f31"} Mar 20 08:45:41 crc kubenswrapper[4903]: I0320 08:45:41.658904 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7b9df974b5-rb8w6" event={"ID":"3e64d519-16d2-48d3-8683-9da61bd19e2d","Type":"ContainerStarted","Data":"a80ef4791d38f5c70e3b66329fee2898f3f9305859e05349a8ed7e0e8454b362"} Mar 20 08:45:41 crc kubenswrapper[4903]: I0320 08:45:41.658940 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7b9df974b5-rb8w6" event={"ID":"3e64d519-16d2-48d3-8683-9da61bd19e2d","Type":"ContainerStarted","Data":"637459508e1cd8681366e71e955fbcb1a32426e408ddaa26278208437c4f5836"} Mar 20 08:45:41 crc kubenswrapper[4903]: I0320 08:45:41.659255 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-7b9df974b5-rb8w6" Mar 20 08:45:41 crc kubenswrapper[4903]: I0320 08:45:41.662823 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-xws82" Mar 20 08:45:41 crc kubenswrapper[4903]: I0320 08:45:41.662851 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-xws82" event={"ID":"be07fcc9-d6b5-4551-8846-94aa14b6af5d","Type":"ContainerDied","Data":"5480dc9a3a875b7c03b240c859141e92173003eede89c918d80eb10ff67f9ffe"} Mar 20 08:45:41 crc kubenswrapper[4903]: I0320 08:45:41.662926 4903 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5480dc9a3a875b7c03b240c859141e92173003eede89c918d80eb10ff67f9ffe" Mar 20 08:45:41 crc kubenswrapper[4903]: I0320 08:45:41.662832 4903 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 20 08:45:41 crc kubenswrapper[4903]: I0320 08:45:41.693474 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-7b9df974b5-rb8w6" podStartSLOduration=4.6934562589999995 podStartE2EDuration="4.693456259s" podCreationTimestamp="2026-03-20 08:45:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:45:41.689431045 +0000 UTC m=+1366.906331390" watchObservedRunningTime="2026-03-20 08:45:41.693456259 +0000 UTC m=+1366.910356574" Mar 20 08:45:41 crc kubenswrapper[4903]: I0320 08:45:41.897571 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 20 08:45:42 crc kubenswrapper[4903]: I0320 08:45:42.209243 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-7f784d4489-rxkmk"] Mar 20 08:45:42 crc kubenswrapper[4903]: E0320 08:45:42.210071 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be07fcc9-d6b5-4551-8846-94aa14b6af5d" containerName="barbican-db-sync" Mar 20 08:45:42 crc kubenswrapper[4903]: I0320 08:45:42.210095 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="be07fcc9-d6b5-4551-8846-94aa14b6af5d" containerName="barbican-db-sync" Mar 20 08:45:42 crc kubenswrapper[4903]: E0320 08:45:42.210161 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="032ee1ce-8cf1-4acd-a741-dc32f104065f" containerName="keystone-bootstrap" Mar 20 08:45:42 crc kubenswrapper[4903]: I0320 08:45:42.210176 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="032ee1ce-8cf1-4acd-a741-dc32f104065f" containerName="keystone-bootstrap" Mar 20 08:45:42 crc kubenswrapper[4903]: I0320 08:45:42.210794 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="032ee1ce-8cf1-4acd-a741-dc32f104065f" containerName="keystone-bootstrap" Mar 20 08:45:42 crc kubenswrapper[4903]: I0320 08:45:42.210828 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="be07fcc9-d6b5-4551-8846-94aa14b6af5d" containerName="barbican-db-sync" Mar 20 08:45:42 crc kubenswrapper[4903]: I0320 08:45:42.213962 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-7f784d4489-rxkmk" Mar 20 08:45:42 crc kubenswrapper[4903]: I0320 08:45:42.218106 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 20 08:45:42 crc kubenswrapper[4903]: I0320 08:45:42.220454 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-7f784d4489-rxkmk"] Mar 20 08:45:42 crc kubenswrapper[4903]: I0320 08:45:42.221341 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 20 08:45:42 crc kubenswrapper[4903]: I0320 08:45:42.221356 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 20 08:45:42 crc kubenswrapper[4903]: I0320 08:45:42.224997 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Mar 20 08:45:42 crc kubenswrapper[4903]: I0320 08:45:42.225208 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Mar 20 08:45:42 crc kubenswrapper[4903]: I0320 08:45:42.228826 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-rzqbd" Mar 20 08:45:42 crc kubenswrapper[4903]: I0320 08:45:42.279854 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-52v8h\" (UniqueName: \"kubernetes.io/projected/c94a513f-1b70-4705-af6c-3f71cb0e4272-kube-api-access-52v8h\") pod \"keystone-7f784d4489-rxkmk\" (UID: \"c94a513f-1b70-4705-af6c-3f71cb0e4272\") " pod="openstack/keystone-7f784d4489-rxkmk" Mar 20 08:45:42 crc kubenswrapper[4903]: I0320 08:45:42.280554 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c94a513f-1b70-4705-af6c-3f71cb0e4272-internal-tls-certs\") pod \"keystone-7f784d4489-rxkmk\" (UID: \"c94a513f-1b70-4705-af6c-3f71cb0e4272\") " pod="openstack/keystone-7f784d4489-rxkmk" Mar 20 08:45:42 crc kubenswrapper[4903]: I0320 08:45:42.280589 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c94a513f-1b70-4705-af6c-3f71cb0e4272-credential-keys\") pod \"keystone-7f784d4489-rxkmk\" (UID: \"c94a513f-1b70-4705-af6c-3f71cb0e4272\") " pod="openstack/keystone-7f784d4489-rxkmk" Mar 20 08:45:42 crc kubenswrapper[4903]: I0320 08:45:42.280636 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c94a513f-1b70-4705-af6c-3f71cb0e4272-scripts\") pod \"keystone-7f784d4489-rxkmk\" (UID: \"c94a513f-1b70-4705-af6c-3f71cb0e4272\") " pod="openstack/keystone-7f784d4489-rxkmk" Mar 20 08:45:42 crc kubenswrapper[4903]: I0320 08:45:42.280663 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c94a513f-1b70-4705-af6c-3f71cb0e4272-public-tls-certs\") pod \"keystone-7f784d4489-rxkmk\" (UID: \"c94a513f-1b70-4705-af6c-3f71cb0e4272\") " pod="openstack/keystone-7f784d4489-rxkmk" Mar 20 08:45:42 crc kubenswrapper[4903]: I0320 08:45:42.280698 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c94a513f-1b70-4705-af6c-3f71cb0e4272-combined-ca-bundle\") pod \"keystone-7f784d4489-rxkmk\" 
(UID: \"c94a513f-1b70-4705-af6c-3f71cb0e4272\") " pod="openstack/keystone-7f784d4489-rxkmk" Mar 20 08:45:42 crc kubenswrapper[4903]: I0320 08:45:42.280757 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c94a513f-1b70-4705-af6c-3f71cb0e4272-config-data\") pod \"keystone-7f784d4489-rxkmk\" (UID: \"c94a513f-1b70-4705-af6c-3f71cb0e4272\") " pod="openstack/keystone-7f784d4489-rxkmk" Mar 20 08:45:42 crc kubenswrapper[4903]: I0320 08:45:42.280835 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c94a513f-1b70-4705-af6c-3f71cb0e4272-fernet-keys\") pod \"keystone-7f784d4489-rxkmk\" (UID: \"c94a513f-1b70-4705-af6c-3f71cb0e4272\") " pod="openstack/keystone-7f784d4489-rxkmk" Mar 20 08:45:42 crc kubenswrapper[4903]: I0320 08:45:42.382745 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-52v8h\" (UniqueName: \"kubernetes.io/projected/c94a513f-1b70-4705-af6c-3f71cb0e4272-kube-api-access-52v8h\") pod \"keystone-7f784d4489-rxkmk\" (UID: \"c94a513f-1b70-4705-af6c-3f71cb0e4272\") " pod="openstack/keystone-7f784d4489-rxkmk" Mar 20 08:45:42 crc kubenswrapper[4903]: I0320 08:45:42.382815 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c94a513f-1b70-4705-af6c-3f71cb0e4272-internal-tls-certs\") pod \"keystone-7f784d4489-rxkmk\" (UID: \"c94a513f-1b70-4705-af6c-3f71cb0e4272\") " pod="openstack/keystone-7f784d4489-rxkmk" Mar 20 08:45:42 crc kubenswrapper[4903]: I0320 08:45:42.382852 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c94a513f-1b70-4705-af6c-3f71cb0e4272-credential-keys\") pod \"keystone-7f784d4489-rxkmk\" (UID: \"c94a513f-1b70-4705-af6c-3f71cb0e4272\") " pod="openstack/keystone-7f784d4489-rxkmk" Mar 20 08:45:42 crc kubenswrapper[4903]: I0320 08:45:42.382875 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c94a513f-1b70-4705-af6c-3f71cb0e4272-scripts\") pod \"keystone-7f784d4489-rxkmk\" (UID: \"c94a513f-1b70-4705-af6c-3f71cb0e4272\") " pod="openstack/keystone-7f784d4489-rxkmk" Mar 20 08:45:42 crc kubenswrapper[4903]: I0320 08:45:42.382894 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c94a513f-1b70-4705-af6c-3f71cb0e4272-public-tls-certs\") pod \"keystone-7f784d4489-rxkmk\" (UID: \"c94a513f-1b70-4705-af6c-3f71cb0e4272\") " pod="openstack/keystone-7f784d4489-rxkmk" Mar 20 08:45:42 crc kubenswrapper[4903]: I0320 08:45:42.382914 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c94a513f-1b70-4705-af6c-3f71cb0e4272-combined-ca-bundle\") pod \"keystone-7f784d4489-rxkmk\" (UID: \"c94a513f-1b70-4705-af6c-3f71cb0e4272\") " pod="openstack/keystone-7f784d4489-rxkmk" Mar 20 08:45:42 crc kubenswrapper[4903]: I0320 08:45:42.382952 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c94a513f-1b70-4705-af6c-3f71cb0e4272-config-data\") pod \"keystone-7f784d4489-rxkmk\" (UID: \"c94a513f-1b70-4705-af6c-3f71cb0e4272\") " 
pod="openstack/keystone-7f784d4489-rxkmk" Mar 20 08:45:42 crc kubenswrapper[4903]: I0320 08:45:42.383009 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c94a513f-1b70-4705-af6c-3f71cb0e4272-fernet-keys\") pod \"keystone-7f784d4489-rxkmk\" (UID: \"c94a513f-1b70-4705-af6c-3f71cb0e4272\") " pod="openstack/keystone-7f784d4489-rxkmk" Mar 20 08:45:42 crc kubenswrapper[4903]: I0320 08:45:42.395270 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-85dc675685-rvw97"] Mar 20 08:45:42 crc kubenswrapper[4903]: I0320 08:45:42.398502 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-85dc675685-rvw97" Mar 20 08:45:42 crc kubenswrapper[4903]: I0320 08:45:42.401566 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c94a513f-1b70-4705-af6c-3f71cb0e4272-scripts\") pod \"keystone-7f784d4489-rxkmk\" (UID: \"c94a513f-1b70-4705-af6c-3f71cb0e4272\") " pod="openstack/keystone-7f784d4489-rxkmk" Mar 20 08:45:42 crc kubenswrapper[4903]: I0320 08:45:42.401859 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c94a513f-1b70-4705-af6c-3f71cb0e4272-fernet-keys\") pod \"keystone-7f784d4489-rxkmk\" (UID: \"c94a513f-1b70-4705-af6c-3f71cb0e4272\") " pod="openstack/keystone-7f784d4489-rxkmk" Mar 20 08:45:42 crc kubenswrapper[4903]: I0320 08:45:42.415638 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c94a513f-1b70-4705-af6c-3f71cb0e4272-public-tls-certs\") pod \"keystone-7f784d4489-rxkmk\" (UID: \"c94a513f-1b70-4705-af6c-3f71cb0e4272\") " pod="openstack/keystone-7f784d4489-rxkmk" Mar 20 08:45:42 crc kubenswrapper[4903]: I0320 08:45:42.416236 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c94a513f-1b70-4705-af6c-3f71cb0e4272-combined-ca-bundle\") pod \"keystone-7f784d4489-rxkmk\" (UID: \"c94a513f-1b70-4705-af6c-3f71cb0e4272\") " pod="openstack/keystone-7f784d4489-rxkmk" Mar 20 08:45:42 crc kubenswrapper[4903]: I0320 08:45:42.416598 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c94a513f-1b70-4705-af6c-3f71cb0e4272-credential-keys\") pod \"keystone-7f784d4489-rxkmk\" (UID: \"c94a513f-1b70-4705-af6c-3f71cb0e4272\") " pod="openstack/keystone-7f784d4489-rxkmk" Mar 20 08:45:42 crc kubenswrapper[4903]: I0320 08:45:42.417058 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c94a513f-1b70-4705-af6c-3f71cb0e4272-internal-tls-certs\") pod \"keystone-7f784d4489-rxkmk\" (UID: \"c94a513f-1b70-4705-af6c-3f71cb0e4272\") " pod="openstack/keystone-7f784d4489-rxkmk" Mar 20 08:45:42 crc kubenswrapper[4903]: I0320 08:45:42.417700 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-pq9qx" Mar 20 08:45:42 crc kubenswrapper[4903]: I0320 08:45:42.418261 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Mar 20 08:45:42 crc kubenswrapper[4903]: I0320 08:45:42.418381 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Mar 20 08:45:42 crc 
kubenswrapper[4903]: I0320 08:45:42.423505 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-52v8h\" (UniqueName: \"kubernetes.io/projected/c94a513f-1b70-4705-af6c-3f71cb0e4272-kube-api-access-52v8h\") pod \"keystone-7f784d4489-rxkmk\" (UID: \"c94a513f-1b70-4705-af6c-3f71cb0e4272\") " pod="openstack/keystone-7f784d4489-rxkmk" Mar 20 08:45:42 crc kubenswrapper[4903]: I0320 08:45:42.429656 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c94a513f-1b70-4705-af6c-3f71cb0e4272-config-data\") pod \"keystone-7f784d4489-rxkmk\" (UID: \"c94a513f-1b70-4705-af6c-3f71cb0e4272\") " pod="openstack/keystone-7f784d4489-rxkmk" Mar 20 08:45:42 crc kubenswrapper[4903]: I0320 08:45:42.458507 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-85dc675685-rvw97"] Mar 20 08:45:42 crc kubenswrapper[4903]: I0320 08:45:42.488302 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b26d1664-feac-422f-a058-7e4f798a7b45-config-data-custom\") pod \"barbican-worker-85dc675685-rvw97\" (UID: \"b26d1664-feac-422f-a058-7e4f798a7b45\") " pod="openstack/barbican-worker-85dc675685-rvw97" Mar 20 08:45:42 crc kubenswrapper[4903]: I0320 08:45:42.488346 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b26d1664-feac-422f-a058-7e4f798a7b45-config-data\") pod \"barbican-worker-85dc675685-rvw97\" (UID: \"b26d1664-feac-422f-a058-7e4f798a7b45\") " pod="openstack/barbican-worker-85dc675685-rvw97" Mar 20 08:45:42 crc kubenswrapper[4903]: I0320 08:45:42.488411 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b26d1664-feac-422f-a058-7e4f798a7b45-logs\") pod \"barbican-worker-85dc675685-rvw97\" (UID: \"b26d1664-feac-422f-a058-7e4f798a7b45\") " pod="openstack/barbican-worker-85dc675685-rvw97" Mar 20 08:45:42 crc kubenswrapper[4903]: I0320 08:45:42.488490 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b26d1664-feac-422f-a058-7e4f798a7b45-combined-ca-bundle\") pod \"barbican-worker-85dc675685-rvw97\" (UID: \"b26d1664-feac-422f-a058-7e4f798a7b45\") " pod="openstack/barbican-worker-85dc675685-rvw97" Mar 20 08:45:42 crc kubenswrapper[4903]: I0320 08:45:42.488524 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mcpp5\" (UniqueName: \"kubernetes.io/projected/b26d1664-feac-422f-a058-7e4f798a7b45-kube-api-access-mcpp5\") pod \"barbican-worker-85dc675685-rvw97\" (UID: \"b26d1664-feac-422f-a058-7e4f798a7b45\") " pod="openstack/barbican-worker-85dc675685-rvw97" Mar 20 08:45:42 crc kubenswrapper[4903]: I0320 08:45:42.494528 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-978f7885d-bj8kg"] Mar 20 08:45:42 crc kubenswrapper[4903]: I0320 08:45:42.496046 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-978f7885d-bj8kg" Mar 20 08:45:42 crc kubenswrapper[4903]: I0320 08:45:42.504330 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Mar 20 08:45:42 crc kubenswrapper[4903]: I0320 08:45:42.508444 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-978f7885d-bj8kg"] Mar 20 08:45:42 crc kubenswrapper[4903]: I0320 08:45:42.544961 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 20 08:45:42 crc kubenswrapper[4903]: I0320 08:45:42.547339 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 20 08:45:42 crc kubenswrapper[4903]: I0320 08:45:42.560523 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-7f784d4489-rxkmk" Mar 20 08:45:42 crc kubenswrapper[4903]: I0320 08:45:42.582246 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-nx66r"] Mar 20 08:45:42 crc kubenswrapper[4903]: I0320 08:45:42.582552 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-55f844cf75-nx66r" podUID="38b394a1-0e10-4f80-a51b-dcba38ab7cdb" containerName="dnsmasq-dns" containerID="cri-o://87812fff190c2b49f33c39720b0c3df9d23cac42e6cb81120b89310adfbe3696" gracePeriod=10 Mar 20 08:45:42 crc kubenswrapper[4903]: I0320 08:45:42.593686 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-55f844cf75-nx66r" Mar 20 08:45:42 crc kubenswrapper[4903]: I0320 08:45:42.594821 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/caba59be-ac50-4fe8-9f72-cdbfe69ea01e-config-data-custom\") pod \"barbican-keystone-listener-978f7885d-bj8kg\" (UID: \"caba59be-ac50-4fe8-9f72-cdbfe69ea01e\") " pod="openstack/barbican-keystone-listener-978f7885d-bj8kg" Mar 20 08:45:42 crc kubenswrapper[4903]: I0320 08:45:42.594854 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b26d1664-feac-422f-a058-7e4f798a7b45-combined-ca-bundle\") pod \"barbican-worker-85dc675685-rvw97\" (UID: \"b26d1664-feac-422f-a058-7e4f798a7b45\") " pod="openstack/barbican-worker-85dc675685-rvw97" Mar 20 08:45:42 crc kubenswrapper[4903]: I0320 08:45:42.594891 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z6r8c\" (UniqueName: \"kubernetes.io/projected/caba59be-ac50-4fe8-9f72-cdbfe69ea01e-kube-api-access-z6r8c\") pod \"barbican-keystone-listener-978f7885d-bj8kg\" (UID: \"caba59be-ac50-4fe8-9f72-cdbfe69ea01e\") " pod="openstack/barbican-keystone-listener-978f7885d-bj8kg" Mar 20 08:45:42 crc kubenswrapper[4903]: I0320 08:45:42.594917 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mcpp5\" (UniqueName: \"kubernetes.io/projected/b26d1664-feac-422f-a058-7e4f798a7b45-kube-api-access-mcpp5\") pod \"barbican-worker-85dc675685-rvw97\" (UID: \"b26d1664-feac-422f-a058-7e4f798a7b45\") " pod="openstack/barbican-worker-85dc675685-rvw97" Mar 20 08:45:42 crc kubenswrapper[4903]: I0320 08:45:42.594995 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b26d1664-feac-422f-a058-7e4f798a7b45-config-data-custom\") pod \"barbican-worker-85dc675685-rvw97\" (UID: \"b26d1664-feac-422f-a058-7e4f798a7b45\") " pod="openstack/barbican-worker-85dc675685-rvw97" Mar 20 08:45:42 crc kubenswrapper[4903]: I0320 08:45:42.595013 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b26d1664-feac-422f-a058-7e4f798a7b45-config-data\") pod \"barbican-worker-85dc675685-rvw97\" (UID: \"b26d1664-feac-422f-a058-7e4f798a7b45\") " pod="openstack/barbican-worker-85dc675685-rvw97" Mar 20 08:45:42 crc kubenswrapper[4903]: I0320 08:45:42.595093 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/caba59be-ac50-4fe8-9f72-cdbfe69ea01e-logs\") pod \"barbican-keystone-listener-978f7885d-bj8kg\" (UID: \"caba59be-ac50-4fe8-9f72-cdbfe69ea01e\") " pod="openstack/barbican-keystone-listener-978f7885d-bj8kg" Mar 20 08:45:42 crc kubenswrapper[4903]: I0320 08:45:42.595114 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/caba59be-ac50-4fe8-9f72-cdbfe69ea01e-combined-ca-bundle\") pod \"barbican-keystone-listener-978f7885d-bj8kg\" (UID: \"caba59be-ac50-4fe8-9f72-cdbfe69ea01e\") " pod="openstack/barbican-keystone-listener-978f7885d-bj8kg" Mar 20 08:45:42 crc kubenswrapper[4903]: I0320 08:45:42.595139 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b26d1664-feac-422f-a058-7e4f798a7b45-logs\") pod \"barbican-worker-85dc675685-rvw97\" (UID: \"b26d1664-feac-422f-a058-7e4f798a7b45\") " pod="openstack/barbican-worker-85dc675685-rvw97" Mar 20 08:45:42 crc kubenswrapper[4903]: I0320 08:45:42.595189 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/caba59be-ac50-4fe8-9f72-cdbfe69ea01e-config-data\") pod \"barbican-keystone-listener-978f7885d-bj8kg\" (UID: \"caba59be-ac50-4fe8-9f72-cdbfe69ea01e\") " pod="openstack/barbican-keystone-listener-978f7885d-bj8kg" Mar 20 08:45:42 crc kubenswrapper[4903]: I0320 08:45:42.600344 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b26d1664-feac-422f-a058-7e4f798a7b45-logs\") pod \"barbican-worker-85dc675685-rvw97\" (UID: \"b26d1664-feac-422f-a058-7e4f798a7b45\") " pod="openstack/barbican-worker-85dc675685-rvw97" Mar 20 08:45:42 crc kubenswrapper[4903]: I0320 08:45:42.600473 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-7f568974ff-6t26g"] Mar 20 08:45:42 crc kubenswrapper[4903]: I0320 08:45:42.610380 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-7f568974ff-6t26g" Mar 20 08:45:42 crc kubenswrapper[4903]: I0320 08:45:42.624373 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b26d1664-feac-422f-a058-7e4f798a7b45-config-data-custom\") pod \"barbican-worker-85dc675685-rvw97\" (UID: \"b26d1664-feac-422f-a058-7e4f798a7b45\") " pod="openstack/barbican-worker-85dc675685-rvw97" Mar 20 08:45:42 crc kubenswrapper[4903]: I0320 08:45:42.624858 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b26d1664-feac-422f-a058-7e4f798a7b45-combined-ca-bundle\") pod \"barbican-worker-85dc675685-rvw97\" (UID: \"b26d1664-feac-422f-a058-7e4f798a7b45\") " pod="openstack/barbican-worker-85dc675685-rvw97" Mar 20 08:45:42 crc kubenswrapper[4903]: I0320 08:45:42.631439 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mcpp5\" (UniqueName: \"kubernetes.io/projected/b26d1664-feac-422f-a058-7e4f798a7b45-kube-api-access-mcpp5\") pod \"barbican-worker-85dc675685-rvw97\" (UID: \"b26d1664-feac-422f-a058-7e4f798a7b45\") " pod="openstack/barbican-worker-85dc675685-rvw97" Mar 20 08:45:42 crc kubenswrapper[4903]: I0320 08:45:42.661355 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 20 08:45:42 crc kubenswrapper[4903]: I0320 08:45:42.661440 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 20 08:45:42 crc kubenswrapper[4903]: I0320 08:45:42.662746 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b26d1664-feac-422f-a058-7e4f798a7b45-config-data\") pod \"barbican-worker-85dc675685-rvw97\" (UID: \"b26d1664-feac-422f-a058-7e4f798a7b45\") " pod="openstack/barbican-worker-85dc675685-rvw97" Mar 20 08:45:42 crc kubenswrapper[4903]: I0320 08:45:42.673198 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-7f568974ff-6t26g"] Mar 20 08:45:42 crc kubenswrapper[4903]: I0320 08:45:42.697309 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/caba59be-ac50-4fe8-9f72-cdbfe69ea01e-logs\") pod \"barbican-keystone-listener-978f7885d-bj8kg\" (UID: \"caba59be-ac50-4fe8-9f72-cdbfe69ea01e\") " pod="openstack/barbican-keystone-listener-978f7885d-bj8kg" Mar 20 08:45:42 crc kubenswrapper[4903]: I0320 08:45:42.697636 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ff49346f-602e-46f6-91c7-9c1966535720-config-data-custom\") pod \"barbican-worker-7f568974ff-6t26g\" (UID: \"ff49346f-602e-46f6-91c7-9c1966535720\") " pod="openstack/barbican-worker-7f568974ff-6t26g" Mar 20 08:45:42 crc kubenswrapper[4903]: I0320 08:45:42.697655 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n689p\" (UniqueName: \"kubernetes.io/projected/ff49346f-602e-46f6-91c7-9c1966535720-kube-api-access-n689p\") pod \"barbican-worker-7f568974ff-6t26g\" (UID: \"ff49346f-602e-46f6-91c7-9c1966535720\") " pod="openstack/barbican-worker-7f568974ff-6t26g" Mar 20 08:45:42 crc kubenswrapper[4903]: I0320 08:45:42.697699 4903 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/caba59be-ac50-4fe8-9f72-cdbfe69ea01e-combined-ca-bundle\") pod \"barbican-keystone-listener-978f7885d-bj8kg\" (UID: \"caba59be-ac50-4fe8-9f72-cdbfe69ea01e\") " pod="openstack/barbican-keystone-listener-978f7885d-bj8kg" Mar 20 08:45:42 crc kubenswrapper[4903]: I0320 08:45:42.697765 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/caba59be-ac50-4fe8-9f72-cdbfe69ea01e-config-data\") pod \"barbican-keystone-listener-978f7885d-bj8kg\" (UID: \"caba59be-ac50-4fe8-9f72-cdbfe69ea01e\") " pod="openstack/barbican-keystone-listener-978f7885d-bj8kg" Mar 20 08:45:42 crc kubenswrapper[4903]: I0320 08:45:42.697794 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff49346f-602e-46f6-91c7-9c1966535720-config-data\") pod \"barbican-worker-7f568974ff-6t26g\" (UID: \"ff49346f-602e-46f6-91c7-9c1966535720\") " pod="openstack/barbican-worker-7f568974ff-6t26g" Mar 20 08:45:42 crc kubenswrapper[4903]: I0320 08:45:42.697834 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/caba59be-ac50-4fe8-9f72-cdbfe69ea01e-config-data-custom\") pod \"barbican-keystone-listener-978f7885d-bj8kg\" (UID: \"caba59be-ac50-4fe8-9f72-cdbfe69ea01e\") " pod="openstack/barbican-keystone-listener-978f7885d-bj8kg" Mar 20 08:45:42 crc kubenswrapper[4903]: I0320 08:45:42.697862 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z6r8c\" (UniqueName: \"kubernetes.io/projected/caba59be-ac50-4fe8-9f72-cdbfe69ea01e-kube-api-access-z6r8c\") pod \"barbican-keystone-listener-978f7885d-bj8kg\" (UID: \"caba59be-ac50-4fe8-9f72-cdbfe69ea01e\") " pod="openstack/barbican-keystone-listener-978f7885d-bj8kg" Mar 20 08:45:42 crc kubenswrapper[4903]: I0320 08:45:42.697890 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff49346f-602e-46f6-91c7-9c1966535720-logs\") pod \"barbican-worker-7f568974ff-6t26g\" (UID: \"ff49346f-602e-46f6-91c7-9c1966535720\") " pod="openstack/barbican-worker-7f568974ff-6t26g" Mar 20 08:45:42 crc kubenswrapper[4903]: I0320 08:45:42.697915 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff49346f-602e-46f6-91c7-9c1966535720-combined-ca-bundle\") pod \"barbican-worker-7f568974ff-6t26g\" (UID: \"ff49346f-602e-46f6-91c7-9c1966535720\") " pod="openstack/barbican-worker-7f568974ff-6t26g" Mar 20 08:45:42 crc kubenswrapper[4903]: I0320 08:45:42.698325 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/caba59be-ac50-4fe8-9f72-cdbfe69ea01e-logs\") pod \"barbican-keystone-listener-978f7885d-bj8kg\" (UID: \"caba59be-ac50-4fe8-9f72-cdbfe69ea01e\") " pod="openstack/barbican-keystone-listener-978f7885d-bj8kg" Mar 20 08:45:42 crc kubenswrapper[4903]: I0320 08:45:42.715102 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/caba59be-ac50-4fe8-9f72-cdbfe69ea01e-combined-ca-bundle\") pod \"barbican-keystone-listener-978f7885d-bj8kg\" (UID: \"caba59be-ac50-4fe8-9f72-cdbfe69ea01e\") " 
pod="openstack/barbican-keystone-listener-978f7885d-bj8kg" Mar 20 08:45:42 crc kubenswrapper[4903]: I0320 08:45:42.715491 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-6d9bbc5dbb-w66cc"] Mar 20 08:45:42 crc kubenswrapper[4903]: I0320 08:45:42.718391 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-6d9bbc5dbb-w66cc" Mar 20 08:45:42 crc kubenswrapper[4903]: I0320 08:45:42.722524 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/caba59be-ac50-4fe8-9f72-cdbfe69ea01e-config-data\") pod \"barbican-keystone-listener-978f7885d-bj8kg\" (UID: \"caba59be-ac50-4fe8-9f72-cdbfe69ea01e\") " pod="openstack/barbican-keystone-listener-978f7885d-bj8kg" Mar 20 08:45:42 crc kubenswrapper[4903]: I0320 08:45:42.727652 4903 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 20 08:45:42 crc kubenswrapper[4903]: I0320 08:45:42.728529 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/caba59be-ac50-4fe8-9f72-cdbfe69ea01e-config-data-custom\") pod \"barbican-keystone-listener-978f7885d-bj8kg\" (UID: \"caba59be-ac50-4fe8-9f72-cdbfe69ea01e\") " pod="openstack/barbican-keystone-listener-978f7885d-bj8kg" Mar 20 08:45:42 crc kubenswrapper[4903]: I0320 08:45:42.728691 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-78d46df76d-rh79h" event={"ID":"edfd7894-cb6d-43bb-87ab-289c00d2a8f7","Type":"ContainerStarted","Data":"f03767fef05a731d00ae668183b5270fada9ddab13c73cdefd1db6d77b290014"} Mar 20 08:45:42 crc kubenswrapper[4903]: I0320 08:45:42.728726 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 20 08:45:42 crc kubenswrapper[4903]: I0320 08:45:42.728739 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 20 08:45:42 crc kubenswrapper[4903]: I0320 08:45:42.728763 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-78d46df76d-rh79h" Mar 20 08:45:42 crc kubenswrapper[4903]: I0320 08:45:42.729652 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z6r8c\" (UniqueName: \"kubernetes.io/projected/caba59be-ac50-4fe8-9f72-cdbfe69ea01e-kube-api-access-z6r8c\") pod \"barbican-keystone-listener-978f7885d-bj8kg\" (UID: \"caba59be-ac50-4fe8-9f72-cdbfe69ea01e\") " pod="openstack/barbican-keystone-listener-978f7885d-bj8kg" Mar 20 08:45:42 crc kubenswrapper[4903]: I0320 08:45:42.729831 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-78d46df76d-rh79h" Mar 20 08:45:42 crc kubenswrapper[4903]: I0320 08:45:42.767144 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-6d9bbc5dbb-w66cc"] Mar 20 08:45:42 crc kubenswrapper[4903]: I0320 08:45:42.799260 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ff49346f-602e-46f6-91c7-9c1966535720-config-data-custom\") pod \"barbican-worker-7f568974ff-6t26g\" (UID: \"ff49346f-602e-46f6-91c7-9c1966535720\") " pod="openstack/barbican-worker-7f568974ff-6t26g" Mar 20 08:45:42 crc kubenswrapper[4903]: I0320 08:45:42.799296 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-n689p\" (UniqueName: \"kubernetes.io/projected/ff49346f-602e-46f6-91c7-9c1966535720-kube-api-access-n689p\") pod \"barbican-worker-7f568974ff-6t26g\" (UID: \"ff49346f-602e-46f6-91c7-9c1966535720\") " pod="openstack/barbican-worker-7f568974ff-6t26g" Mar 20 08:45:42 crc kubenswrapper[4903]: I0320 08:45:42.799358 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff49346f-602e-46f6-91c7-9c1966535720-config-data\") pod \"barbican-worker-7f568974ff-6t26g\" (UID: \"ff49346f-602e-46f6-91c7-9c1966535720\") " pod="openstack/barbican-worker-7f568974ff-6t26g" Mar 20 08:45:42 crc kubenswrapper[4903]: I0320 08:45:42.799418 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff49346f-602e-46f6-91c7-9c1966535720-logs\") pod \"barbican-worker-7f568974ff-6t26g\" (UID: \"ff49346f-602e-46f6-91c7-9c1966535720\") " pod="openstack/barbican-worker-7f568974ff-6t26g" Mar 20 08:45:42 crc kubenswrapper[4903]: I0320 08:45:42.799456 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff49346f-602e-46f6-91c7-9c1966535720-combined-ca-bundle\") pod \"barbican-worker-7f568974ff-6t26g\" (UID: \"ff49346f-602e-46f6-91c7-9c1966535720\") " pod="openstack/barbican-worker-7f568974ff-6t26g" Mar 20 08:45:42 crc kubenswrapper[4903]: I0320 08:45:42.801115 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff49346f-602e-46f6-91c7-9c1966535720-logs\") pod \"barbican-worker-7f568974ff-6t26g\" (UID: \"ff49346f-602e-46f6-91c7-9c1966535720\") " pod="openstack/barbican-worker-7f568974ff-6t26g" Mar 20 08:45:42 crc kubenswrapper[4903]: I0320 08:45:42.802600 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-qfmx5"] Mar 20 08:45:42 crc kubenswrapper[4903]: I0320 08:45:42.804587 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-qfmx5" Mar 20 08:45:42 crc kubenswrapper[4903]: I0320 08:45:42.813905 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ff49346f-602e-46f6-91c7-9c1966535720-config-data-custom\") pod \"barbican-worker-7f568974ff-6t26g\" (UID: \"ff49346f-602e-46f6-91c7-9c1966535720\") " pod="openstack/barbican-worker-7f568974ff-6t26g" Mar 20 08:45:42 crc kubenswrapper[4903]: I0320 08:45:42.816791 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff49346f-602e-46f6-91c7-9c1966535720-config-data\") pod \"barbican-worker-7f568974ff-6t26g\" (UID: \"ff49346f-602e-46f6-91c7-9c1966535720\") " pod="openstack/barbican-worker-7f568974ff-6t26g" Mar 20 08:45:42 crc kubenswrapper[4903]: I0320 08:45:42.824126 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff49346f-602e-46f6-91c7-9c1966535720-combined-ca-bundle\") pod \"barbican-worker-7f568974ff-6t26g\" (UID: \"ff49346f-602e-46f6-91c7-9c1966535720\") " pod="openstack/barbican-worker-7f568974ff-6t26g" Mar 20 08:45:42 crc kubenswrapper[4903]: I0320 08:45:42.834598 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-85dc675685-rvw97" Mar 20 08:45:42 crc kubenswrapper[4903]: I0320 08:45:42.835316 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-978f7885d-bj8kg" Mar 20 08:45:42 crc kubenswrapper[4903]: I0320 08:45:42.862953 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-qfmx5"] Mar 20 08:45:42 crc kubenswrapper[4903]: I0320 08:45:42.880085 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-75dfbf8d4b-8vdlx"] Mar 20 08:45:42 crc kubenswrapper[4903]: I0320 08:45:42.894508 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-75dfbf8d4b-8vdlx" Mar 20 08:45:42 crc kubenswrapper[4903]: I0320 08:45:42.897316 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Mar 20 08:45:42 crc kubenswrapper[4903]: I0320 08:45:42.908085 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fhddf\" (UniqueName: \"kubernetes.io/projected/31664b72-a142-4656-88e8-84dd0cf18647-kube-api-access-fhddf\") pod \"barbican-keystone-listener-6d9bbc5dbb-w66cc\" (UID: \"31664b72-a142-4656-88e8-84dd0cf18647\") " pod="openstack/barbican-keystone-listener-6d9bbc5dbb-w66cc" Mar 20 08:45:42 crc kubenswrapper[4903]: I0320 08:45:42.908188 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/31664b72-a142-4656-88e8-84dd0cf18647-logs\") pod \"barbican-keystone-listener-6d9bbc5dbb-w66cc\" (UID: \"31664b72-a142-4656-88e8-84dd0cf18647\") " pod="openstack/barbican-keystone-listener-6d9bbc5dbb-w66cc" Mar 20 08:45:42 crc kubenswrapper[4903]: I0320 08:45:42.908234 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/31664b72-a142-4656-88e8-84dd0cf18647-config-data-custom\") pod \"barbican-keystone-listener-6d9bbc5dbb-w66cc\" (UID: \"31664b72-a142-4656-88e8-84dd0cf18647\") " pod="openstack/barbican-keystone-listener-6d9bbc5dbb-w66cc" Mar 20 08:45:42 crc kubenswrapper[4903]: I0320 08:45:42.908259 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31664b72-a142-4656-88e8-84dd0cf18647-config-data\") pod \"barbican-keystone-listener-6d9bbc5dbb-w66cc\" (UID: \"31664b72-a142-4656-88e8-84dd0cf18647\") " pod="openstack/barbican-keystone-listener-6d9bbc5dbb-w66cc" Mar 20 08:45:42 crc kubenswrapper[4903]: I0320 08:45:42.908309 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31664b72-a142-4656-88e8-84dd0cf18647-combined-ca-bundle\") pod \"barbican-keystone-listener-6d9bbc5dbb-w66cc\" (UID: \"31664b72-a142-4656-88e8-84dd0cf18647\") " pod="openstack/barbican-keystone-listener-6d9bbc5dbb-w66cc" Mar 20 08:45:42 crc kubenswrapper[4903]: I0320 08:45:42.911652 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n689p\" (UniqueName: \"kubernetes.io/projected/ff49346f-602e-46f6-91c7-9c1966535720-kube-api-access-n689p\") pod \"barbican-worker-7f568974ff-6t26g\" (UID: \"ff49346f-602e-46f6-91c7-9c1966535720\") " pod="openstack/barbican-worker-7f568974ff-6t26g" Mar 20 
08:45:42 crc kubenswrapper[4903]: I0320 08:45:42.915786 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-7f568974ff-6t26g" Mar 20 08:45:42 crc kubenswrapper[4903]: I0320 08:45:42.943318 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-75dfbf8d4b-8vdlx"] Mar 20 08:45:42 crc kubenswrapper[4903]: I0320 08:45:42.980329 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-78d46df76d-rh79h" podStartSLOduration=4.980302146 podStartE2EDuration="4.980302146s" podCreationTimestamp="2026-03-20 08:45:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:45:42.862446103 +0000 UTC m=+1368.079346418" watchObservedRunningTime="2026-03-20 08:45:42.980302146 +0000 UTC m=+1368.197202461" Mar 20 08:45:43 crc kubenswrapper[4903]: I0320 08:45:43.010144 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8946b4f4-2d1b-44b8-9f54-ec31fe82ac58-config\") pod \"dnsmasq-dns-85ff748b95-qfmx5\" (UID: \"8946b4f4-2d1b-44b8-9f54-ec31fe82ac58\") " pod="openstack/dnsmasq-dns-85ff748b95-qfmx5" Mar 20 08:45:43 crc kubenswrapper[4903]: I0320 08:45:43.010601 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/079887bb-d4be-4d23-bf74-3332bfd2f7cb-logs\") pod \"barbican-api-75dfbf8d4b-8vdlx\" (UID: \"079887bb-d4be-4d23-bf74-3332bfd2f7cb\") " pod="openstack/barbican-api-75dfbf8d4b-8vdlx" Mar 20 08:45:43 crc kubenswrapper[4903]: I0320 08:45:43.010768 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f26x5\" (UniqueName: \"kubernetes.io/projected/8946b4f4-2d1b-44b8-9f54-ec31fe82ac58-kube-api-access-f26x5\") pod \"dnsmasq-dns-85ff748b95-qfmx5\" (UID: \"8946b4f4-2d1b-44b8-9f54-ec31fe82ac58\") " pod="openstack/dnsmasq-dns-85ff748b95-qfmx5" Mar 20 08:45:43 crc kubenswrapper[4903]: I0320 08:45:43.010851 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fhddf\" (UniqueName: \"kubernetes.io/projected/31664b72-a142-4656-88e8-84dd0cf18647-kube-api-access-fhddf\") pod \"barbican-keystone-listener-6d9bbc5dbb-w66cc\" (UID: \"31664b72-a142-4656-88e8-84dd0cf18647\") " pod="openstack/barbican-keystone-listener-6d9bbc5dbb-w66cc" Mar 20 08:45:43 crc kubenswrapper[4903]: I0320 08:45:43.010934 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8946b4f4-2d1b-44b8-9f54-ec31fe82ac58-dns-swift-storage-0\") pod \"dnsmasq-dns-85ff748b95-qfmx5\" (UID: \"8946b4f4-2d1b-44b8-9f54-ec31fe82ac58\") " pod="openstack/dnsmasq-dns-85ff748b95-qfmx5" Mar 20 08:45:43 crc kubenswrapper[4903]: I0320 08:45:43.011071 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/31664b72-a142-4656-88e8-84dd0cf18647-logs\") pod \"barbican-keystone-listener-6d9bbc5dbb-w66cc\" (UID: \"31664b72-a142-4656-88e8-84dd0cf18647\") " pod="openstack/barbican-keystone-listener-6d9bbc5dbb-w66cc" Mar 20 08:45:43 crc kubenswrapper[4903]: I0320 08:45:43.011160 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" 
(UniqueName: \"kubernetes.io/secret/079887bb-d4be-4d23-bf74-3332bfd2f7cb-config-data-custom\") pod \"barbican-api-75dfbf8d4b-8vdlx\" (UID: \"079887bb-d4be-4d23-bf74-3332bfd2f7cb\") " pod="openstack/barbican-api-75dfbf8d4b-8vdlx" Mar 20 08:45:43 crc kubenswrapper[4903]: I0320 08:45:43.011230 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8946b4f4-2d1b-44b8-9f54-ec31fe82ac58-ovsdbserver-sb\") pod \"dnsmasq-dns-85ff748b95-qfmx5\" (UID: \"8946b4f4-2d1b-44b8-9f54-ec31fe82ac58\") " pod="openstack/dnsmasq-dns-85ff748b95-qfmx5" Mar 20 08:45:43 crc kubenswrapper[4903]: I0320 08:45:43.011304 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/31664b72-a142-4656-88e8-84dd0cf18647-config-data-custom\") pod \"barbican-keystone-listener-6d9bbc5dbb-w66cc\" (UID: \"31664b72-a142-4656-88e8-84dd0cf18647\") " pod="openstack/barbican-keystone-listener-6d9bbc5dbb-w66cc" Mar 20 08:45:43 crc kubenswrapper[4903]: I0320 08:45:43.011375 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31664b72-a142-4656-88e8-84dd0cf18647-config-data\") pod \"barbican-keystone-listener-6d9bbc5dbb-w66cc\" (UID: \"31664b72-a142-4656-88e8-84dd0cf18647\") " pod="openstack/barbican-keystone-listener-6d9bbc5dbb-w66cc" Mar 20 08:45:43 crc kubenswrapper[4903]: I0320 08:45:43.011451 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31664b72-a142-4656-88e8-84dd0cf18647-combined-ca-bundle\") pod \"barbican-keystone-listener-6d9bbc5dbb-w66cc\" (UID: \"31664b72-a142-4656-88e8-84dd0cf18647\") " pod="openstack/barbican-keystone-listener-6d9bbc5dbb-w66cc" Mar 20 08:45:43 crc kubenswrapper[4903]: I0320 08:45:43.011545 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/079887bb-d4be-4d23-bf74-3332bfd2f7cb-config-data\") pod \"barbican-api-75dfbf8d4b-8vdlx\" (UID: \"079887bb-d4be-4d23-bf74-3332bfd2f7cb\") " pod="openstack/barbican-api-75dfbf8d4b-8vdlx" Mar 20 08:45:43 crc kubenswrapper[4903]: I0320 08:45:43.011612 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8946b4f4-2d1b-44b8-9f54-ec31fe82ac58-dns-svc\") pod \"dnsmasq-dns-85ff748b95-qfmx5\" (UID: \"8946b4f4-2d1b-44b8-9f54-ec31fe82ac58\") " pod="openstack/dnsmasq-dns-85ff748b95-qfmx5" Mar 20 08:45:43 crc kubenswrapper[4903]: I0320 08:45:43.011684 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/079887bb-d4be-4d23-bf74-3332bfd2f7cb-combined-ca-bundle\") pod \"barbican-api-75dfbf8d4b-8vdlx\" (UID: \"079887bb-d4be-4d23-bf74-3332bfd2f7cb\") " pod="openstack/barbican-api-75dfbf8d4b-8vdlx" Mar 20 08:45:43 crc kubenswrapper[4903]: I0320 08:45:43.011754 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nfbt2\" (UniqueName: \"kubernetes.io/projected/079887bb-d4be-4d23-bf74-3332bfd2f7cb-kube-api-access-nfbt2\") pod \"barbican-api-75dfbf8d4b-8vdlx\" (UID: \"079887bb-d4be-4d23-bf74-3332bfd2f7cb\") " pod="openstack/barbican-api-75dfbf8d4b-8vdlx" Mar 20 08:45:43 crc 
kubenswrapper[4903]: I0320 08:45:43.011823 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8946b4f4-2d1b-44b8-9f54-ec31fe82ac58-ovsdbserver-nb\") pod \"dnsmasq-dns-85ff748b95-qfmx5\" (UID: \"8946b4f4-2d1b-44b8-9f54-ec31fe82ac58\") " pod="openstack/dnsmasq-dns-85ff748b95-qfmx5" Mar 20 08:45:43 crc kubenswrapper[4903]: I0320 08:45:43.014532 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/31664b72-a142-4656-88e8-84dd0cf18647-logs\") pod \"barbican-keystone-listener-6d9bbc5dbb-w66cc\" (UID: \"31664b72-a142-4656-88e8-84dd0cf18647\") " pod="openstack/barbican-keystone-listener-6d9bbc5dbb-w66cc" Mar 20 08:45:43 crc kubenswrapper[4903]: I0320 08:45:43.023909 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31664b72-a142-4656-88e8-84dd0cf18647-combined-ca-bundle\") pod \"barbican-keystone-listener-6d9bbc5dbb-w66cc\" (UID: \"31664b72-a142-4656-88e8-84dd0cf18647\") " pod="openstack/barbican-keystone-listener-6d9bbc5dbb-w66cc" Mar 20 08:45:43 crc kubenswrapper[4903]: I0320 08:45:43.028922 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/31664b72-a142-4656-88e8-84dd0cf18647-config-data-custom\") pod \"barbican-keystone-listener-6d9bbc5dbb-w66cc\" (UID: \"31664b72-a142-4656-88e8-84dd0cf18647\") " pod="openstack/barbican-keystone-listener-6d9bbc5dbb-w66cc" Mar 20 08:45:43 crc kubenswrapper[4903]: I0320 08:45:43.032014 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31664b72-a142-4656-88e8-84dd0cf18647-config-data\") pod \"barbican-keystone-listener-6d9bbc5dbb-w66cc\" (UID: \"31664b72-a142-4656-88e8-84dd0cf18647\") " pod="openstack/barbican-keystone-listener-6d9bbc5dbb-w66cc" Mar 20 08:45:43 crc kubenswrapper[4903]: I0320 08:45:43.038569 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fhddf\" (UniqueName: \"kubernetes.io/projected/31664b72-a142-4656-88e8-84dd0cf18647-kube-api-access-fhddf\") pod \"barbican-keystone-listener-6d9bbc5dbb-w66cc\" (UID: \"31664b72-a142-4656-88e8-84dd0cf18647\") " pod="openstack/barbican-keystone-listener-6d9bbc5dbb-w66cc" Mar 20 08:45:43 crc kubenswrapper[4903]: I0320 08:45:43.142137 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/079887bb-d4be-4d23-bf74-3332bfd2f7cb-logs\") pod \"barbican-api-75dfbf8d4b-8vdlx\" (UID: \"079887bb-d4be-4d23-bf74-3332bfd2f7cb\") " pod="openstack/barbican-api-75dfbf8d4b-8vdlx" Mar 20 08:45:43 crc kubenswrapper[4903]: I0320 08:45:43.142205 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f26x5\" (UniqueName: \"kubernetes.io/projected/8946b4f4-2d1b-44b8-9f54-ec31fe82ac58-kube-api-access-f26x5\") pod \"dnsmasq-dns-85ff748b95-qfmx5\" (UID: \"8946b4f4-2d1b-44b8-9f54-ec31fe82ac58\") " pod="openstack/dnsmasq-dns-85ff748b95-qfmx5" Mar 20 08:45:43 crc kubenswrapper[4903]: I0320 08:45:43.142241 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8946b4f4-2d1b-44b8-9f54-ec31fe82ac58-dns-swift-storage-0\") pod \"dnsmasq-dns-85ff748b95-qfmx5\" (UID: 
\"8946b4f4-2d1b-44b8-9f54-ec31fe82ac58\") " pod="openstack/dnsmasq-dns-85ff748b95-qfmx5" Mar 20 08:45:43 crc kubenswrapper[4903]: I0320 08:45:43.142279 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/079887bb-d4be-4d23-bf74-3332bfd2f7cb-config-data-custom\") pod \"barbican-api-75dfbf8d4b-8vdlx\" (UID: \"079887bb-d4be-4d23-bf74-3332bfd2f7cb\") " pod="openstack/barbican-api-75dfbf8d4b-8vdlx" Mar 20 08:45:43 crc kubenswrapper[4903]: I0320 08:45:43.142301 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8946b4f4-2d1b-44b8-9f54-ec31fe82ac58-ovsdbserver-sb\") pod \"dnsmasq-dns-85ff748b95-qfmx5\" (UID: \"8946b4f4-2d1b-44b8-9f54-ec31fe82ac58\") " pod="openstack/dnsmasq-dns-85ff748b95-qfmx5" Mar 20 08:45:43 crc kubenswrapper[4903]: I0320 08:45:43.142355 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/079887bb-d4be-4d23-bf74-3332bfd2f7cb-config-data\") pod \"barbican-api-75dfbf8d4b-8vdlx\" (UID: \"079887bb-d4be-4d23-bf74-3332bfd2f7cb\") " pod="openstack/barbican-api-75dfbf8d4b-8vdlx" Mar 20 08:45:43 crc kubenswrapper[4903]: I0320 08:45:43.142373 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8946b4f4-2d1b-44b8-9f54-ec31fe82ac58-dns-svc\") pod \"dnsmasq-dns-85ff748b95-qfmx5\" (UID: \"8946b4f4-2d1b-44b8-9f54-ec31fe82ac58\") " pod="openstack/dnsmasq-dns-85ff748b95-qfmx5" Mar 20 08:45:43 crc kubenswrapper[4903]: I0320 08:45:43.142393 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/079887bb-d4be-4d23-bf74-3332bfd2f7cb-combined-ca-bundle\") pod \"barbican-api-75dfbf8d4b-8vdlx\" (UID: \"079887bb-d4be-4d23-bf74-3332bfd2f7cb\") " pod="openstack/barbican-api-75dfbf8d4b-8vdlx" Mar 20 08:45:43 crc kubenswrapper[4903]: I0320 08:45:43.142414 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nfbt2\" (UniqueName: \"kubernetes.io/projected/079887bb-d4be-4d23-bf74-3332bfd2f7cb-kube-api-access-nfbt2\") pod \"barbican-api-75dfbf8d4b-8vdlx\" (UID: \"079887bb-d4be-4d23-bf74-3332bfd2f7cb\") " pod="openstack/barbican-api-75dfbf8d4b-8vdlx" Mar 20 08:45:43 crc kubenswrapper[4903]: I0320 08:45:43.142437 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8946b4f4-2d1b-44b8-9f54-ec31fe82ac58-ovsdbserver-nb\") pod \"dnsmasq-dns-85ff748b95-qfmx5\" (UID: \"8946b4f4-2d1b-44b8-9f54-ec31fe82ac58\") " pod="openstack/dnsmasq-dns-85ff748b95-qfmx5" Mar 20 08:45:43 crc kubenswrapper[4903]: I0320 08:45:43.142461 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8946b4f4-2d1b-44b8-9f54-ec31fe82ac58-config\") pod \"dnsmasq-dns-85ff748b95-qfmx5\" (UID: \"8946b4f4-2d1b-44b8-9f54-ec31fe82ac58\") " pod="openstack/dnsmasq-dns-85ff748b95-qfmx5" Mar 20 08:45:43 crc kubenswrapper[4903]: I0320 08:45:43.142713 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/079887bb-d4be-4d23-bf74-3332bfd2f7cb-logs\") pod \"barbican-api-75dfbf8d4b-8vdlx\" (UID: \"079887bb-d4be-4d23-bf74-3332bfd2f7cb\") " 
pod="openstack/barbican-api-75dfbf8d4b-8vdlx" Mar 20 08:45:43 crc kubenswrapper[4903]: I0320 08:45:43.145521 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8946b4f4-2d1b-44b8-9f54-ec31fe82ac58-config\") pod \"dnsmasq-dns-85ff748b95-qfmx5\" (UID: \"8946b4f4-2d1b-44b8-9f54-ec31fe82ac58\") " pod="openstack/dnsmasq-dns-85ff748b95-qfmx5" Mar 20 08:45:43 crc kubenswrapper[4903]: I0320 08:45:43.146870 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8946b4f4-2d1b-44b8-9f54-ec31fe82ac58-dns-swift-storage-0\") pod \"dnsmasq-dns-85ff748b95-qfmx5\" (UID: \"8946b4f4-2d1b-44b8-9f54-ec31fe82ac58\") " pod="openstack/dnsmasq-dns-85ff748b95-qfmx5" Mar 20 08:45:43 crc kubenswrapper[4903]: I0320 08:45:43.149435 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8946b4f4-2d1b-44b8-9f54-ec31fe82ac58-dns-svc\") pod \"dnsmasq-dns-85ff748b95-qfmx5\" (UID: \"8946b4f4-2d1b-44b8-9f54-ec31fe82ac58\") " pod="openstack/dnsmasq-dns-85ff748b95-qfmx5" Mar 20 08:45:43 crc kubenswrapper[4903]: I0320 08:45:43.151648 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8946b4f4-2d1b-44b8-9f54-ec31fe82ac58-ovsdbserver-nb\") pod \"dnsmasq-dns-85ff748b95-qfmx5\" (UID: \"8946b4f4-2d1b-44b8-9f54-ec31fe82ac58\") " pod="openstack/dnsmasq-dns-85ff748b95-qfmx5" Mar 20 08:45:43 crc kubenswrapper[4903]: I0320 08:45:43.164045 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8946b4f4-2d1b-44b8-9f54-ec31fe82ac58-ovsdbserver-sb\") pod \"dnsmasq-dns-85ff748b95-qfmx5\" (UID: \"8946b4f4-2d1b-44b8-9f54-ec31fe82ac58\") " pod="openstack/dnsmasq-dns-85ff748b95-qfmx5" Mar 20 08:45:43 crc kubenswrapper[4903]: I0320 08:45:43.165262 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/079887bb-d4be-4d23-bf74-3332bfd2f7cb-combined-ca-bundle\") pod \"barbican-api-75dfbf8d4b-8vdlx\" (UID: \"079887bb-d4be-4d23-bf74-3332bfd2f7cb\") " pod="openstack/barbican-api-75dfbf8d4b-8vdlx" Mar 20 08:45:43 crc kubenswrapper[4903]: I0320 08:45:43.167886 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/079887bb-d4be-4d23-bf74-3332bfd2f7cb-config-data\") pod \"barbican-api-75dfbf8d4b-8vdlx\" (UID: \"079887bb-d4be-4d23-bf74-3332bfd2f7cb\") " pod="openstack/barbican-api-75dfbf8d4b-8vdlx" Mar 20 08:45:43 crc kubenswrapper[4903]: I0320 08:45:43.168680 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f26x5\" (UniqueName: \"kubernetes.io/projected/8946b4f4-2d1b-44b8-9f54-ec31fe82ac58-kube-api-access-f26x5\") pod \"dnsmasq-dns-85ff748b95-qfmx5\" (UID: \"8946b4f4-2d1b-44b8-9f54-ec31fe82ac58\") " pod="openstack/dnsmasq-dns-85ff748b95-qfmx5" Mar 20 08:45:43 crc kubenswrapper[4903]: I0320 08:45:43.185059 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nfbt2\" (UniqueName: \"kubernetes.io/projected/079887bb-d4be-4d23-bf74-3332bfd2f7cb-kube-api-access-nfbt2\") pod \"barbican-api-75dfbf8d4b-8vdlx\" (UID: \"079887bb-d4be-4d23-bf74-3332bfd2f7cb\") " pod="openstack/barbican-api-75dfbf8d4b-8vdlx" Mar 20 08:45:43 crc kubenswrapper[4903]: I0320 08:45:43.199852 4903 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/079887bb-d4be-4d23-bf74-3332bfd2f7cb-config-data-custom\") pod \"barbican-api-75dfbf8d4b-8vdlx\" (UID: \"079887bb-d4be-4d23-bf74-3332bfd2f7cb\") " pod="openstack/barbican-api-75dfbf8d4b-8vdlx" Mar 20 08:45:43 crc kubenswrapper[4903]: I0320 08:45:43.270606 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-6d9bbc5dbb-w66cc" Mar 20 08:45:43 crc kubenswrapper[4903]: I0320 08:45:43.306964 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-qfmx5" Mar 20 08:45:43 crc kubenswrapper[4903]: I0320 08:45:43.328712 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-75dfbf8d4b-8vdlx" Mar 20 08:45:43 crc kubenswrapper[4903]: I0320 08:45:43.460206 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-7f784d4489-rxkmk"] Mar 20 08:45:43 crc kubenswrapper[4903]: I0320 08:45:43.557224 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 20 08:45:43 crc kubenswrapper[4903]: I0320 08:45:43.569824 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-978f7885d-bj8kg"] Mar 20 08:45:43 crc kubenswrapper[4903]: I0320 08:45:43.614354 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-nx66r" Mar 20 08:45:43 crc kubenswrapper[4903]: I0320 08:45:43.671326 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/38b394a1-0e10-4f80-a51b-dcba38ab7cdb-ovsdbserver-nb\") pod \"38b394a1-0e10-4f80-a51b-dcba38ab7cdb\" (UID: \"38b394a1-0e10-4f80-a51b-dcba38ab7cdb\") " Mar 20 08:45:43 crc kubenswrapper[4903]: I0320 08:45:43.671438 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/38b394a1-0e10-4f80-a51b-dcba38ab7cdb-ovsdbserver-sb\") pod \"38b394a1-0e10-4f80-a51b-dcba38ab7cdb\" (UID: \"38b394a1-0e10-4f80-a51b-dcba38ab7cdb\") " Mar 20 08:45:43 crc kubenswrapper[4903]: I0320 08:45:43.671482 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jzpcp\" (UniqueName: \"kubernetes.io/projected/38b394a1-0e10-4f80-a51b-dcba38ab7cdb-kube-api-access-jzpcp\") pod \"38b394a1-0e10-4f80-a51b-dcba38ab7cdb\" (UID: \"38b394a1-0e10-4f80-a51b-dcba38ab7cdb\") " Mar 20 08:45:43 crc kubenswrapper[4903]: I0320 08:45:43.671521 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/38b394a1-0e10-4f80-a51b-dcba38ab7cdb-dns-svc\") pod \"38b394a1-0e10-4f80-a51b-dcba38ab7cdb\" (UID: \"38b394a1-0e10-4f80-a51b-dcba38ab7cdb\") " Mar 20 08:45:43 crc kubenswrapper[4903]: I0320 08:45:43.671542 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/38b394a1-0e10-4f80-a51b-dcba38ab7cdb-dns-swift-storage-0\") pod \"38b394a1-0e10-4f80-a51b-dcba38ab7cdb\" (UID: \"38b394a1-0e10-4f80-a51b-dcba38ab7cdb\") " Mar 20 08:45:43 crc kubenswrapper[4903]: I0320 08:45:43.671628 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/38b394a1-0e10-4f80-a51b-dcba38ab7cdb-config\") pod \"38b394a1-0e10-4f80-a51b-dcba38ab7cdb\" (UID: \"38b394a1-0e10-4f80-a51b-dcba38ab7cdb\") " Mar 20 08:45:43 crc kubenswrapper[4903]: I0320 08:45:43.702185 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38b394a1-0e10-4f80-a51b-dcba38ab7cdb-kube-api-access-jzpcp" (OuterVolumeSpecName: "kube-api-access-jzpcp") pod "38b394a1-0e10-4f80-a51b-dcba38ab7cdb" (UID: "38b394a1-0e10-4f80-a51b-dcba38ab7cdb"). InnerVolumeSpecName "kube-api-access-jzpcp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:45:43 crc kubenswrapper[4903]: I0320 08:45:43.760862 4903 generic.go:334] "Generic (PLEG): container finished" podID="38b394a1-0e10-4f80-a51b-dcba38ab7cdb" containerID="87812fff190c2b49f33c39720b0c3df9d23cac42e6cb81120b89310adfbe3696" exitCode=0 Mar 20 08:45:43 crc kubenswrapper[4903]: I0320 08:45:43.760949 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-nx66r" event={"ID":"38b394a1-0e10-4f80-a51b-dcba38ab7cdb","Type":"ContainerDied","Data":"87812fff190c2b49f33c39720b0c3df9d23cac42e6cb81120b89310adfbe3696"} Mar 20 08:45:43 crc kubenswrapper[4903]: I0320 08:45:43.760978 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-nx66r" event={"ID":"38b394a1-0e10-4f80-a51b-dcba38ab7cdb","Type":"ContainerDied","Data":"d82aacd5bcece2a02e21d5cfcd0c1519d6019a244bbe5ab61474031050c6d5af"} Mar 20 08:45:43 crc kubenswrapper[4903]: I0320 08:45:43.760994 4903 scope.go:117] "RemoveContainer" containerID="87812fff190c2b49f33c39720b0c3df9d23cac42e6cb81120b89310adfbe3696" Mar 20 08:45:43 crc kubenswrapper[4903]: I0320 08:45:43.761152 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-nx66r" Mar 20 08:45:43 crc kubenswrapper[4903]: I0320 08:45:43.781378 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-978f7885d-bj8kg" event={"ID":"caba59be-ac50-4fe8-9f72-cdbfe69ea01e","Type":"ContainerStarted","Data":"be38e6bbacdb23d274305d0e989655672c7501d23ea80e82cefe7c081ba95581"} Mar 20 08:45:43 crc kubenswrapper[4903]: I0320 08:45:43.803274 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jzpcp\" (UniqueName: \"kubernetes.io/projected/38b394a1-0e10-4f80-a51b-dcba38ab7cdb-kube-api-access-jzpcp\") on node \"crc\" DevicePath \"\"" Mar 20 08:45:43 crc kubenswrapper[4903]: I0320 08:45:43.813667 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-85dc675685-rvw97"] Mar 20 08:45:43 crc kubenswrapper[4903]: I0320 08:45:43.823596 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-7f784d4489-rxkmk" event={"ID":"c94a513f-1b70-4705-af6c-3f71cb0e4272","Type":"ContainerStarted","Data":"f7353b6ff942ff4f204ab5c2d464ab1ef4b78615acf0084c110ec9caf9ca2f33"} Mar 20 08:45:43 crc kubenswrapper[4903]: W0320 08:45:43.848740 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb26d1664_feac_422f_a058_7e4f798a7b45.slice/crio-9851d6d449b12b8ac66744736b665befadc8f15733403ebf4932876146122525 WatchSource:0}: Error finding container 9851d6d449b12b8ac66744736b665befadc8f15733403ebf4932876146122525: Status 404 returned error can't find the container with id 9851d6d449b12b8ac66744736b665befadc8f15733403ebf4932876146122525 Mar 20 08:45:43 crc kubenswrapper[4903]: I0320 08:45:43.906553 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-qfmx5"] Mar 20 08:45:43 crc kubenswrapper[4903]: I0320 08:45:43.911482 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-7f568974ff-6t26g"] Mar 20 08:45:43 crc kubenswrapper[4903]: W0320 08:45:43.946447 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podff49346f_602e_46f6_91c7_9c1966535720.slice/crio-308c4a6d0714cf150a805fb224c9dececa0d7e873ed3e8018dba7f6a88b23ec4 WatchSource:0}: Error finding container 308c4a6d0714cf150a805fb224c9dececa0d7e873ed3e8018dba7f6a88b23ec4: Status 404 returned error can't find the container with id 308c4a6d0714cf150a805fb224c9dececa0d7e873ed3e8018dba7f6a88b23ec4 Mar 20 08:45:44 crc kubenswrapper[4903]: I0320 08:45:44.149192 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/38b394a1-0e10-4f80-a51b-dcba38ab7cdb-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "38b394a1-0e10-4f80-a51b-dcba38ab7cdb" (UID: "38b394a1-0e10-4f80-a51b-dcba38ab7cdb"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:45:44 crc kubenswrapper[4903]: I0320 08:45:44.156573 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-75dfbf8d4b-8vdlx"] Mar 20 08:45:44 crc kubenswrapper[4903]: I0320 08:45:44.169397 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-6d9bbc5dbb-w66cc"] Mar 20 08:45:44 crc kubenswrapper[4903]: I0320 08:45:44.214709 4903 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/38b394a1-0e10-4f80-a51b-dcba38ab7cdb-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 20 08:45:44 crc kubenswrapper[4903]: I0320 08:45:44.229254 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/38b394a1-0e10-4f80-a51b-dcba38ab7cdb-config" (OuterVolumeSpecName: "config") pod "38b394a1-0e10-4f80-a51b-dcba38ab7cdb" (UID: "38b394a1-0e10-4f80-a51b-dcba38ab7cdb"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:45:44 crc kubenswrapper[4903]: I0320 08:45:44.241731 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/38b394a1-0e10-4f80-a51b-dcba38ab7cdb-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "38b394a1-0e10-4f80-a51b-dcba38ab7cdb" (UID: "38b394a1-0e10-4f80-a51b-dcba38ab7cdb"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:45:44 crc kubenswrapper[4903]: I0320 08:45:44.296460 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/38b394a1-0e10-4f80-a51b-dcba38ab7cdb-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "38b394a1-0e10-4f80-a51b-dcba38ab7cdb" (UID: "38b394a1-0e10-4f80-a51b-dcba38ab7cdb"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:45:44 crc kubenswrapper[4903]: I0320 08:45:44.320174 4903 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/38b394a1-0e10-4f80-a51b-dcba38ab7cdb-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 20 08:45:44 crc kubenswrapper[4903]: I0320 08:45:44.320206 4903 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/38b394a1-0e10-4f80-a51b-dcba38ab7cdb-config\") on node \"crc\" DevicePath \"\"" Mar 20 08:45:44 crc kubenswrapper[4903]: I0320 08:45:44.320215 4903 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/38b394a1-0e10-4f80-a51b-dcba38ab7cdb-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 20 08:45:44 crc kubenswrapper[4903]: I0320 08:45:44.345116 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/38b394a1-0e10-4f80-a51b-dcba38ab7cdb-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "38b394a1-0e10-4f80-a51b-dcba38ab7cdb" (UID: "38b394a1-0e10-4f80-a51b-dcba38ab7cdb"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:45:44 crc kubenswrapper[4903]: I0320 08:45:44.425089 4903 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/38b394a1-0e10-4f80-a51b-dcba38ab7cdb-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 08:45:44 crc kubenswrapper[4903]: I0320 08:45:44.579103 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-nx66r"] Mar 20 08:45:44 crc kubenswrapper[4903]: I0320 08:45:44.595901 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-nx66r"] Mar 20 08:45:44 crc kubenswrapper[4903]: I0320 08:45:44.622193 4903 scope.go:117] "RemoveContainer" containerID="d6185e72ddaacddd9e4bdc3bc95448c9abb3f17971001b90c2dcd197ffc3c768" Mar 20 08:45:44 crc kubenswrapper[4903]: I0320 08:45:44.693083 4903 scope.go:117] "RemoveContainer" containerID="87812fff190c2b49f33c39720b0c3df9d23cac42e6cb81120b89310adfbe3696" Mar 20 08:45:44 crc kubenswrapper[4903]: E0320 08:45:44.694447 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"87812fff190c2b49f33c39720b0c3df9d23cac42e6cb81120b89310adfbe3696\": container with ID starting with 87812fff190c2b49f33c39720b0c3df9d23cac42e6cb81120b89310adfbe3696 not found: ID does not exist" containerID="87812fff190c2b49f33c39720b0c3df9d23cac42e6cb81120b89310adfbe3696" Mar 20 08:45:44 crc kubenswrapper[4903]: I0320 08:45:44.694481 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"87812fff190c2b49f33c39720b0c3df9d23cac42e6cb81120b89310adfbe3696"} err="failed to get container status \"87812fff190c2b49f33c39720b0c3df9d23cac42e6cb81120b89310adfbe3696\": rpc error: code = NotFound desc = could not find container \"87812fff190c2b49f33c39720b0c3df9d23cac42e6cb81120b89310adfbe3696\": container with ID starting with 87812fff190c2b49f33c39720b0c3df9d23cac42e6cb81120b89310adfbe3696 not found: ID does not exist" Mar 20 08:45:44 crc kubenswrapper[4903]: I0320 08:45:44.694504 4903 scope.go:117] "RemoveContainer" containerID="d6185e72ddaacddd9e4bdc3bc95448c9abb3f17971001b90c2dcd197ffc3c768" Mar 20 08:45:44 crc kubenswrapper[4903]: E0320 08:45:44.695068 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d6185e72ddaacddd9e4bdc3bc95448c9abb3f17971001b90c2dcd197ffc3c768\": container with ID starting with d6185e72ddaacddd9e4bdc3bc95448c9abb3f17971001b90c2dcd197ffc3c768 not found: ID does not exist" containerID="d6185e72ddaacddd9e4bdc3bc95448c9abb3f17971001b90c2dcd197ffc3c768" Mar 20 08:45:44 crc kubenswrapper[4903]: I0320 08:45:44.695093 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d6185e72ddaacddd9e4bdc3bc95448c9abb3f17971001b90c2dcd197ffc3c768"} err="failed to get container status \"d6185e72ddaacddd9e4bdc3bc95448c9abb3f17971001b90c2dcd197ffc3c768\": rpc error: code = NotFound desc = could not find container \"d6185e72ddaacddd9e4bdc3bc95448c9abb3f17971001b90c2dcd197ffc3c768\": container with ID starting with d6185e72ddaacddd9e4bdc3bc95448c9abb3f17971001b90c2dcd197ffc3c768 not found: ID does not exist" Mar 20 08:45:44 crc kubenswrapper[4903]: I0320 08:45:44.828907 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7f568974ff-6t26g" 
event={"ID":"ff49346f-602e-46f6-91c7-9c1966535720","Type":"ContainerStarted","Data":"308c4a6d0714cf150a805fb224c9dececa0d7e873ed3e8018dba7f6a88b23ec4"} Mar 20 08:45:44 crc kubenswrapper[4903]: I0320 08:45:44.831022 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-75dfbf8d4b-8vdlx" event={"ID":"079887bb-d4be-4d23-bf74-3332bfd2f7cb","Type":"ContainerStarted","Data":"7353dcb5dc133727414ee6fffb1afdd81aca3a54cbf6761fe3463c12bbf3d7b4"} Mar 20 08:45:44 crc kubenswrapper[4903]: I0320 08:45:44.831081 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-75dfbf8d4b-8vdlx" event={"ID":"079887bb-d4be-4d23-bf74-3332bfd2f7cb","Type":"ContainerStarted","Data":"bc9fc1a199781868d3417d58ed6a80f04d5ca374cb5aae24187974996eb51bd1"} Mar 20 08:45:44 crc kubenswrapper[4903]: I0320 08:45:44.834384 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-85dc675685-rvw97" event={"ID":"b26d1664-feac-422f-a058-7e4f798a7b45","Type":"ContainerStarted","Data":"9851d6d449b12b8ac66744736b665befadc8f15733403ebf4932876146122525"} Mar 20 08:45:44 crc kubenswrapper[4903]: I0320 08:45:44.858131 4903 generic.go:334] "Generic (PLEG): container finished" podID="8946b4f4-2d1b-44b8-9f54-ec31fe82ac58" containerID="a3bb6d417d676ee314560e777802fd468275b6c788eae6b960a25872b3a735de" exitCode=0 Mar 20 08:45:44 crc kubenswrapper[4903]: I0320 08:45:44.859108 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-qfmx5" event={"ID":"8946b4f4-2d1b-44b8-9f54-ec31fe82ac58","Type":"ContainerDied","Data":"a3bb6d417d676ee314560e777802fd468275b6c788eae6b960a25872b3a735de"} Mar 20 08:45:44 crc kubenswrapper[4903]: I0320 08:45:44.859139 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-qfmx5" event={"ID":"8946b4f4-2d1b-44b8-9f54-ec31fe82ac58","Type":"ContainerStarted","Data":"d73153e2593a92dafe655325ce26f0dde10c73923a32d2d03f31a40267ba46a5"} Mar 20 08:45:44 crc kubenswrapper[4903]: I0320 08:45:44.863357 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6d9bbc5dbb-w66cc" event={"ID":"31664b72-a142-4656-88e8-84dd0cf18647","Type":"ContainerStarted","Data":"dcbf502e2e9ffd196348392f163aea90c5b00bdbbb5efbcca5b67eb7642dc7c8"} Mar 20 08:45:44 crc kubenswrapper[4903]: I0320 08:45:44.913818 4903 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 20 08:45:44 crc kubenswrapper[4903]: I0320 08:45:44.913841 4903 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 20 08:45:44 crc kubenswrapper[4903]: I0320 08:45:44.915222 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-7f784d4489-rxkmk" event={"ID":"c94a513f-1b70-4705-af6c-3f71cb0e4272","Type":"ContainerStarted","Data":"9ecef6613b51007ecf1f1f800f64c787552051192488f5ff207b2c73e1a6013d"} Mar 20 08:45:44 crc kubenswrapper[4903]: I0320 08:45:44.915261 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-7f784d4489-rxkmk" Mar 20 08:45:44 crc kubenswrapper[4903]: I0320 08:45:44.940934 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-7f784d4489-rxkmk" podStartSLOduration=2.940906973 podStartE2EDuration="2.940906973s" podCreationTimestamp="2026-03-20 08:45:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:45:44.939751716 +0000 UTC 
m=+1370.156652031" watchObservedRunningTime="2026-03-20 08:45:44.940906973 +0000 UTC m=+1370.157807298" Mar 20 08:45:45 crc kubenswrapper[4903]: I0320 08:45:45.566756 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="38b394a1-0e10-4f80-a51b-dcba38ab7cdb" path="/var/lib/kubelet/pods/38b394a1-0e10-4f80-a51b-dcba38ab7cdb/volumes" Mar 20 08:45:45 crc kubenswrapper[4903]: I0320 08:45:45.744389 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 20 08:45:45 crc kubenswrapper[4903]: I0320 08:45:45.879341 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 20 08:45:45 crc kubenswrapper[4903]: I0320 08:45:45.929095 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-qfmx5" event={"ID":"8946b4f4-2d1b-44b8-9f54-ec31fe82ac58","Type":"ContainerStarted","Data":"4917ba93045126a4576095ac35dd6c5a6ea23ae385611d7fc0c26007e0dde166"} Mar 20 08:45:45 crc kubenswrapper[4903]: I0320 08:45:45.930198 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-85ff748b95-qfmx5" Mar 20 08:45:45 crc kubenswrapper[4903]: I0320 08:45:45.942614 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-75dfbf8d4b-8vdlx" event={"ID":"079887bb-d4be-4d23-bf74-3332bfd2f7cb","Type":"ContainerStarted","Data":"e595fcfa366bef36702fb2a4c3a07c0c0a2c9adab6726b58179ebe11c36c4cc7"} Mar 20 08:45:45 crc kubenswrapper[4903]: I0320 08:45:45.942805 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-75dfbf8d4b-8vdlx" Mar 20 08:45:45 crc kubenswrapper[4903]: I0320 08:45:45.970486 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-85ff748b95-qfmx5" podStartSLOduration=3.970471743 podStartE2EDuration="3.970471743s" podCreationTimestamp="2026-03-20 08:45:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:45:45.9694711 +0000 UTC m=+1371.186371415" watchObservedRunningTime="2026-03-20 08:45:45.970471743 +0000 UTC m=+1371.187372058" Mar 20 08:45:46 crc kubenswrapper[4903]: I0320 08:45:46.035172 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-75dfbf8d4b-8vdlx" podStartSLOduration=4.035147219 podStartE2EDuration="4.035147219s" podCreationTimestamp="2026-03-20 08:45:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:45:46.019940515 +0000 UTC m=+1371.236840830" watchObservedRunningTime="2026-03-20 08:45:46.035147219 +0000 UTC m=+1371.252047534" Mar 20 08:45:46 crc kubenswrapper[4903]: I0320 08:45:46.962372 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-f4rg2" event={"ID":"6e210f8c-e29d-442c-a5eb-ec6b639b0275","Type":"ContainerStarted","Data":"f1585936de585b08572047adfaea7713e1e4f1a92c343c05347a48965f635c40"} Mar 20 08:45:46 crc kubenswrapper[4903]: I0320 08:45:46.964485 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-75dfbf8d4b-8vdlx" Mar 20 08:45:47 crc kubenswrapper[4903]: I0320 08:45:47.000498 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-f4rg2" podStartSLOduration=3.324130843 
podStartE2EDuration="38.000471673s" podCreationTimestamp="2026-03-20 08:45:09 +0000 UTC" firstStartedPulling="2026-03-20 08:45:10.582751227 +0000 UTC m=+1335.799651542" lastFinishedPulling="2026-03-20 08:45:45.259092057 +0000 UTC m=+1370.475992372" observedRunningTime="2026-03-20 08:45:46.984184494 +0000 UTC m=+1372.201084809" watchObservedRunningTime="2026-03-20 08:45:47.000471673 +0000 UTC m=+1372.217371988" Mar 20 08:45:47 crc kubenswrapper[4903]: I0320 08:45:47.040155 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-5895dcfdfd-4gs9b"] Mar 20 08:45:47 crc kubenswrapper[4903]: E0320 08:45:47.040544 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38b394a1-0e10-4f80-a51b-dcba38ab7cdb" containerName="dnsmasq-dns" Mar 20 08:45:47 crc kubenswrapper[4903]: I0320 08:45:47.040560 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="38b394a1-0e10-4f80-a51b-dcba38ab7cdb" containerName="dnsmasq-dns" Mar 20 08:45:47 crc kubenswrapper[4903]: E0320 08:45:47.040584 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38b394a1-0e10-4f80-a51b-dcba38ab7cdb" containerName="init" Mar 20 08:45:47 crc kubenswrapper[4903]: I0320 08:45:47.040590 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="38b394a1-0e10-4f80-a51b-dcba38ab7cdb" containerName="init" Mar 20 08:45:47 crc kubenswrapper[4903]: I0320 08:45:47.040799 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="38b394a1-0e10-4f80-a51b-dcba38ab7cdb" containerName="dnsmasq-dns" Mar 20 08:45:47 crc kubenswrapper[4903]: I0320 08:45:47.044290 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-5895dcfdfd-4gs9b" Mar 20 08:45:47 crc kubenswrapper[4903]: I0320 08:45:47.047577 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Mar 20 08:45:47 crc kubenswrapper[4903]: I0320 08:45:47.047873 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Mar 20 08:45:47 crc kubenswrapper[4903]: I0320 08:45:47.082690 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-5895dcfdfd-4gs9b"] Mar 20 08:45:47 crc kubenswrapper[4903]: I0320 08:45:47.110501 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bae081ad-c434-4d1b-ae9f-cc8c5b6e2f33-config-data\") pod \"barbican-api-5895dcfdfd-4gs9b\" (UID: \"bae081ad-c434-4d1b-ae9f-cc8c5b6e2f33\") " pod="openstack/barbican-api-5895dcfdfd-4gs9b" Mar 20 08:45:47 crc kubenswrapper[4903]: I0320 08:45:47.111078 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bae081ad-c434-4d1b-ae9f-cc8c5b6e2f33-logs\") pod \"barbican-api-5895dcfdfd-4gs9b\" (UID: \"bae081ad-c434-4d1b-ae9f-cc8c5b6e2f33\") " pod="openstack/barbican-api-5895dcfdfd-4gs9b" Mar 20 08:45:47 crc kubenswrapper[4903]: I0320 08:45:47.113640 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bae081ad-c434-4d1b-ae9f-cc8c5b6e2f33-public-tls-certs\") pod \"barbican-api-5895dcfdfd-4gs9b\" (UID: \"bae081ad-c434-4d1b-ae9f-cc8c5b6e2f33\") " pod="openstack/barbican-api-5895dcfdfd-4gs9b" Mar 20 08:45:47 crc kubenswrapper[4903]: I0320 08:45:47.113833 4903 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bae081ad-c434-4d1b-ae9f-cc8c5b6e2f33-internal-tls-certs\") pod \"barbican-api-5895dcfdfd-4gs9b\" (UID: \"bae081ad-c434-4d1b-ae9f-cc8c5b6e2f33\") " pod="openstack/barbican-api-5895dcfdfd-4gs9b" Mar 20 08:45:47 crc kubenswrapper[4903]: I0320 08:45:47.113940 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-857hw\" (UniqueName: \"kubernetes.io/projected/bae081ad-c434-4d1b-ae9f-cc8c5b6e2f33-kube-api-access-857hw\") pod \"barbican-api-5895dcfdfd-4gs9b\" (UID: \"bae081ad-c434-4d1b-ae9f-cc8c5b6e2f33\") " pod="openstack/barbican-api-5895dcfdfd-4gs9b" Mar 20 08:45:47 crc kubenswrapper[4903]: I0320 08:45:47.114140 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bae081ad-c434-4d1b-ae9f-cc8c5b6e2f33-config-data-custom\") pod \"barbican-api-5895dcfdfd-4gs9b\" (UID: \"bae081ad-c434-4d1b-ae9f-cc8c5b6e2f33\") " pod="openstack/barbican-api-5895dcfdfd-4gs9b" Mar 20 08:45:47 crc kubenswrapper[4903]: I0320 08:45:47.114250 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bae081ad-c434-4d1b-ae9f-cc8c5b6e2f33-combined-ca-bundle\") pod \"barbican-api-5895dcfdfd-4gs9b\" (UID: \"bae081ad-c434-4d1b-ae9f-cc8c5b6e2f33\") " pod="openstack/barbican-api-5895dcfdfd-4gs9b" Mar 20 08:45:47 crc kubenswrapper[4903]: I0320 08:45:47.216374 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-857hw\" (UniqueName: \"kubernetes.io/projected/bae081ad-c434-4d1b-ae9f-cc8c5b6e2f33-kube-api-access-857hw\") pod \"barbican-api-5895dcfdfd-4gs9b\" (UID: \"bae081ad-c434-4d1b-ae9f-cc8c5b6e2f33\") " pod="openstack/barbican-api-5895dcfdfd-4gs9b" Mar 20 08:45:47 crc kubenswrapper[4903]: I0320 08:45:47.216483 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bae081ad-c434-4d1b-ae9f-cc8c5b6e2f33-config-data-custom\") pod \"barbican-api-5895dcfdfd-4gs9b\" (UID: \"bae081ad-c434-4d1b-ae9f-cc8c5b6e2f33\") " pod="openstack/barbican-api-5895dcfdfd-4gs9b" Mar 20 08:45:47 crc kubenswrapper[4903]: I0320 08:45:47.216514 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bae081ad-c434-4d1b-ae9f-cc8c5b6e2f33-combined-ca-bundle\") pod \"barbican-api-5895dcfdfd-4gs9b\" (UID: \"bae081ad-c434-4d1b-ae9f-cc8c5b6e2f33\") " pod="openstack/barbican-api-5895dcfdfd-4gs9b" Mar 20 08:45:47 crc kubenswrapper[4903]: I0320 08:45:47.216557 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bae081ad-c434-4d1b-ae9f-cc8c5b6e2f33-config-data\") pod \"barbican-api-5895dcfdfd-4gs9b\" (UID: \"bae081ad-c434-4d1b-ae9f-cc8c5b6e2f33\") " pod="openstack/barbican-api-5895dcfdfd-4gs9b" Mar 20 08:45:47 crc kubenswrapper[4903]: I0320 08:45:47.216611 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bae081ad-c434-4d1b-ae9f-cc8c5b6e2f33-logs\") pod \"barbican-api-5895dcfdfd-4gs9b\" (UID: \"bae081ad-c434-4d1b-ae9f-cc8c5b6e2f33\") " pod="openstack/barbican-api-5895dcfdfd-4gs9b" Mar 20 08:45:47 crc 
kubenswrapper[4903]: I0320 08:45:47.216676 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bae081ad-c434-4d1b-ae9f-cc8c5b6e2f33-public-tls-certs\") pod \"barbican-api-5895dcfdfd-4gs9b\" (UID: \"bae081ad-c434-4d1b-ae9f-cc8c5b6e2f33\") " pod="openstack/barbican-api-5895dcfdfd-4gs9b" Mar 20 08:45:47 crc kubenswrapper[4903]: I0320 08:45:47.216736 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bae081ad-c434-4d1b-ae9f-cc8c5b6e2f33-internal-tls-certs\") pod \"barbican-api-5895dcfdfd-4gs9b\" (UID: \"bae081ad-c434-4d1b-ae9f-cc8c5b6e2f33\") " pod="openstack/barbican-api-5895dcfdfd-4gs9b" Mar 20 08:45:47 crc kubenswrapper[4903]: I0320 08:45:47.218100 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bae081ad-c434-4d1b-ae9f-cc8c5b6e2f33-logs\") pod \"barbican-api-5895dcfdfd-4gs9b\" (UID: \"bae081ad-c434-4d1b-ae9f-cc8c5b6e2f33\") " pod="openstack/barbican-api-5895dcfdfd-4gs9b" Mar 20 08:45:47 crc kubenswrapper[4903]: I0320 08:45:47.224483 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bae081ad-c434-4d1b-ae9f-cc8c5b6e2f33-public-tls-certs\") pod \"barbican-api-5895dcfdfd-4gs9b\" (UID: \"bae081ad-c434-4d1b-ae9f-cc8c5b6e2f33\") " pod="openstack/barbican-api-5895dcfdfd-4gs9b" Mar 20 08:45:47 crc kubenswrapper[4903]: I0320 08:45:47.225221 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bae081ad-c434-4d1b-ae9f-cc8c5b6e2f33-config-data\") pod \"barbican-api-5895dcfdfd-4gs9b\" (UID: \"bae081ad-c434-4d1b-ae9f-cc8c5b6e2f33\") " pod="openstack/barbican-api-5895dcfdfd-4gs9b" Mar 20 08:45:47 crc kubenswrapper[4903]: I0320 08:45:47.232274 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bae081ad-c434-4d1b-ae9f-cc8c5b6e2f33-internal-tls-certs\") pod \"barbican-api-5895dcfdfd-4gs9b\" (UID: \"bae081ad-c434-4d1b-ae9f-cc8c5b6e2f33\") " pod="openstack/barbican-api-5895dcfdfd-4gs9b" Mar 20 08:45:47 crc kubenswrapper[4903]: I0320 08:45:47.239259 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-857hw\" (UniqueName: \"kubernetes.io/projected/bae081ad-c434-4d1b-ae9f-cc8c5b6e2f33-kube-api-access-857hw\") pod \"barbican-api-5895dcfdfd-4gs9b\" (UID: \"bae081ad-c434-4d1b-ae9f-cc8c5b6e2f33\") " pod="openstack/barbican-api-5895dcfdfd-4gs9b" Mar 20 08:45:47 crc kubenswrapper[4903]: I0320 08:45:47.239946 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bae081ad-c434-4d1b-ae9f-cc8c5b6e2f33-combined-ca-bundle\") pod \"barbican-api-5895dcfdfd-4gs9b\" (UID: \"bae081ad-c434-4d1b-ae9f-cc8c5b6e2f33\") " pod="openstack/barbican-api-5895dcfdfd-4gs9b" Mar 20 08:45:47 crc kubenswrapper[4903]: I0320 08:45:47.267064 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bae081ad-c434-4d1b-ae9f-cc8c5b6e2f33-config-data-custom\") pod \"barbican-api-5895dcfdfd-4gs9b\" (UID: \"bae081ad-c434-4d1b-ae9f-cc8c5b6e2f33\") " pod="openstack/barbican-api-5895dcfdfd-4gs9b" Mar 20 08:45:47 crc kubenswrapper[4903]: I0320 08:45:47.394680 4903 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openstack/barbican-api-5895dcfdfd-4gs9b" Mar 20 08:45:48 crc kubenswrapper[4903]: I0320 08:45:48.232827 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-5895dcfdfd-4gs9b"] Mar 20 08:45:48 crc kubenswrapper[4903]: W0320 08:45:48.247783 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbae081ad_c434_4d1b_ae9f_cc8c5b6e2f33.slice/crio-d4cdfb61d9e1894db3ded5c41d5370a86645d505f69eb543c4925c75d51df735 WatchSource:0}: Error finding container d4cdfb61d9e1894db3ded5c41d5370a86645d505f69eb543c4925c75d51df735: Status 404 returned error can't find the container with id d4cdfb61d9e1894db3ded5c41d5370a86645d505f69eb543c4925c75d51df735 Mar 20 08:45:48 crc kubenswrapper[4903]: I0320 08:45:48.998862 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5895dcfdfd-4gs9b" event={"ID":"bae081ad-c434-4d1b-ae9f-cc8c5b6e2f33","Type":"ContainerStarted","Data":"399861c0aed674ac17caf4a76e74022608a67747c7b5e6718fd6d0bf4376c5d8"} Mar 20 08:45:48 crc kubenswrapper[4903]: I0320 08:45:48.999201 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5895dcfdfd-4gs9b" event={"ID":"bae081ad-c434-4d1b-ae9f-cc8c5b6e2f33","Type":"ContainerStarted","Data":"93597afa34681cad8c7e33fa9d9d2b8edff2db1b4f63723973a392f7d94f6d4d"} Mar 20 08:45:48 crc kubenswrapper[4903]: I0320 08:45:48.999227 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5895dcfdfd-4gs9b" event={"ID":"bae081ad-c434-4d1b-ae9f-cc8c5b6e2f33","Type":"ContainerStarted","Data":"d4cdfb61d9e1894db3ded5c41d5370a86645d505f69eb543c4925c75d51df735"} Mar 20 08:45:48 crc kubenswrapper[4903]: I0320 08:45:48.999444 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-5895dcfdfd-4gs9b" Mar 20 08:45:48 crc kubenswrapper[4903]: I0320 08:45:48.999489 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-5895dcfdfd-4gs9b" Mar 20 08:45:49 crc kubenswrapper[4903]: I0320 08:45:49.006307 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-978f7885d-bj8kg" event={"ID":"caba59be-ac50-4fe8-9f72-cdbfe69ea01e","Type":"ContainerStarted","Data":"55fb6d2fd6341ee5bac702736ca4d1e5b2e13663c5dcc18d63b378db2e861cf4"} Mar 20 08:45:49 crc kubenswrapper[4903]: I0320 08:45:49.006380 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-978f7885d-bj8kg" event={"ID":"caba59be-ac50-4fe8-9f72-cdbfe69ea01e","Type":"ContainerStarted","Data":"2c9148d1bb02f27366574086b03369f7ecf88f1aafded90a10ec16e0003aa710"} Mar 20 08:45:49 crc kubenswrapper[4903]: I0320 08:45:49.015130 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6d9bbc5dbb-w66cc" event={"ID":"31664b72-a142-4656-88e8-84dd0cf18647","Type":"ContainerStarted","Data":"2ede8df05970b0155e78a7003a750f87a7d86f9769520e754016db5a2253cdb5"} Mar 20 08:45:49 crc kubenswrapper[4903]: I0320 08:45:49.015188 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6d9bbc5dbb-w66cc" event={"ID":"31664b72-a142-4656-88e8-84dd0cf18647","Type":"ContainerStarted","Data":"34d3b31d46d789c0904ffebfdeb9f9aa758a26a4d97333874c5afa3adf9f72b8"} Mar 20 08:45:49 crc kubenswrapper[4903]: I0320 08:45:49.021926 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/barbican-worker-7f568974ff-6t26g" event={"ID":"ff49346f-602e-46f6-91c7-9c1966535720","Type":"ContainerStarted","Data":"386de03bb82a797be770d2b2659f1cc5692f78f475e3093729522f4425bfcab2"} Mar 20 08:45:49 crc kubenswrapper[4903]: I0320 08:45:49.021992 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7f568974ff-6t26g" event={"ID":"ff49346f-602e-46f6-91c7-9c1966535720","Type":"ContainerStarted","Data":"87c85c61161fb7331d2c712fe1b022235b32f8c23a7603b13ac75a55bd762f6b"} Mar 20 08:45:49 crc kubenswrapper[4903]: I0320 08:45:49.033054 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-85dc675685-rvw97" event={"ID":"b26d1664-feac-422f-a058-7e4f798a7b45","Type":"ContainerStarted","Data":"c06c433ae6ac013c7917e163ec8311d67c98a819ad305bbd253a57a34b1ed35e"} Mar 20 08:45:49 crc kubenswrapper[4903]: I0320 08:45:49.033104 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-85dc675685-rvw97" event={"ID":"b26d1664-feac-422f-a058-7e4f798a7b45","Type":"ContainerStarted","Data":"8a6cd303bb413a19cc42c41d551839a5c95988228323fc457e9b609cb5d52414"} Mar 20 08:45:49 crc kubenswrapper[4903]: I0320 08:45:49.058576 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-7f568974ff-6t26g" podStartSLOduration=3.283324582 podStartE2EDuration="7.058540589s" podCreationTimestamp="2026-03-20 08:45:42 +0000 UTC" firstStartedPulling="2026-03-20 08:45:43.949735666 +0000 UTC m=+1369.166635981" lastFinishedPulling="2026-03-20 08:45:47.724951673 +0000 UTC m=+1372.941851988" observedRunningTime="2026-03-20 08:45:49.051502695 +0000 UTC m=+1374.268403020" watchObservedRunningTime="2026-03-20 08:45:49.058540589 +0000 UTC m=+1374.275440904" Mar 20 08:45:49 crc kubenswrapper[4903]: I0320 08:45:49.064741 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-5895dcfdfd-4gs9b" podStartSLOduration=2.064712642 podStartE2EDuration="2.064712642s" podCreationTimestamp="2026-03-20 08:45:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:45:49.027692881 +0000 UTC m=+1374.244593196" watchObservedRunningTime="2026-03-20 08:45:49.064712642 +0000 UTC m=+1374.281612957" Mar 20 08:45:49 crc kubenswrapper[4903]: I0320 08:45:49.078614 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-978f7885d-bj8kg" podStartSLOduration=2.988744386 podStartE2EDuration="7.078597835s" podCreationTimestamp="2026-03-20 08:45:42 +0000 UTC" firstStartedPulling="2026-03-20 08:45:43.624808115 +0000 UTC m=+1368.841708430" lastFinishedPulling="2026-03-20 08:45:47.714661564 +0000 UTC m=+1372.931561879" observedRunningTime="2026-03-20 08:45:49.073983348 +0000 UTC m=+1374.290883663" watchObservedRunningTime="2026-03-20 08:45:49.078597835 +0000 UTC m=+1374.295498150" Mar 20 08:45:49 crc kubenswrapper[4903]: I0320 08:45:49.107099 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-85dc675685-rvw97"] Mar 20 08:45:49 crc kubenswrapper[4903]: I0320 08:45:49.112921 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-6d9bbc5dbb-w66cc" podStartSLOduration=3.651384947 podStartE2EDuration="7.112904914s" podCreationTimestamp="2026-03-20 08:45:42 +0000 UTC" firstStartedPulling="2026-03-20 08:45:44.25327323 +0000 UTC 
m=+1369.470173545" lastFinishedPulling="2026-03-20 08:45:47.714793197 +0000 UTC m=+1372.931693512" observedRunningTime="2026-03-20 08:45:49.107704523 +0000 UTC m=+1374.324604838" watchObservedRunningTime="2026-03-20 08:45:49.112904914 +0000 UTC m=+1374.329805229" Mar 20 08:45:49 crc kubenswrapper[4903]: I0320 08:45:49.135814 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-85dc675685-rvw97" podStartSLOduration=3.299532798 podStartE2EDuration="7.135795096s" podCreationTimestamp="2026-03-20 08:45:42 +0000 UTC" firstStartedPulling="2026-03-20 08:45:43.879651465 +0000 UTC m=+1369.096551780" lastFinishedPulling="2026-03-20 08:45:47.715913763 +0000 UTC m=+1372.932814078" observedRunningTime="2026-03-20 08:45:49.125943507 +0000 UTC m=+1374.342843822" watchObservedRunningTime="2026-03-20 08:45:49.135795096 +0000 UTC m=+1374.352695411" Mar 20 08:45:49 crc kubenswrapper[4903]: I0320 08:45:49.150984 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-978f7885d-bj8kg"] Mar 20 08:45:50 crc kubenswrapper[4903]: I0320 08:45:50.833550 4903 patch_prober.go:28] interesting pod/machine-config-daemon-2ndsj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 08:45:50 crc kubenswrapper[4903]: I0320 08:45:50.834274 4903 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2ndsj" podUID="0e67af70-4211-4077-8f2b-0a00b8069e5a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 08:45:50 crc kubenswrapper[4903]: I0320 08:45:50.834327 4903 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2ndsj" Mar 20 08:45:50 crc kubenswrapper[4903]: I0320 08:45:50.835181 4903 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7094e5f77c270dc626be780c469f09df2b6b5e5f309bca7fa5e8149bdd6f3199"} pod="openshift-machine-config-operator/machine-config-daemon-2ndsj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 08:45:50 crc kubenswrapper[4903]: I0320 08:45:50.835257 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2ndsj" podUID="0e67af70-4211-4077-8f2b-0a00b8069e5a" containerName="machine-config-daemon" containerID="cri-o://7094e5f77c270dc626be780c469f09df2b6b5e5f309bca7fa5e8149bdd6f3199" gracePeriod=600 Mar 20 08:45:51 crc kubenswrapper[4903]: I0320 08:45:51.054629 4903 generic.go:334] "Generic (PLEG): container finished" podID="0e67af70-4211-4077-8f2b-0a00b8069e5a" containerID="7094e5f77c270dc626be780c469f09df2b6b5e5f309bca7fa5e8149bdd6f3199" exitCode=0 Mar 20 08:45:51 crc kubenswrapper[4903]: I0320 08:45:51.054938 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-85dc675685-rvw97" podUID="b26d1664-feac-422f-a058-7e4f798a7b45" containerName="barbican-worker-log" containerID="cri-o://8a6cd303bb413a19cc42c41d551839a5c95988228323fc457e9b609cb5d52414" gracePeriod=30 Mar 20 08:45:51 crc kubenswrapper[4903]: I0320 08:45:51.055312 4903 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2ndsj" event={"ID":"0e67af70-4211-4077-8f2b-0a00b8069e5a","Type":"ContainerDied","Data":"7094e5f77c270dc626be780c469f09df2b6b5e5f309bca7fa5e8149bdd6f3199"} Mar 20 08:45:51 crc kubenswrapper[4903]: I0320 08:45:51.055404 4903 scope.go:117] "RemoveContainer" containerID="0c24277c1ea9806e81aa4981e7afee5bd67d933c16e4a264dbdf97f39e69ac1c" Mar 20 08:45:51 crc kubenswrapper[4903]: I0320 08:45:51.055701 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-978f7885d-bj8kg" podUID="caba59be-ac50-4fe8-9f72-cdbfe69ea01e" containerName="barbican-keystone-listener-log" containerID="cri-o://2c9148d1bb02f27366574086b03369f7ecf88f1aafded90a10ec16e0003aa710" gracePeriod=30 Mar 20 08:45:51 crc kubenswrapper[4903]: I0320 08:45:51.056295 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-978f7885d-bj8kg" podUID="caba59be-ac50-4fe8-9f72-cdbfe69ea01e" containerName="barbican-keystone-listener" containerID="cri-o://55fb6d2fd6341ee5bac702736ca4d1e5b2e13663c5dcc18d63b378db2e861cf4" gracePeriod=30 Mar 20 08:45:51 crc kubenswrapper[4903]: I0320 08:45:51.056214 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-85dc675685-rvw97" podUID="b26d1664-feac-422f-a058-7e4f798a7b45" containerName="barbican-worker" containerID="cri-o://c06c433ae6ac013c7917e163ec8311d67c98a819ad305bbd253a57a34b1ed35e" gracePeriod=30 Mar 20 08:45:52 crc kubenswrapper[4903]: I0320 08:45:52.069517 4903 generic.go:334] "Generic (PLEG): container finished" podID="caba59be-ac50-4fe8-9f72-cdbfe69ea01e" containerID="2c9148d1bb02f27366574086b03369f7ecf88f1aafded90a10ec16e0003aa710" exitCode=143 Mar 20 08:45:52 crc kubenswrapper[4903]: I0320 08:45:52.069612 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-978f7885d-bj8kg" event={"ID":"caba59be-ac50-4fe8-9f72-cdbfe69ea01e","Type":"ContainerDied","Data":"2c9148d1bb02f27366574086b03369f7ecf88f1aafded90a10ec16e0003aa710"} Mar 20 08:45:52 crc kubenswrapper[4903]: I0320 08:45:52.074281 4903 generic.go:334] "Generic (PLEG): container finished" podID="b26d1664-feac-422f-a058-7e4f798a7b45" containerID="c06c433ae6ac013c7917e163ec8311d67c98a819ad305bbd253a57a34b1ed35e" exitCode=0 Mar 20 08:45:52 crc kubenswrapper[4903]: I0320 08:45:52.074319 4903 generic.go:334] "Generic (PLEG): container finished" podID="b26d1664-feac-422f-a058-7e4f798a7b45" containerID="8a6cd303bb413a19cc42c41d551839a5c95988228323fc457e9b609cb5d52414" exitCode=143 Mar 20 08:45:52 crc kubenswrapper[4903]: I0320 08:45:52.074370 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-85dc675685-rvw97" event={"ID":"b26d1664-feac-422f-a058-7e4f798a7b45","Type":"ContainerDied","Data":"c06c433ae6ac013c7917e163ec8311d67c98a819ad305bbd253a57a34b1ed35e"} Mar 20 08:45:52 crc kubenswrapper[4903]: I0320 08:45:52.074416 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-85dc675685-rvw97" event={"ID":"b26d1664-feac-422f-a058-7e4f798a7b45","Type":"ContainerDied","Data":"8a6cd303bb413a19cc42c41d551839a5c95988228323fc457e9b609cb5d52414"} Mar 20 08:45:52 crc kubenswrapper[4903]: I0320 08:45:52.077396 4903 generic.go:334] "Generic (PLEG): container finished" podID="6e210f8c-e29d-442c-a5eb-ec6b639b0275" 
containerID="f1585936de585b08572047adfaea7713e1e4f1a92c343c05347a48965f635c40" exitCode=0 Mar 20 08:45:52 crc kubenswrapper[4903]: I0320 08:45:52.077440 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-f4rg2" event={"ID":"6e210f8c-e29d-442c-a5eb-ec6b639b0275","Type":"ContainerDied","Data":"f1585936de585b08572047adfaea7713e1e4f1a92c343c05347a48965f635c40"} Mar 20 08:45:53 crc kubenswrapper[4903]: I0320 08:45:53.104563 4903 generic.go:334] "Generic (PLEG): container finished" podID="caba59be-ac50-4fe8-9f72-cdbfe69ea01e" containerID="55fb6d2fd6341ee5bac702736ca4d1e5b2e13663c5dcc18d63b378db2e861cf4" exitCode=0 Mar 20 08:45:53 crc kubenswrapper[4903]: I0320 08:45:53.104798 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-978f7885d-bj8kg" event={"ID":"caba59be-ac50-4fe8-9f72-cdbfe69ea01e","Type":"ContainerDied","Data":"55fb6d2fd6341ee5bac702736ca4d1e5b2e13663c5dcc18d63b378db2e861cf4"} Mar 20 08:45:53 crc kubenswrapper[4903]: I0320 08:45:53.310241 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-85ff748b95-qfmx5" Mar 20 08:45:53 crc kubenswrapper[4903]: I0320 08:45:53.381446 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-nl9ms"] Mar 20 08:45:53 crc kubenswrapper[4903]: I0320 08:45:53.381785 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-785d8bcb8c-nl9ms" podUID="b95e8341-9f65-461f-891b-ec6512be57f7" containerName="dnsmasq-dns" containerID="cri-o://49c1a0f1b9ba0addd74e68afa439b435d66004b7b86a4a212551513668453686" gracePeriod=10 Mar 20 08:45:53 crc kubenswrapper[4903]: I0320 08:45:53.594621 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-978f7885d-bj8kg" Mar 20 08:45:53 crc kubenswrapper[4903]: I0320 08:45:53.699157 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/caba59be-ac50-4fe8-9f72-cdbfe69ea01e-config-data\") pod \"caba59be-ac50-4fe8-9f72-cdbfe69ea01e\" (UID: \"caba59be-ac50-4fe8-9f72-cdbfe69ea01e\") " Mar 20 08:45:53 crc kubenswrapper[4903]: I0320 08:45:53.699449 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z6r8c\" (UniqueName: \"kubernetes.io/projected/caba59be-ac50-4fe8-9f72-cdbfe69ea01e-kube-api-access-z6r8c\") pod \"caba59be-ac50-4fe8-9f72-cdbfe69ea01e\" (UID: \"caba59be-ac50-4fe8-9f72-cdbfe69ea01e\") " Mar 20 08:45:53 crc kubenswrapper[4903]: I0320 08:45:53.699535 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/caba59be-ac50-4fe8-9f72-cdbfe69ea01e-config-data-custom\") pod \"caba59be-ac50-4fe8-9f72-cdbfe69ea01e\" (UID: \"caba59be-ac50-4fe8-9f72-cdbfe69ea01e\") " Mar 20 08:45:53 crc kubenswrapper[4903]: I0320 08:45:53.699557 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/caba59be-ac50-4fe8-9f72-cdbfe69ea01e-combined-ca-bundle\") pod \"caba59be-ac50-4fe8-9f72-cdbfe69ea01e\" (UID: \"caba59be-ac50-4fe8-9f72-cdbfe69ea01e\") " Mar 20 08:45:53 crc kubenswrapper[4903]: I0320 08:45:53.699648 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/caba59be-ac50-4fe8-9f72-cdbfe69ea01e-logs\") pod \"caba59be-ac50-4fe8-9f72-cdbfe69ea01e\" (UID: \"caba59be-ac50-4fe8-9f72-cdbfe69ea01e\") " Mar 20 08:45:53 crc kubenswrapper[4903]: I0320 08:45:53.700536 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/caba59be-ac50-4fe8-9f72-cdbfe69ea01e-logs" (OuterVolumeSpecName: "logs") pod "caba59be-ac50-4fe8-9f72-cdbfe69ea01e" (UID: "caba59be-ac50-4fe8-9f72-cdbfe69ea01e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 08:45:53 crc kubenswrapper[4903]: I0320 08:45:53.708407 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/caba59be-ac50-4fe8-9f72-cdbfe69ea01e-kube-api-access-z6r8c" (OuterVolumeSpecName: "kube-api-access-z6r8c") pod "caba59be-ac50-4fe8-9f72-cdbfe69ea01e" (UID: "caba59be-ac50-4fe8-9f72-cdbfe69ea01e"). InnerVolumeSpecName "kube-api-access-z6r8c". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:45:53 crc kubenswrapper[4903]: I0320 08:45:53.714875 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/caba59be-ac50-4fe8-9f72-cdbfe69ea01e-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "caba59be-ac50-4fe8-9f72-cdbfe69ea01e" (UID: "caba59be-ac50-4fe8-9f72-cdbfe69ea01e"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:45:53 crc kubenswrapper[4903]: I0320 08:45:53.780180 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/caba59be-ac50-4fe8-9f72-cdbfe69ea01e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "caba59be-ac50-4fe8-9f72-cdbfe69ea01e" (UID: "caba59be-ac50-4fe8-9f72-cdbfe69ea01e"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:45:53 crc kubenswrapper[4903]: I0320 08:45:53.789827 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/caba59be-ac50-4fe8-9f72-cdbfe69ea01e-config-data" (OuterVolumeSpecName: "config-data") pod "caba59be-ac50-4fe8-9f72-cdbfe69ea01e" (UID: "caba59be-ac50-4fe8-9f72-cdbfe69ea01e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:45:53 crc kubenswrapper[4903]: I0320 08:45:53.813856 4903 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/caba59be-ac50-4fe8-9f72-cdbfe69ea01e-logs\") on node \"crc\" DevicePath \"\"" Mar 20 08:45:53 crc kubenswrapper[4903]: I0320 08:45:53.813890 4903 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/caba59be-ac50-4fe8-9f72-cdbfe69ea01e-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 08:45:53 crc kubenswrapper[4903]: I0320 08:45:53.813903 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z6r8c\" (UniqueName: \"kubernetes.io/projected/caba59be-ac50-4fe8-9f72-cdbfe69ea01e-kube-api-access-z6r8c\") on node \"crc\" DevicePath \"\"" Mar 20 08:45:53 crc kubenswrapper[4903]: I0320 08:45:53.813915 4903 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/caba59be-ac50-4fe8-9f72-cdbfe69ea01e-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 20 08:45:53 crc kubenswrapper[4903]: I0320 08:45:53.813929 4903 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/caba59be-ac50-4fe8-9f72-cdbfe69ea01e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 08:45:54 crc kubenswrapper[4903]: I0320 08:45:54.119569 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-978f7885d-bj8kg" event={"ID":"caba59be-ac50-4fe8-9f72-cdbfe69ea01e","Type":"ContainerDied","Data":"be38e6bbacdb23d274305d0e989655672c7501d23ea80e82cefe7c081ba95581"} Mar 20 08:45:54 crc kubenswrapper[4903]: I0320 08:45:54.119630 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-978f7885d-bj8kg" Mar 20 08:45:54 crc kubenswrapper[4903]: I0320 08:45:54.121925 4903 generic.go:334] "Generic (PLEG): container finished" podID="b95e8341-9f65-461f-891b-ec6512be57f7" containerID="49c1a0f1b9ba0addd74e68afa439b435d66004b7b86a4a212551513668453686" exitCode=0 Mar 20 08:45:54 crc kubenswrapper[4903]: I0320 08:45:54.121952 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-nl9ms" event={"ID":"b95e8341-9f65-461f-891b-ec6512be57f7","Type":"ContainerDied","Data":"49c1a0f1b9ba0addd74e68afa439b435d66004b7b86a4a212551513668453686"} Mar 20 08:45:54 crc kubenswrapper[4903]: I0320 08:45:54.172966 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-978f7885d-bj8kg"] Mar 20 08:45:54 crc kubenswrapper[4903]: I0320 08:45:54.180993 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-keystone-listener-978f7885d-bj8kg"] Mar 20 08:45:54 crc kubenswrapper[4903]: I0320 08:45:54.903839 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-75dfbf8d4b-8vdlx" Mar 20 08:45:55 crc kubenswrapper[4903]: I0320 08:45:55.033749 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-75dfbf8d4b-8vdlx" Mar 20 08:45:55 crc kubenswrapper[4903]: I0320 08:45:55.114499 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-f4rg2" Mar 20 08:45:55 crc kubenswrapper[4903]: I0320 08:45:55.117649 4903 scope.go:117] "RemoveContainer" containerID="55fb6d2fd6341ee5bac702736ca4d1e5b2e13663c5dcc18d63b378db2e861cf4" Mar 20 08:45:55 crc kubenswrapper[4903]: I0320 08:45:55.134280 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-85dc675685-rvw97" Mar 20 08:45:55 crc kubenswrapper[4903]: I0320 08:45:55.151873 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-85dc675685-rvw97" event={"ID":"b26d1664-feac-422f-a058-7e4f798a7b45","Type":"ContainerDied","Data":"9851d6d449b12b8ac66744736b665befadc8f15733403ebf4932876146122525"} Mar 20 08:45:55 crc kubenswrapper[4903]: I0320 08:45:55.151967 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-85dc675685-rvw97" Mar 20 08:45:55 crc kubenswrapper[4903]: I0320 08:45:55.161456 4903 scope.go:117] "RemoveContainer" containerID="2c9148d1bb02f27366574086b03369f7ecf88f1aafded90a10ec16e0003aa710" Mar 20 08:45:55 crc kubenswrapper[4903]: I0320 08:45:55.163444 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-f4rg2" event={"ID":"6e210f8c-e29d-442c-a5eb-ec6b639b0275","Type":"ContainerDied","Data":"b87dcd8ea23a18ff428c9a97b4ca1d761b85d9e4ce0359216dcf159ca440b6e2"} Mar 20 08:45:55 crc kubenswrapper[4903]: I0320 08:45:55.163476 4903 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b87dcd8ea23a18ff428c9a97b4ca1d761b85d9e4ce0359216dcf159ca440b6e2" Mar 20 08:45:55 crc kubenswrapper[4903]: I0320 08:45:55.163490 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-f4rg2" Mar 20 08:45:55 crc kubenswrapper[4903]: I0320 08:45:55.194249 4903 scope.go:117] "RemoveContainer" containerID="c06c433ae6ac013c7917e163ec8311d67c98a819ad305bbd253a57a34b1ed35e" Mar 20 08:45:55 crc kubenswrapper[4903]: I0320 08:45:55.241215 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6e210f8c-e29d-442c-a5eb-ec6b639b0275-db-sync-config-data\") pod \"6e210f8c-e29d-442c-a5eb-ec6b639b0275\" (UID: \"6e210f8c-e29d-442c-a5eb-ec6b639b0275\") " Mar 20 08:45:55 crc kubenswrapper[4903]: I0320 08:45:55.241290 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b26d1664-feac-422f-a058-7e4f798a7b45-config-data-custom\") pod \"b26d1664-feac-422f-a058-7e4f798a7b45\" (UID: \"b26d1664-feac-422f-a058-7e4f798a7b45\") " Mar 20 08:45:55 crc kubenswrapper[4903]: I0320 08:45:55.241347 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kcqt7\" (UniqueName: \"kubernetes.io/projected/6e210f8c-e29d-442c-a5eb-ec6b639b0275-kube-api-access-kcqt7\") pod \"6e210f8c-e29d-442c-a5eb-ec6b639b0275\" (UID: \"6e210f8c-e29d-442c-a5eb-ec6b639b0275\") " Mar 20 08:45:55 crc kubenswrapper[4903]: I0320 08:45:55.241418 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6e210f8c-e29d-442c-a5eb-ec6b639b0275-scripts\") pod \"6e210f8c-e29d-442c-a5eb-ec6b639b0275\" (UID: \"6e210f8c-e29d-442c-a5eb-ec6b639b0275\") " Mar 20 08:45:55 crc kubenswrapper[4903]: I0320 08:45:55.241580 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b26d1664-feac-422f-a058-7e4f798a7b45-combined-ca-bundle\") pod \"b26d1664-feac-422f-a058-7e4f798a7b45\" (UID: \"b26d1664-feac-422f-a058-7e4f798a7b45\") " Mar 20 08:45:55 crc kubenswrapper[4903]: I0320 08:45:55.241647 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e210f8c-e29d-442c-a5eb-ec6b639b0275-config-data\") pod \"6e210f8c-e29d-442c-a5eb-ec6b639b0275\" (UID: \"6e210f8c-e29d-442c-a5eb-ec6b639b0275\") " Mar 20 08:45:55 crc kubenswrapper[4903]: I0320 08:45:55.241702 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6e210f8c-e29d-442c-a5eb-ec6b639b0275-etc-machine-id\") pod \"6e210f8c-e29d-442c-a5eb-ec6b639b0275\" (UID: \"6e210f8c-e29d-442c-a5eb-ec6b639b0275\") " Mar 20 08:45:55 crc kubenswrapper[4903]: I0320 08:45:55.241728 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mcpp5\" (UniqueName: \"kubernetes.io/projected/b26d1664-feac-422f-a058-7e4f798a7b45-kube-api-access-mcpp5\") pod \"b26d1664-feac-422f-a058-7e4f798a7b45\" (UID: \"b26d1664-feac-422f-a058-7e4f798a7b45\") " Mar 20 08:45:55 crc kubenswrapper[4903]: I0320 08:45:55.241757 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b26d1664-feac-422f-a058-7e4f798a7b45-logs\") pod \"b26d1664-feac-422f-a058-7e4f798a7b45\" (UID: \"b26d1664-feac-422f-a058-7e4f798a7b45\") " Mar 20 08:45:55 crc kubenswrapper[4903]: I0320 08:45:55.241850 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b26d1664-feac-422f-a058-7e4f798a7b45-config-data\") pod \"b26d1664-feac-422f-a058-7e4f798a7b45\" (UID: \"b26d1664-feac-422f-a058-7e4f798a7b45\") " Mar 20 08:45:55 crc kubenswrapper[4903]: I0320 08:45:55.241874 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e210f8c-e29d-442c-a5eb-ec6b639b0275-combined-ca-bundle\") pod \"6e210f8c-e29d-442c-a5eb-ec6b639b0275\" (UID: \"6e210f8c-e29d-442c-a5eb-ec6b639b0275\") " Mar 20 08:45:55 crc kubenswrapper[4903]: I0320 08:45:55.242925 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6e210f8c-e29d-442c-a5eb-ec6b639b0275-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "6e210f8c-e29d-442c-a5eb-ec6b639b0275" (UID: "6e210f8c-e29d-442c-a5eb-ec6b639b0275"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 08:45:55 crc kubenswrapper[4903]: I0320 08:45:55.244705 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b26d1664-feac-422f-a058-7e4f798a7b45-logs" (OuterVolumeSpecName: "logs") pod "b26d1664-feac-422f-a058-7e4f798a7b45" (UID: "b26d1664-feac-422f-a058-7e4f798a7b45"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 08:45:55 crc kubenswrapper[4903]: I0320 08:45:55.254208 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b26d1664-feac-422f-a058-7e4f798a7b45-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "b26d1664-feac-422f-a058-7e4f798a7b45" (UID: "b26d1664-feac-422f-a058-7e4f798a7b45"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:45:55 crc kubenswrapper[4903]: I0320 08:45:55.264138 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e210f8c-e29d-442c-a5eb-ec6b639b0275-kube-api-access-kcqt7" (OuterVolumeSpecName: "kube-api-access-kcqt7") pod "6e210f8c-e29d-442c-a5eb-ec6b639b0275" (UID: "6e210f8c-e29d-442c-a5eb-ec6b639b0275"). InnerVolumeSpecName "kube-api-access-kcqt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:45:55 crc kubenswrapper[4903]: I0320 08:45:55.264205 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e210f8c-e29d-442c-a5eb-ec6b639b0275-scripts" (OuterVolumeSpecName: "scripts") pod "6e210f8c-e29d-442c-a5eb-ec6b639b0275" (UID: "6e210f8c-e29d-442c-a5eb-ec6b639b0275"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:45:55 crc kubenswrapper[4903]: I0320 08:45:55.270542 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e210f8c-e29d-442c-a5eb-ec6b639b0275-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "6e210f8c-e29d-442c-a5eb-ec6b639b0275" (UID: "6e210f8c-e29d-442c-a5eb-ec6b639b0275"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:45:55 crc kubenswrapper[4903]: I0320 08:45:55.272743 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b26d1664-feac-422f-a058-7e4f798a7b45-kube-api-access-mcpp5" (OuterVolumeSpecName: "kube-api-access-mcpp5") pod "b26d1664-feac-422f-a058-7e4f798a7b45" (UID: "b26d1664-feac-422f-a058-7e4f798a7b45"). InnerVolumeSpecName "kube-api-access-mcpp5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:45:55 crc kubenswrapper[4903]: I0320 08:45:55.304940 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e210f8c-e29d-442c-a5eb-ec6b639b0275-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6e210f8c-e29d-442c-a5eb-ec6b639b0275" (UID: "6e210f8c-e29d-442c-a5eb-ec6b639b0275"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:45:55 crc kubenswrapper[4903]: I0320 08:45:55.336089 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b26d1664-feac-422f-a058-7e4f798a7b45-config-data" (OuterVolumeSpecName: "config-data") pod "b26d1664-feac-422f-a058-7e4f798a7b45" (UID: "b26d1664-feac-422f-a058-7e4f798a7b45"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:45:55 crc kubenswrapper[4903]: I0320 08:45:55.343835 4903 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6e210f8c-e29d-442c-a5eb-ec6b639b0275-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 20 08:45:55 crc kubenswrapper[4903]: I0320 08:45:55.343868 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mcpp5\" (UniqueName: \"kubernetes.io/projected/b26d1664-feac-422f-a058-7e4f798a7b45-kube-api-access-mcpp5\") on node \"crc\" DevicePath \"\"" Mar 20 08:45:55 crc kubenswrapper[4903]: I0320 08:45:55.343880 4903 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b26d1664-feac-422f-a058-7e4f798a7b45-logs\") on node \"crc\" DevicePath \"\"" Mar 20 08:45:55 crc kubenswrapper[4903]: I0320 08:45:55.343889 4903 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b26d1664-feac-422f-a058-7e4f798a7b45-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 08:45:55 crc kubenswrapper[4903]: I0320 08:45:55.343897 4903 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e210f8c-e29d-442c-a5eb-ec6b639b0275-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 08:45:55 crc kubenswrapper[4903]: I0320 08:45:55.343906 4903 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6e210f8c-e29d-442c-a5eb-ec6b639b0275-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 08:45:55 crc kubenswrapper[4903]: I0320 08:45:55.343913 4903 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b26d1664-feac-422f-a058-7e4f798a7b45-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 20 08:45:55 crc kubenswrapper[4903]: I0320 08:45:55.343922 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kcqt7\" (UniqueName: \"kubernetes.io/projected/6e210f8c-e29d-442c-a5eb-ec6b639b0275-kube-api-access-kcqt7\") on node \"crc\" DevicePath \"\"" Mar 
20 08:45:55 crc kubenswrapper[4903]: I0320 08:45:55.343930 4903 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6e210f8c-e29d-442c-a5eb-ec6b639b0275-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 08:45:55 crc kubenswrapper[4903]: I0320 08:45:55.346271 4903 scope.go:117] "RemoveContainer" containerID="8a6cd303bb413a19cc42c41d551839a5c95988228323fc457e9b609cb5d52414" Mar 20 08:45:55 crc kubenswrapper[4903]: I0320 08:45:55.385413 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b26d1664-feac-422f-a058-7e4f798a7b45-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b26d1664-feac-422f-a058-7e4f798a7b45" (UID: "b26d1664-feac-422f-a058-7e4f798a7b45"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:45:55 crc kubenswrapper[4903]: I0320 08:45:55.400856 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e210f8c-e29d-442c-a5eb-ec6b639b0275-config-data" (OuterVolumeSpecName: "config-data") pod "6e210f8c-e29d-442c-a5eb-ec6b639b0275" (UID: "6e210f8c-e29d-442c-a5eb-ec6b639b0275"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:45:55 crc kubenswrapper[4903]: E0320 08:45:55.444019 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceilometer-central-agent\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ceilometer-0" podUID="003c0ace-6aef-4bc2-bc02-358cf140d4ce" Mar 20 08:45:55 crc kubenswrapper[4903]: I0320 08:45:55.448341 4903 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b26d1664-feac-422f-a058-7e4f798a7b45-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 08:45:55 crc kubenswrapper[4903]: I0320 08:45:55.448380 4903 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e210f8c-e29d-442c-a5eb-ec6b639b0275-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 08:45:55 crc kubenswrapper[4903]: I0320 08:45:55.508832 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="caba59be-ac50-4fe8-9f72-cdbfe69ea01e" path="/var/lib/kubelet/pods/caba59be-ac50-4fe8-9f72-cdbfe69ea01e/volumes" Mar 20 08:45:55 crc kubenswrapper[4903]: I0320 08:45:55.567309 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-nl9ms" Mar 20 08:45:55 crc kubenswrapper[4903]: I0320 08:45:55.653749 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b95e8341-9f65-461f-891b-ec6512be57f7-dns-svc\") pod \"b95e8341-9f65-461f-891b-ec6512be57f7\" (UID: \"b95e8341-9f65-461f-891b-ec6512be57f7\") " Mar 20 08:45:55 crc kubenswrapper[4903]: I0320 08:45:55.653842 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b95e8341-9f65-461f-891b-ec6512be57f7-config\") pod \"b95e8341-9f65-461f-891b-ec6512be57f7\" (UID: \"b95e8341-9f65-461f-891b-ec6512be57f7\") " Mar 20 08:45:55 crc kubenswrapper[4903]: I0320 08:45:55.653862 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b95e8341-9f65-461f-891b-ec6512be57f7-ovsdbserver-nb\") pod \"b95e8341-9f65-461f-891b-ec6512be57f7\" (UID: \"b95e8341-9f65-461f-891b-ec6512be57f7\") " Mar 20 08:45:55 crc kubenswrapper[4903]: I0320 08:45:55.653926 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b95e8341-9f65-461f-891b-ec6512be57f7-dns-swift-storage-0\") pod \"b95e8341-9f65-461f-891b-ec6512be57f7\" (UID: \"b95e8341-9f65-461f-891b-ec6512be57f7\") " Mar 20 08:45:55 crc kubenswrapper[4903]: I0320 08:45:55.653999 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b95e8341-9f65-461f-891b-ec6512be57f7-ovsdbserver-sb\") pod \"b95e8341-9f65-461f-891b-ec6512be57f7\" (UID: \"b95e8341-9f65-461f-891b-ec6512be57f7\") " Mar 20 08:45:55 crc kubenswrapper[4903]: I0320 08:45:55.654094 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v9kmk\" (UniqueName: \"kubernetes.io/projected/b95e8341-9f65-461f-891b-ec6512be57f7-kube-api-access-v9kmk\") pod \"b95e8341-9f65-461f-891b-ec6512be57f7\" (UID: \"b95e8341-9f65-461f-891b-ec6512be57f7\") " Mar 20 08:45:55 crc kubenswrapper[4903]: I0320 08:45:55.702333 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b95e8341-9f65-461f-891b-ec6512be57f7-kube-api-access-v9kmk" (OuterVolumeSpecName: "kube-api-access-v9kmk") pod "b95e8341-9f65-461f-891b-ec6512be57f7" (UID: "b95e8341-9f65-461f-891b-ec6512be57f7"). InnerVolumeSpecName "kube-api-access-v9kmk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:45:55 crc kubenswrapper[4903]: I0320 08:45:55.708390 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b95e8341-9f65-461f-891b-ec6512be57f7-config" (OuterVolumeSpecName: "config") pod "b95e8341-9f65-461f-891b-ec6512be57f7" (UID: "b95e8341-9f65-461f-891b-ec6512be57f7"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:45:55 crc kubenswrapper[4903]: I0320 08:45:55.711356 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b95e8341-9f65-461f-891b-ec6512be57f7-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "b95e8341-9f65-461f-891b-ec6512be57f7" (UID: "b95e8341-9f65-461f-891b-ec6512be57f7"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:45:55 crc kubenswrapper[4903]: I0320 08:45:55.713392 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b95e8341-9f65-461f-891b-ec6512be57f7-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b95e8341-9f65-461f-891b-ec6512be57f7" (UID: "b95e8341-9f65-461f-891b-ec6512be57f7"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:45:55 crc kubenswrapper[4903]: I0320 08:45:55.715335 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b95e8341-9f65-461f-891b-ec6512be57f7-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "b95e8341-9f65-461f-891b-ec6512be57f7" (UID: "b95e8341-9f65-461f-891b-ec6512be57f7"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:45:55 crc kubenswrapper[4903]: I0320 08:45:55.724016 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b95e8341-9f65-461f-891b-ec6512be57f7-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "b95e8341-9f65-461f-891b-ec6512be57f7" (UID: "b95e8341-9f65-461f-891b-ec6512be57f7"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:45:55 crc kubenswrapper[4903]: I0320 08:45:55.759707 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v9kmk\" (UniqueName: \"kubernetes.io/projected/b95e8341-9f65-461f-891b-ec6512be57f7-kube-api-access-v9kmk\") on node \"crc\" DevicePath \"\"" Mar 20 08:45:55 crc kubenswrapper[4903]: I0320 08:45:55.759764 4903 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b95e8341-9f65-461f-891b-ec6512be57f7-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 08:45:55 crc kubenswrapper[4903]: I0320 08:45:55.759776 4903 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b95e8341-9f65-461f-891b-ec6512be57f7-config\") on node \"crc\" DevicePath \"\"" Mar 20 08:45:55 crc kubenswrapper[4903]: I0320 08:45:55.759787 4903 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b95e8341-9f65-461f-891b-ec6512be57f7-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 20 08:45:55 crc kubenswrapper[4903]: I0320 08:45:55.759795 4903 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b95e8341-9f65-461f-891b-ec6512be57f7-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 20 08:45:55 crc kubenswrapper[4903]: I0320 08:45:55.759805 4903 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b95e8341-9f65-461f-891b-ec6512be57f7-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 20 08:45:55 crc kubenswrapper[4903]: I0320 08:45:55.922939 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-v6rfn"] Mar 20 08:45:55 crc kubenswrapper[4903]: E0320 08:45:55.934339 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="caba59be-ac50-4fe8-9f72-cdbfe69ea01e" containerName="barbican-keystone-listener" Mar 20 08:45:55 crc kubenswrapper[4903]: I0320 08:45:55.934391 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="caba59be-ac50-4fe8-9f72-cdbfe69ea01e" 
containerName="barbican-keystone-listener" Mar 20 08:45:55 crc kubenswrapper[4903]: E0320 08:45:55.934414 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b95e8341-9f65-461f-891b-ec6512be57f7" containerName="dnsmasq-dns" Mar 20 08:45:55 crc kubenswrapper[4903]: I0320 08:45:55.934421 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="b95e8341-9f65-461f-891b-ec6512be57f7" containerName="dnsmasq-dns" Mar 20 08:45:55 crc kubenswrapper[4903]: E0320 08:45:55.934454 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e210f8c-e29d-442c-a5eb-ec6b639b0275" containerName="cinder-db-sync" Mar 20 08:45:55 crc kubenswrapper[4903]: I0320 08:45:55.934462 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e210f8c-e29d-442c-a5eb-ec6b639b0275" containerName="cinder-db-sync" Mar 20 08:45:55 crc kubenswrapper[4903]: E0320 08:45:55.934479 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b26d1664-feac-422f-a058-7e4f798a7b45" containerName="barbican-worker-log" Mar 20 08:45:55 crc kubenswrapper[4903]: I0320 08:45:55.934486 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="b26d1664-feac-422f-a058-7e4f798a7b45" containerName="barbican-worker-log" Mar 20 08:45:55 crc kubenswrapper[4903]: E0320 08:45:55.934503 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b95e8341-9f65-461f-891b-ec6512be57f7" containerName="init" Mar 20 08:45:55 crc kubenswrapper[4903]: I0320 08:45:55.934509 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="b95e8341-9f65-461f-891b-ec6512be57f7" containerName="init" Mar 20 08:45:55 crc kubenswrapper[4903]: E0320 08:45:55.934523 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b26d1664-feac-422f-a058-7e4f798a7b45" containerName="barbican-worker" Mar 20 08:45:55 crc kubenswrapper[4903]: I0320 08:45:55.934531 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="b26d1664-feac-422f-a058-7e4f798a7b45" containerName="barbican-worker" Mar 20 08:45:55 crc kubenswrapper[4903]: E0320 08:45:55.934552 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="caba59be-ac50-4fe8-9f72-cdbfe69ea01e" containerName="barbican-keystone-listener-log" Mar 20 08:45:55 crc kubenswrapper[4903]: I0320 08:45:55.934558 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="caba59be-ac50-4fe8-9f72-cdbfe69ea01e" containerName="barbican-keystone-listener-log" Mar 20 08:45:55 crc kubenswrapper[4903]: I0320 08:45:55.934997 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e210f8c-e29d-442c-a5eb-ec6b639b0275" containerName="cinder-db-sync" Mar 20 08:45:55 crc kubenswrapper[4903]: I0320 08:45:55.935051 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="b26d1664-feac-422f-a058-7e4f798a7b45" containerName="barbican-worker" Mar 20 08:45:55 crc kubenswrapper[4903]: I0320 08:45:55.935067 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="caba59be-ac50-4fe8-9f72-cdbfe69ea01e" containerName="barbican-keystone-listener-log" Mar 20 08:45:55 crc kubenswrapper[4903]: I0320 08:45:55.935077 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="b95e8341-9f65-461f-891b-ec6512be57f7" containerName="dnsmasq-dns" Mar 20 08:45:55 crc kubenswrapper[4903]: I0320 08:45:55.935099 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="caba59be-ac50-4fe8-9f72-cdbfe69ea01e" containerName="barbican-keystone-listener" Mar 20 08:45:55 crc kubenswrapper[4903]: I0320 08:45:55.935114 4903 
memory_manager.go:354] "RemoveStaleState removing state" podUID="b26d1664-feac-422f-a058-7e4f798a7b45" containerName="barbican-worker-log" Mar 20 08:45:55 crc kubenswrapper[4903]: I0320 08:45:55.940822 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-v6rfn"] Mar 20 08:45:55 crc kubenswrapper[4903]: I0320 08:45:55.940966 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-v6rfn" Mar 20 08:45:56 crc kubenswrapper[4903]: I0320 08:45:56.070457 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d46db123-b34a-40fe-b42d-846f9788745b-catalog-content\") pod \"certified-operators-v6rfn\" (UID: \"d46db123-b34a-40fe-b42d-846f9788745b\") " pod="openshift-marketplace/certified-operators-v6rfn" Mar 20 08:45:56 crc kubenswrapper[4903]: I0320 08:45:56.070892 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d46db123-b34a-40fe-b42d-846f9788745b-utilities\") pod \"certified-operators-v6rfn\" (UID: \"d46db123-b34a-40fe-b42d-846f9788745b\") " pod="openshift-marketplace/certified-operators-v6rfn" Mar 20 08:45:56 crc kubenswrapper[4903]: I0320 08:45:56.071146 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zmhht\" (UniqueName: \"kubernetes.io/projected/d46db123-b34a-40fe-b42d-846f9788745b-kube-api-access-zmhht\") pod \"certified-operators-v6rfn\" (UID: \"d46db123-b34a-40fe-b42d-846f9788745b\") " pod="openshift-marketplace/certified-operators-v6rfn" Mar 20 08:45:56 crc kubenswrapper[4903]: I0320 08:45:56.172905 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d46db123-b34a-40fe-b42d-846f9788745b-utilities\") pod \"certified-operators-v6rfn\" (UID: \"d46db123-b34a-40fe-b42d-846f9788745b\") " pod="openshift-marketplace/certified-operators-v6rfn" Mar 20 08:45:56 crc kubenswrapper[4903]: I0320 08:45:56.172968 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zmhht\" (UniqueName: \"kubernetes.io/projected/d46db123-b34a-40fe-b42d-846f9788745b-kube-api-access-zmhht\") pod \"certified-operators-v6rfn\" (UID: \"d46db123-b34a-40fe-b42d-846f9788745b\") " pod="openshift-marketplace/certified-operators-v6rfn" Mar 20 08:45:56 crc kubenswrapper[4903]: I0320 08:45:56.173044 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d46db123-b34a-40fe-b42d-846f9788745b-catalog-content\") pod \"certified-operators-v6rfn\" (UID: \"d46db123-b34a-40fe-b42d-846f9788745b\") " pod="openshift-marketplace/certified-operators-v6rfn" Mar 20 08:45:56 crc kubenswrapper[4903]: I0320 08:45:56.173676 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d46db123-b34a-40fe-b42d-846f9788745b-utilities\") pod \"certified-operators-v6rfn\" (UID: \"d46db123-b34a-40fe-b42d-846f9788745b\") " pod="openshift-marketplace/certified-operators-v6rfn" Mar 20 08:45:56 crc kubenswrapper[4903]: I0320 08:45:56.173696 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/d46db123-b34a-40fe-b42d-846f9788745b-catalog-content\") pod \"certified-operators-v6rfn\" (UID: \"d46db123-b34a-40fe-b42d-846f9788745b\") " pod="openshift-marketplace/certified-operators-v6rfn" Mar 20 08:45:56 crc kubenswrapper[4903]: I0320 08:45:56.178404 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"003c0ace-6aef-4bc2-bc02-358cf140d4ce","Type":"ContainerStarted","Data":"b7ed8ed55adb45a5ac94b9c22cbdffbb71e84ecce0530a064400b927315e05bd"} Mar 20 08:45:56 crc kubenswrapper[4903]: I0320 08:45:56.178566 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="003c0ace-6aef-4bc2-bc02-358cf140d4ce" containerName="ceilometer-notification-agent" containerID="cri-o://5f6acda290853c0679341a80d768ac082e84e70b79924dbd01bab6e923434f1e" gracePeriod=30 Mar 20 08:45:56 crc kubenswrapper[4903]: I0320 08:45:56.178599 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 20 08:45:56 crc kubenswrapper[4903]: I0320 08:45:56.178660 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="003c0ace-6aef-4bc2-bc02-358cf140d4ce" containerName="proxy-httpd" containerID="cri-o://b7ed8ed55adb45a5ac94b9c22cbdffbb71e84ecce0530a064400b927315e05bd" gracePeriod=30 Mar 20 08:45:56 crc kubenswrapper[4903]: I0320 08:45:56.178766 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="003c0ace-6aef-4bc2-bc02-358cf140d4ce" containerName="sg-core" containerID="cri-o://f62844ed56c7e0157412ee30b02d4935401b05855b172e4f150a32074e1b5271" gracePeriod=30 Mar 20 08:45:56 crc kubenswrapper[4903]: I0320 08:45:56.187892 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2ndsj" event={"ID":"0e67af70-4211-4077-8f2b-0a00b8069e5a","Type":"ContainerStarted","Data":"6f3554d39d020685d0868b6a191ed76faf2b526f6e3fae809ae7af6a7a7f9269"} Mar 20 08:45:56 crc kubenswrapper[4903]: I0320 08:45:56.199467 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-nl9ms" event={"ID":"b95e8341-9f65-461f-891b-ec6512be57f7","Type":"ContainerDied","Data":"6fbdff3a1b304d2fc79262f059cb1b0d998e9205aa0ca07bd62ea57418425d9f"} Mar 20 08:45:56 crc kubenswrapper[4903]: I0320 08:45:56.199795 4903 scope.go:117] "RemoveContainer" containerID="49c1a0f1b9ba0addd74e68afa439b435d66004b7b86a4a212551513668453686" Mar 20 08:45:56 crc kubenswrapper[4903]: I0320 08:45:56.199518 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-nl9ms" Mar 20 08:45:56 crc kubenswrapper[4903]: I0320 08:45:56.206652 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zmhht\" (UniqueName: \"kubernetes.io/projected/d46db123-b34a-40fe-b42d-846f9788745b-kube-api-access-zmhht\") pod \"certified-operators-v6rfn\" (UID: \"d46db123-b34a-40fe-b42d-846f9788745b\") " pod="openshift-marketplace/certified-operators-v6rfn" Mar 20 08:45:56 crc kubenswrapper[4903]: I0320 08:45:56.225163 4903 scope.go:117] "RemoveContainer" containerID="e0f512692c5d2627130fb72028dde3138ce12d203c6f16dd5bb9fe2db0ecf739" Mar 20 08:45:56 crc kubenswrapper[4903]: I0320 08:45:56.255979 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-nl9ms"] Mar 20 08:45:56 crc kubenswrapper[4903]: I0320 08:45:56.261058 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-nl9ms"] Mar 20 08:45:56 crc kubenswrapper[4903]: I0320 08:45:56.268721 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-v6rfn" Mar 20 08:45:56 crc kubenswrapper[4903]: I0320 08:45:56.476390 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-9g5kv"] Mar 20 08:45:56 crc kubenswrapper[4903]: I0320 08:45:56.478908 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-9g5kv" Mar 20 08:45:56 crc kubenswrapper[4903]: I0320 08:45:56.536362 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-9g5kv"] Mar 20 08:45:56 crc kubenswrapper[4903]: I0320 08:45:56.555843 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Mar 20 08:45:56 crc kubenswrapper[4903]: I0320 08:45:56.558229 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 20 08:45:56 crc kubenswrapper[4903]: I0320 08:45:56.599339 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-fm82s" Mar 20 08:45:56 crc kubenswrapper[4903]: I0320 08:45:56.599635 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Mar 20 08:45:56 crc kubenswrapper[4903]: I0320 08:45:56.599759 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Mar 20 08:45:56 crc kubenswrapper[4903]: I0320 08:45:56.599880 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Mar 20 08:45:56 crc kubenswrapper[4903]: I0320 08:45:56.634912 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 20 08:45:56 crc kubenswrapper[4903]: I0320 08:45:56.702544 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f94af42-d7a9-437c-a74b-5d63fcd63a50-config\") pod \"dnsmasq-dns-5c9776ccc5-9g5kv\" (UID: \"6f94af42-d7a9-437c-a74b-5d63fcd63a50\") " pod="openstack/dnsmasq-dns-5c9776ccc5-9g5kv" Mar 20 08:45:56 crc kubenswrapper[4903]: I0320 08:45:56.702617 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-49828\" (UniqueName: \"kubernetes.io/projected/6f94af42-d7a9-437c-a74b-5d63fcd63a50-kube-api-access-49828\") pod \"dnsmasq-dns-5c9776ccc5-9g5kv\" (UID: \"6f94af42-d7a9-437c-a74b-5d63fcd63a50\") " pod="openstack/dnsmasq-dns-5c9776ccc5-9g5kv" Mar 20 08:45:56 crc kubenswrapper[4903]: I0320 08:45:56.702685 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6f94af42-d7a9-437c-a74b-5d63fcd63a50-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-9g5kv\" (UID: \"6f94af42-d7a9-437c-a74b-5d63fcd63a50\") " pod="openstack/dnsmasq-dns-5c9776ccc5-9g5kv" Mar 20 08:45:56 crc kubenswrapper[4903]: I0320 08:45:56.702781 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6f94af42-d7a9-437c-a74b-5d63fcd63a50-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-9g5kv\" (UID: \"6f94af42-d7a9-437c-a74b-5d63fcd63a50\") " pod="openstack/dnsmasq-dns-5c9776ccc5-9g5kv" Mar 20 08:45:56 crc kubenswrapper[4903]: I0320 08:45:56.702900 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6f94af42-d7a9-437c-a74b-5d63fcd63a50-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-9g5kv\" (UID: \"6f94af42-d7a9-437c-a74b-5d63fcd63a50\") " pod="openstack/dnsmasq-dns-5c9776ccc5-9g5kv" Mar 20 08:45:56 crc kubenswrapper[4903]: I0320 08:45:56.702975 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6f94af42-d7a9-437c-a74b-5d63fcd63a50-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-9g5kv\" (UID: \"6f94af42-d7a9-437c-a74b-5d63fcd63a50\") " pod="openstack/dnsmasq-dns-5c9776ccc5-9g5kv" Mar 20 08:45:56 crc kubenswrapper[4903]: I0320 08:45:56.730311 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Mar 20 08:45:56 crc kubenswrapper[4903]: I0320 08:45:56.736051 4903 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 20 08:45:56 crc kubenswrapper[4903]: I0320 08:45:56.740526 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Mar 20 08:45:56 crc kubenswrapper[4903]: I0320 08:45:56.755308 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 20 08:45:56 crc kubenswrapper[4903]: I0320 08:45:56.804280 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c84a69c5-84f8-42d3-a383-4f1c5eb17b5e-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"c84a69c5-84f8-42d3-a383-4f1c5eb17b5e\") " pod="openstack/cinder-scheduler-0" Mar 20 08:45:56 crc kubenswrapper[4903]: I0320 08:45:56.804580 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6f94af42-d7a9-437c-a74b-5d63fcd63a50-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-9g5kv\" (UID: \"6f94af42-d7a9-437c-a74b-5d63fcd63a50\") " pod="openstack/dnsmasq-dns-5c9776ccc5-9g5kv" Mar 20 08:45:56 crc kubenswrapper[4903]: I0320 08:45:56.804606 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c84a69c5-84f8-42d3-a383-4f1c5eb17b5e-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"c84a69c5-84f8-42d3-a383-4f1c5eb17b5e\") " pod="openstack/cinder-scheduler-0" Mar 20 08:45:56 crc kubenswrapper[4903]: I0320 08:45:56.804625 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c84a69c5-84f8-42d3-a383-4f1c5eb17b5e-scripts\") pod \"cinder-scheduler-0\" (UID: \"c84a69c5-84f8-42d3-a383-4f1c5eb17b5e\") " pod="openstack/cinder-scheduler-0" Mar 20 08:45:56 crc kubenswrapper[4903]: I0320 08:45:56.804648 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c84a69c5-84f8-42d3-a383-4f1c5eb17b5e-config-data\") pod \"cinder-scheduler-0\" (UID: \"c84a69c5-84f8-42d3-a383-4f1c5eb17b5e\") " pod="openstack/cinder-scheduler-0" Mar 20 08:45:56 crc kubenswrapper[4903]: I0320 08:45:56.804670 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f94af42-d7a9-437c-a74b-5d63fcd63a50-config\") pod \"dnsmasq-dns-5c9776ccc5-9g5kv\" (UID: \"6f94af42-d7a9-437c-a74b-5d63fcd63a50\") " pod="openstack/dnsmasq-dns-5c9776ccc5-9g5kv" Mar 20 08:45:56 crc kubenswrapper[4903]: I0320 08:45:56.804687 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-49828\" (UniqueName: \"kubernetes.io/projected/6f94af42-d7a9-437c-a74b-5d63fcd63a50-kube-api-access-49828\") pod \"dnsmasq-dns-5c9776ccc5-9g5kv\" (UID: \"6f94af42-d7a9-437c-a74b-5d63fcd63a50\") " pod="openstack/dnsmasq-dns-5c9776ccc5-9g5kv" Mar 20 08:45:56 crc kubenswrapper[4903]: I0320 08:45:56.804717 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6f94af42-d7a9-437c-a74b-5d63fcd63a50-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-9g5kv\" (UID: \"6f94af42-d7a9-437c-a74b-5d63fcd63a50\") " pod="openstack/dnsmasq-dns-5c9776ccc5-9g5kv" Mar 20 08:45:56 crc kubenswrapper[4903]: I0320 
08:45:56.804752 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c84a69c5-84f8-42d3-a383-4f1c5eb17b5e-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"c84a69c5-84f8-42d3-a383-4f1c5eb17b5e\") " pod="openstack/cinder-scheduler-0" Mar 20 08:45:56 crc kubenswrapper[4903]: I0320 08:45:56.804786 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6f94af42-d7a9-437c-a74b-5d63fcd63a50-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-9g5kv\" (UID: \"6f94af42-d7a9-437c-a74b-5d63fcd63a50\") " pod="openstack/dnsmasq-dns-5c9776ccc5-9g5kv" Mar 20 08:45:56 crc kubenswrapper[4903]: I0320 08:45:56.804837 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8x8sf\" (UniqueName: \"kubernetes.io/projected/c84a69c5-84f8-42d3-a383-4f1c5eb17b5e-kube-api-access-8x8sf\") pod \"cinder-scheduler-0\" (UID: \"c84a69c5-84f8-42d3-a383-4f1c5eb17b5e\") " pod="openstack/cinder-scheduler-0" Mar 20 08:45:56 crc kubenswrapper[4903]: I0320 08:45:56.804864 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6f94af42-d7a9-437c-a74b-5d63fcd63a50-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-9g5kv\" (UID: \"6f94af42-d7a9-437c-a74b-5d63fcd63a50\") " pod="openstack/dnsmasq-dns-5c9776ccc5-9g5kv" Mar 20 08:45:56 crc kubenswrapper[4903]: I0320 08:45:56.805913 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6f94af42-d7a9-437c-a74b-5d63fcd63a50-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-9g5kv\" (UID: \"6f94af42-d7a9-437c-a74b-5d63fcd63a50\") " pod="openstack/dnsmasq-dns-5c9776ccc5-9g5kv" Mar 20 08:45:56 crc kubenswrapper[4903]: I0320 08:45:56.805943 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f94af42-d7a9-437c-a74b-5d63fcd63a50-config\") pod \"dnsmasq-dns-5c9776ccc5-9g5kv\" (UID: \"6f94af42-d7a9-437c-a74b-5d63fcd63a50\") " pod="openstack/dnsmasq-dns-5c9776ccc5-9g5kv" Mar 20 08:45:56 crc kubenswrapper[4903]: I0320 08:45:56.806543 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6f94af42-d7a9-437c-a74b-5d63fcd63a50-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-9g5kv\" (UID: \"6f94af42-d7a9-437c-a74b-5d63fcd63a50\") " pod="openstack/dnsmasq-dns-5c9776ccc5-9g5kv" Mar 20 08:45:56 crc kubenswrapper[4903]: I0320 08:45:56.807179 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6f94af42-d7a9-437c-a74b-5d63fcd63a50-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-9g5kv\" (UID: \"6f94af42-d7a9-437c-a74b-5d63fcd63a50\") " pod="openstack/dnsmasq-dns-5c9776ccc5-9g5kv" Mar 20 08:45:56 crc kubenswrapper[4903]: I0320 08:45:56.807195 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6f94af42-d7a9-437c-a74b-5d63fcd63a50-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-9g5kv\" (UID: \"6f94af42-d7a9-437c-a74b-5d63fcd63a50\") " pod="openstack/dnsmasq-dns-5c9776ccc5-9g5kv" Mar 20 08:45:56 crc kubenswrapper[4903]: I0320 08:45:56.842020 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-49828\" (UniqueName: \"kubernetes.io/projected/6f94af42-d7a9-437c-a74b-5d63fcd63a50-kube-api-access-49828\") pod \"dnsmasq-dns-5c9776ccc5-9g5kv\" (UID: \"6f94af42-d7a9-437c-a74b-5d63fcd63a50\") " pod="openstack/dnsmasq-dns-5c9776ccc5-9g5kv" Mar 20 08:45:56 crc kubenswrapper[4903]: I0320 08:45:56.906118 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/482e4efe-5d05-4aa5-b77c-a366193c4b50-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"482e4efe-5d05-4aa5-b77c-a366193c4b50\") " pod="openstack/cinder-api-0" Mar 20 08:45:56 crc kubenswrapper[4903]: I0320 08:45:56.906178 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/482e4efe-5d05-4aa5-b77c-a366193c4b50-etc-machine-id\") pod \"cinder-api-0\" (UID: \"482e4efe-5d05-4aa5-b77c-a366193c4b50\") " pod="openstack/cinder-api-0" Mar 20 08:45:56 crc kubenswrapper[4903]: I0320 08:45:56.906198 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/482e4efe-5d05-4aa5-b77c-a366193c4b50-config-data\") pod \"cinder-api-0\" (UID: \"482e4efe-5d05-4aa5-b77c-a366193c4b50\") " pod="openstack/cinder-api-0" Mar 20 08:45:56 crc kubenswrapper[4903]: I0320 08:45:56.906232 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c84a69c5-84f8-42d3-a383-4f1c5eb17b5e-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"c84a69c5-84f8-42d3-a383-4f1c5eb17b5e\") " pod="openstack/cinder-scheduler-0" Mar 20 08:45:56 crc kubenswrapper[4903]: I0320 08:45:56.906249 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/482e4efe-5d05-4aa5-b77c-a366193c4b50-config-data-custom\") pod \"cinder-api-0\" (UID: \"482e4efe-5d05-4aa5-b77c-a366193c4b50\") " pod="openstack/cinder-api-0" Mar 20 08:45:56 crc kubenswrapper[4903]: I0320 08:45:56.906275 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q4w8p\" (UniqueName: \"kubernetes.io/projected/482e4efe-5d05-4aa5-b77c-a366193c4b50-kube-api-access-q4w8p\") pod \"cinder-api-0\" (UID: \"482e4efe-5d05-4aa5-b77c-a366193c4b50\") " pod="openstack/cinder-api-0" Mar 20 08:45:56 crc kubenswrapper[4903]: I0320 08:45:56.906329 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8x8sf\" (UniqueName: \"kubernetes.io/projected/c84a69c5-84f8-42d3-a383-4f1c5eb17b5e-kube-api-access-8x8sf\") pod \"cinder-scheduler-0\" (UID: \"c84a69c5-84f8-42d3-a383-4f1c5eb17b5e\") " pod="openstack/cinder-scheduler-0" Mar 20 08:45:56 crc kubenswrapper[4903]: I0320 08:45:56.906360 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/482e4efe-5d05-4aa5-b77c-a366193c4b50-scripts\") pod \"cinder-api-0\" (UID: \"482e4efe-5d05-4aa5-b77c-a366193c4b50\") " pod="openstack/cinder-api-0" Mar 20 08:45:56 crc kubenswrapper[4903]: I0320 08:45:56.906380 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c84a69c5-84f8-42d3-a383-4f1c5eb17b5e-etc-machine-id\") pod \"cinder-scheduler-0\" 
(UID: \"c84a69c5-84f8-42d3-a383-4f1c5eb17b5e\") " pod="openstack/cinder-scheduler-0" Mar 20 08:45:56 crc kubenswrapper[4903]: I0320 08:45:56.906405 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c84a69c5-84f8-42d3-a383-4f1c5eb17b5e-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"c84a69c5-84f8-42d3-a383-4f1c5eb17b5e\") " pod="openstack/cinder-scheduler-0" Mar 20 08:45:56 crc kubenswrapper[4903]: I0320 08:45:56.906422 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/482e4efe-5d05-4aa5-b77c-a366193c4b50-logs\") pod \"cinder-api-0\" (UID: \"482e4efe-5d05-4aa5-b77c-a366193c4b50\") " pod="openstack/cinder-api-0" Mar 20 08:45:56 crc kubenswrapper[4903]: I0320 08:45:56.906442 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c84a69c5-84f8-42d3-a383-4f1c5eb17b5e-scripts\") pod \"cinder-scheduler-0\" (UID: \"c84a69c5-84f8-42d3-a383-4f1c5eb17b5e\") " pod="openstack/cinder-scheduler-0" Mar 20 08:45:56 crc kubenswrapper[4903]: I0320 08:45:56.906460 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c84a69c5-84f8-42d3-a383-4f1c5eb17b5e-config-data\") pod \"cinder-scheduler-0\" (UID: \"c84a69c5-84f8-42d3-a383-4f1c5eb17b5e\") " pod="openstack/cinder-scheduler-0" Mar 20 08:45:56 crc kubenswrapper[4903]: I0320 08:45:56.907478 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c84a69c5-84f8-42d3-a383-4f1c5eb17b5e-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"c84a69c5-84f8-42d3-a383-4f1c5eb17b5e\") " pod="openstack/cinder-scheduler-0" Mar 20 08:45:56 crc kubenswrapper[4903]: I0320 08:45:56.915331 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c84a69c5-84f8-42d3-a383-4f1c5eb17b5e-config-data\") pod \"cinder-scheduler-0\" (UID: \"c84a69c5-84f8-42d3-a383-4f1c5eb17b5e\") " pod="openstack/cinder-scheduler-0" Mar 20 08:45:56 crc kubenswrapper[4903]: I0320 08:45:56.916084 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c84a69c5-84f8-42d3-a383-4f1c5eb17b5e-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"c84a69c5-84f8-42d3-a383-4f1c5eb17b5e\") " pod="openstack/cinder-scheduler-0" Mar 20 08:45:56 crc kubenswrapper[4903]: I0320 08:45:56.918679 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c84a69c5-84f8-42d3-a383-4f1c5eb17b5e-scripts\") pod \"cinder-scheduler-0\" (UID: \"c84a69c5-84f8-42d3-a383-4f1c5eb17b5e\") " pod="openstack/cinder-scheduler-0" Mar 20 08:45:56 crc kubenswrapper[4903]: I0320 08:45:56.933483 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c84a69c5-84f8-42d3-a383-4f1c5eb17b5e-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"c84a69c5-84f8-42d3-a383-4f1c5eb17b5e\") " pod="openstack/cinder-scheduler-0" Mar 20 08:45:56 crc kubenswrapper[4903]: I0320 08:45:56.948006 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8x8sf\" (UniqueName: 
\"kubernetes.io/projected/c84a69c5-84f8-42d3-a383-4f1c5eb17b5e-kube-api-access-8x8sf\") pod \"cinder-scheduler-0\" (UID: \"c84a69c5-84f8-42d3-a383-4f1c5eb17b5e\") " pod="openstack/cinder-scheduler-0" Mar 20 08:45:57 crc kubenswrapper[4903]: I0320 08:45:57.009974 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/482e4efe-5d05-4aa5-b77c-a366193c4b50-logs\") pod \"cinder-api-0\" (UID: \"482e4efe-5d05-4aa5-b77c-a366193c4b50\") " pod="openstack/cinder-api-0" Mar 20 08:45:57 crc kubenswrapper[4903]: I0320 08:45:57.010077 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/482e4efe-5d05-4aa5-b77c-a366193c4b50-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"482e4efe-5d05-4aa5-b77c-a366193c4b50\") " pod="openstack/cinder-api-0" Mar 20 08:45:57 crc kubenswrapper[4903]: I0320 08:45:57.010105 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/482e4efe-5d05-4aa5-b77c-a366193c4b50-etc-machine-id\") pod \"cinder-api-0\" (UID: \"482e4efe-5d05-4aa5-b77c-a366193c4b50\") " pod="openstack/cinder-api-0" Mar 20 08:45:57 crc kubenswrapper[4903]: I0320 08:45:57.010119 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/482e4efe-5d05-4aa5-b77c-a366193c4b50-config-data\") pod \"cinder-api-0\" (UID: \"482e4efe-5d05-4aa5-b77c-a366193c4b50\") " pod="openstack/cinder-api-0" Mar 20 08:45:57 crc kubenswrapper[4903]: I0320 08:45:57.010221 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/482e4efe-5d05-4aa5-b77c-a366193c4b50-etc-machine-id\") pod \"cinder-api-0\" (UID: \"482e4efe-5d05-4aa5-b77c-a366193c4b50\") " pod="openstack/cinder-api-0" Mar 20 08:45:57 crc kubenswrapper[4903]: I0320 08:45:57.011190 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/482e4efe-5d05-4aa5-b77c-a366193c4b50-logs\") pod \"cinder-api-0\" (UID: \"482e4efe-5d05-4aa5-b77c-a366193c4b50\") " pod="openstack/cinder-api-0" Mar 20 08:45:57 crc kubenswrapper[4903]: I0320 08:45:57.011238 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/482e4efe-5d05-4aa5-b77c-a366193c4b50-config-data-custom\") pod \"cinder-api-0\" (UID: \"482e4efe-5d05-4aa5-b77c-a366193c4b50\") " pod="openstack/cinder-api-0" Mar 20 08:45:57 crc kubenswrapper[4903]: I0320 08:45:57.011276 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q4w8p\" (UniqueName: \"kubernetes.io/projected/482e4efe-5d05-4aa5-b77c-a366193c4b50-kube-api-access-q4w8p\") pod \"cinder-api-0\" (UID: \"482e4efe-5d05-4aa5-b77c-a366193c4b50\") " pod="openstack/cinder-api-0" Mar 20 08:45:57 crc kubenswrapper[4903]: I0320 08:45:57.011377 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/482e4efe-5d05-4aa5-b77c-a366193c4b50-scripts\") pod \"cinder-api-0\" (UID: \"482e4efe-5d05-4aa5-b77c-a366193c4b50\") " pod="openstack/cinder-api-0" Mar 20 08:45:57 crc kubenswrapper[4903]: I0320 08:45:57.018841 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/482e4efe-5d05-4aa5-b77c-a366193c4b50-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"482e4efe-5d05-4aa5-b77c-a366193c4b50\") " pod="openstack/cinder-api-0" Mar 20 08:45:57 crc kubenswrapper[4903]: I0320 08:45:57.028370 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/482e4efe-5d05-4aa5-b77c-a366193c4b50-config-data\") pod \"cinder-api-0\" (UID: \"482e4efe-5d05-4aa5-b77c-a366193c4b50\") " pod="openstack/cinder-api-0" Mar 20 08:45:57 crc kubenswrapper[4903]: I0320 08:45:57.029120 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/482e4efe-5d05-4aa5-b77c-a366193c4b50-config-data-custom\") pod \"cinder-api-0\" (UID: \"482e4efe-5d05-4aa5-b77c-a366193c4b50\") " pod="openstack/cinder-api-0" Mar 20 08:45:57 crc kubenswrapper[4903]: I0320 08:45:57.037826 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/482e4efe-5d05-4aa5-b77c-a366193c4b50-scripts\") pod \"cinder-api-0\" (UID: \"482e4efe-5d05-4aa5-b77c-a366193c4b50\") " pod="openstack/cinder-api-0" Mar 20 08:45:57 crc kubenswrapper[4903]: I0320 08:45:57.051726 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q4w8p\" (UniqueName: \"kubernetes.io/projected/482e4efe-5d05-4aa5-b77c-a366193c4b50-kube-api-access-q4w8p\") pod \"cinder-api-0\" (UID: \"482e4efe-5d05-4aa5-b77c-a366193c4b50\") " pod="openstack/cinder-api-0" Mar 20 08:45:57 crc kubenswrapper[4903]: I0320 08:45:57.079687 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 20 08:45:57 crc kubenswrapper[4903]: I0320 08:45:57.092504 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-v6rfn"] Mar 20 08:45:57 crc kubenswrapper[4903]: I0320 08:45:57.146134 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-9g5kv" Mar 20 08:45:57 crc kubenswrapper[4903]: I0320 08:45:57.236301 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v6rfn" event={"ID":"d46db123-b34a-40fe-b42d-846f9788745b","Type":"ContainerStarted","Data":"1ee24c6bf81756a537defbc2f18475f633ba90bcbf60b8f5bc349a20cb2df27f"} Mar 20 08:45:57 crc kubenswrapper[4903]: I0320 08:45:57.243961 4903 generic.go:334] "Generic (PLEG): container finished" podID="003c0ace-6aef-4bc2-bc02-358cf140d4ce" containerID="b7ed8ed55adb45a5ac94b9c22cbdffbb71e84ecce0530a064400b927315e05bd" exitCode=0 Mar 20 08:45:57 crc kubenswrapper[4903]: I0320 08:45:57.243992 4903 generic.go:334] "Generic (PLEG): container finished" podID="003c0ace-6aef-4bc2-bc02-358cf140d4ce" containerID="f62844ed56c7e0157412ee30b02d4935401b05855b172e4f150a32074e1b5271" exitCode=2 Mar 20 08:45:57 crc kubenswrapper[4903]: I0320 08:45:57.244190 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"003c0ace-6aef-4bc2-bc02-358cf140d4ce","Type":"ContainerDied","Data":"b7ed8ed55adb45a5ac94b9c22cbdffbb71e84ecce0530a064400b927315e05bd"} Mar 20 08:45:57 crc kubenswrapper[4903]: I0320 08:45:57.244305 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"003c0ace-6aef-4bc2-bc02-358cf140d4ce","Type":"ContainerDied","Data":"f62844ed56c7e0157412ee30b02d4935401b05855b172e4f150a32074e1b5271"} Mar 20 08:45:57 crc kubenswrapper[4903]: I0320 08:45:57.244741 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 20 08:45:57 crc kubenswrapper[4903]: I0320 08:45:57.545512 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b95e8341-9f65-461f-891b-ec6512be57f7" path="/var/lib/kubelet/pods/b95e8341-9f65-461f-891b-ec6512be57f7/volumes" Mar 20 08:45:57 crc kubenswrapper[4903]: I0320 08:45:57.885569 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 20 08:45:58 crc kubenswrapper[4903]: I0320 08:45:58.230730 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-9g5kv"] Mar 20 08:45:58 crc kubenswrapper[4903]: I0320 08:45:58.269072 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"482e4efe-5d05-4aa5-b77c-a366193c4b50","Type":"ContainerStarted","Data":"2a71cfad819a9664900d508ed1b8a40864520b0ab9dbe7da37ac890a3b97a6c6"} Mar 20 08:45:58 crc kubenswrapper[4903]: I0320 08:45:58.277512 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-9g5kv" event={"ID":"6f94af42-d7a9-437c-a74b-5d63fcd63a50","Type":"ContainerStarted","Data":"b4f7c2f895a5fe2ca3dbcc01938495afa9e17dc1301bad958c681edcc5c8a140"} Mar 20 08:45:58 crc kubenswrapper[4903]: I0320 08:45:58.280637 4903 generic.go:334] "Generic (PLEG): container finished" podID="d46db123-b34a-40fe-b42d-846f9788745b" containerID="75395f893b48be59eb3ba0cd9eb0f5972519ef936a08c8edeb0b04a0d9bbbce9" exitCode=0 Mar 20 08:45:58 crc kubenswrapper[4903]: I0320 08:45:58.280707 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v6rfn" event={"ID":"d46db123-b34a-40fe-b42d-846f9788745b","Type":"ContainerDied","Data":"75395f893b48be59eb3ba0cd9eb0f5972519ef936a08c8edeb0b04a0d9bbbce9"} Mar 20 08:45:58 crc kubenswrapper[4903]: I0320 08:45:58.293405 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"003c0ace-6aef-4bc2-bc02-358cf140d4ce","Type":"ContainerDied","Data":"5f6acda290853c0679341a80d768ac082e84e70b79924dbd01bab6e923434f1e"} Mar 20 08:45:58 crc kubenswrapper[4903]: I0320 08:45:58.293620 4903 generic.go:334] "Generic (PLEG): container finished" podID="003c0ace-6aef-4bc2-bc02-358cf140d4ce" containerID="5f6acda290853c0679341a80d768ac082e84e70b79924dbd01bab6e923434f1e" exitCode=0 Mar 20 08:45:58 crc kubenswrapper[4903]: I0320 08:45:58.359877 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 20 08:45:58 crc kubenswrapper[4903]: W0320 08:45:58.374909 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc84a69c5_84f8_42d3_a383_4f1c5eb17b5e.slice/crio-a0c47778c1d6928c802f192c1be4632e4a3fd7b53e8bb58217093e5add3d252f WatchSource:0}: Error finding container a0c47778c1d6928c802f192c1be4632e4a3fd7b53e8bb58217093e5add3d252f: Status 404 returned error can't find the container with id a0c47778c1d6928c802f192c1be4632e4a3fd7b53e8bb58217093e5add3d252f Mar 20 08:45:58 crc kubenswrapper[4903]: I0320 08:45:58.544139 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 08:45:58 crc kubenswrapper[4903]: I0320 08:45:58.575223 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/003c0ace-6aef-4bc2-bc02-358cf140d4ce-config-data\") pod \"003c0ace-6aef-4bc2-bc02-358cf140d4ce\" (UID: \"003c0ace-6aef-4bc2-bc02-358cf140d4ce\") " Mar 20 08:45:58 crc kubenswrapper[4903]: I0320 08:45:58.575270 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/003c0ace-6aef-4bc2-bc02-358cf140d4ce-scripts\") pod \"003c0ace-6aef-4bc2-bc02-358cf140d4ce\" (UID: \"003c0ace-6aef-4bc2-bc02-358cf140d4ce\") " Mar 20 08:45:58 crc kubenswrapper[4903]: I0320 08:45:58.575305 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/003c0ace-6aef-4bc2-bc02-358cf140d4ce-log-httpd\") pod \"003c0ace-6aef-4bc2-bc02-358cf140d4ce\" (UID: \"003c0ace-6aef-4bc2-bc02-358cf140d4ce\") " Mar 20 08:45:58 crc kubenswrapper[4903]: I0320 08:45:58.575331 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/003c0ace-6aef-4bc2-bc02-358cf140d4ce-run-httpd\") pod \"003c0ace-6aef-4bc2-bc02-358cf140d4ce\" (UID: \"003c0ace-6aef-4bc2-bc02-358cf140d4ce\") " Mar 20 08:45:58 crc kubenswrapper[4903]: I0320 08:45:58.575373 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/003c0ace-6aef-4bc2-bc02-358cf140d4ce-sg-core-conf-yaml\") pod \"003c0ace-6aef-4bc2-bc02-358cf140d4ce\" (UID: \"003c0ace-6aef-4bc2-bc02-358cf140d4ce\") " Mar 20 08:45:58 crc kubenswrapper[4903]: I0320 08:45:58.575489 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/003c0ace-6aef-4bc2-bc02-358cf140d4ce-combined-ca-bundle\") pod \"003c0ace-6aef-4bc2-bc02-358cf140d4ce\" (UID: \"003c0ace-6aef-4bc2-bc02-358cf140d4ce\") " Mar 20 08:45:58 crc kubenswrapper[4903]: I0320 08:45:58.575538 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-djzj6\" (UniqueName: \"kubernetes.io/projected/003c0ace-6aef-4bc2-bc02-358cf140d4ce-kube-api-access-djzj6\") pod \"003c0ace-6aef-4bc2-bc02-358cf140d4ce\" (UID: \"003c0ace-6aef-4bc2-bc02-358cf140d4ce\") " Mar 20 08:45:58 crc kubenswrapper[4903]: I0320 08:45:58.577131 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/003c0ace-6aef-4bc2-bc02-358cf140d4ce-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "003c0ace-6aef-4bc2-bc02-358cf140d4ce" (UID: "003c0ace-6aef-4bc2-bc02-358cf140d4ce"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 08:45:58 crc kubenswrapper[4903]: I0320 08:45:58.577567 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/003c0ace-6aef-4bc2-bc02-358cf140d4ce-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "003c0ace-6aef-4bc2-bc02-358cf140d4ce" (UID: "003c0ace-6aef-4bc2-bc02-358cf140d4ce"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 08:45:58 crc kubenswrapper[4903]: I0320 08:45:58.592423 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/003c0ace-6aef-4bc2-bc02-358cf140d4ce-scripts" (OuterVolumeSpecName: "scripts") pod "003c0ace-6aef-4bc2-bc02-358cf140d4ce" (UID: "003c0ace-6aef-4bc2-bc02-358cf140d4ce"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:45:58 crc kubenswrapper[4903]: I0320 08:45:58.594225 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/003c0ace-6aef-4bc2-bc02-358cf140d4ce-kube-api-access-djzj6" (OuterVolumeSpecName: "kube-api-access-djzj6") pod "003c0ace-6aef-4bc2-bc02-358cf140d4ce" (UID: "003c0ace-6aef-4bc2-bc02-358cf140d4ce"). InnerVolumeSpecName "kube-api-access-djzj6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:45:58 crc kubenswrapper[4903]: I0320 08:45:58.678921 4903 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/003c0ace-6aef-4bc2-bc02-358cf140d4ce-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 08:45:58 crc kubenswrapper[4903]: I0320 08:45:58.678955 4903 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/003c0ace-6aef-4bc2-bc02-358cf140d4ce-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 08:45:58 crc kubenswrapper[4903]: I0320 08:45:58.678969 4903 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/003c0ace-6aef-4bc2-bc02-358cf140d4ce-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 08:45:58 crc kubenswrapper[4903]: I0320 08:45:58.678981 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-djzj6\" (UniqueName: \"kubernetes.io/projected/003c0ace-6aef-4bc2-bc02-358cf140d4ce-kube-api-access-djzj6\") on node \"crc\" DevicePath \"\"" Mar 20 08:45:58 crc kubenswrapper[4903]: I0320 08:45:58.684994 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/003c0ace-6aef-4bc2-bc02-358cf140d4ce-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "003c0ace-6aef-4bc2-bc02-358cf140d4ce" (UID: "003c0ace-6aef-4bc2-bc02-358cf140d4ce"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:45:58 crc kubenswrapper[4903]: I0320 08:45:58.690480 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/003c0ace-6aef-4bc2-bc02-358cf140d4ce-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "003c0ace-6aef-4bc2-bc02-358cf140d4ce" (UID: "003c0ace-6aef-4bc2-bc02-358cf140d4ce"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:45:58 crc kubenswrapper[4903]: I0320 08:45:58.720248 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/003c0ace-6aef-4bc2-bc02-358cf140d4ce-config-data" (OuterVolumeSpecName: "config-data") pod "003c0ace-6aef-4bc2-bc02-358cf140d4ce" (UID: "003c0ace-6aef-4bc2-bc02-358cf140d4ce"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:45:58 crc kubenswrapper[4903]: I0320 08:45:58.779979 4903 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/003c0ace-6aef-4bc2-bc02-358cf140d4ce-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 08:45:58 crc kubenswrapper[4903]: I0320 08:45:58.780014 4903 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/003c0ace-6aef-4bc2-bc02-358cf140d4ce-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 08:45:58 crc kubenswrapper[4903]: I0320 08:45:58.780024 4903 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/003c0ace-6aef-4bc2-bc02-358cf140d4ce-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 20 08:45:59 crc kubenswrapper[4903]: I0320 08:45:59.320347 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"003c0ace-6aef-4bc2-bc02-358cf140d4ce","Type":"ContainerDied","Data":"657b5af52a5448870f3b62e12d359dc18bbcffdb6f93c897dcf113e70b47bccf"} Mar 20 08:45:59 crc kubenswrapper[4903]: I0320 08:45:59.320614 4903 scope.go:117] "RemoveContainer" containerID="b7ed8ed55adb45a5ac94b9c22cbdffbb71e84ecce0530a064400b927315e05bd" Mar 20 08:45:59 crc kubenswrapper[4903]: I0320 08:45:59.320447 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 08:45:59 crc kubenswrapper[4903]: I0320 08:45:59.325830 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"c84a69c5-84f8-42d3-a383-4f1c5eb17b5e","Type":"ContainerStarted","Data":"a0c47778c1d6928c802f192c1be4632e4a3fd7b53e8bb58217093e5add3d252f"} Mar 20 08:45:59 crc kubenswrapper[4903]: I0320 08:45:59.327187 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-5895dcfdfd-4gs9b" Mar 20 08:45:59 crc kubenswrapper[4903]: I0320 08:45:59.328093 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"482e4efe-5d05-4aa5-b77c-a366193c4b50","Type":"ContainerStarted","Data":"9a147042167b710612f7777620b5e17c2910bcdfd5ffefba62d263d0705b7f75"} Mar 20 08:45:59 crc kubenswrapper[4903]: I0320 08:45:59.335385 4903 generic.go:334] "Generic (PLEG): container finished" podID="6f94af42-d7a9-437c-a74b-5d63fcd63a50" containerID="097edbddde97da45f7e0b48e947c34e3ca361dc532c950791e2d51a9a9cd496d" exitCode=0 Mar 20 08:45:59 crc kubenswrapper[4903]: I0320 08:45:59.335444 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-9g5kv" event={"ID":"6f94af42-d7a9-437c-a74b-5d63fcd63a50","Type":"ContainerDied","Data":"097edbddde97da45f7e0b48e947c34e3ca361dc532c950791e2d51a9a9cd496d"} Mar 20 08:45:59 crc kubenswrapper[4903]: I0320 08:45:59.386444 4903 scope.go:117] "RemoveContainer" containerID="f62844ed56c7e0157412ee30b02d4935401b05855b172e4f150a32074e1b5271" Mar 20 08:45:59 crc kubenswrapper[4903]: I0320 08:45:59.408718 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-5895dcfdfd-4gs9b" Mar 20 08:45:59 crc kubenswrapper[4903]: I0320 08:45:59.465626 4903 scope.go:117] "RemoveContainer" containerID="5f6acda290853c0679341a80d768ac082e84e70b79924dbd01bab6e923434f1e" Mar 20 08:45:59 crc kubenswrapper[4903]: I0320 08:45:59.560874 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 08:45:59 crc kubenswrapper[4903]: I0320 08:45:59.603927 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 20 08:45:59 crc kubenswrapper[4903]: I0320 08:45:59.669402 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 20 08:45:59 crc kubenswrapper[4903]: E0320 08:45:59.682143 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="003c0ace-6aef-4bc2-bc02-358cf140d4ce" containerName="sg-core" Mar 20 08:45:59 crc kubenswrapper[4903]: I0320 08:45:59.682186 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="003c0ace-6aef-4bc2-bc02-358cf140d4ce" containerName="sg-core" Mar 20 08:45:59 crc kubenswrapper[4903]: E0320 08:45:59.682227 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="003c0ace-6aef-4bc2-bc02-358cf140d4ce" containerName="proxy-httpd" Mar 20 08:45:59 crc kubenswrapper[4903]: I0320 08:45:59.682234 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="003c0ace-6aef-4bc2-bc02-358cf140d4ce" containerName="proxy-httpd" Mar 20 08:45:59 crc kubenswrapper[4903]: E0320 08:45:59.682249 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="003c0ace-6aef-4bc2-bc02-358cf140d4ce" containerName="ceilometer-notification-agent" Mar 20 08:45:59 crc kubenswrapper[4903]: I0320 08:45:59.682255 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="003c0ace-6aef-4bc2-bc02-358cf140d4ce" 
containerName="ceilometer-notification-agent" Mar 20 08:45:59 crc kubenswrapper[4903]: I0320 08:45:59.682953 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="003c0ace-6aef-4bc2-bc02-358cf140d4ce" containerName="proxy-httpd" Mar 20 08:45:59 crc kubenswrapper[4903]: I0320 08:45:59.682977 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="003c0ace-6aef-4bc2-bc02-358cf140d4ce" containerName="ceilometer-notification-agent" Mar 20 08:45:59 crc kubenswrapper[4903]: I0320 08:45:59.682993 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="003c0ace-6aef-4bc2-bc02-358cf140d4ce" containerName="sg-core" Mar 20 08:45:59 crc kubenswrapper[4903]: I0320 08:45:59.737760 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 08:45:59 crc kubenswrapper[4903]: I0320 08:45:59.738324 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 08:45:59 crc kubenswrapper[4903]: I0320 08:45:59.741540 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 20 08:45:59 crc kubenswrapper[4903]: I0320 08:45:59.741769 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 20 08:45:59 crc kubenswrapper[4903]: I0320 08:45:59.765859 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-75dfbf8d4b-8vdlx"] Mar 20 08:45:59 crc kubenswrapper[4903]: I0320 08:45:59.766159 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-75dfbf8d4b-8vdlx" podUID="079887bb-d4be-4d23-bf74-3332bfd2f7cb" containerName="barbican-api-log" containerID="cri-o://7353dcb5dc133727414ee6fffb1afdd81aca3a54cbf6761fe3463c12bbf3d7b4" gracePeriod=30 Mar 20 08:45:59 crc kubenswrapper[4903]: I0320 08:45:59.766309 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-75dfbf8d4b-8vdlx" podUID="079887bb-d4be-4d23-bf74-3332bfd2f7cb" containerName="barbican-api" containerID="cri-o://e595fcfa366bef36702fb2a4c3a07c0c0a2c9adab6726b58179ebe11c36c4cc7" gracePeriod=30 Mar 20 08:45:59 crc kubenswrapper[4903]: I0320 08:45:59.796240 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 20 08:45:59 crc kubenswrapper[4903]: I0320 08:45:59.835687 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b603178f-e90b-4e08-ad82-0d15ddc32844-run-httpd\") pod \"ceilometer-0\" (UID: \"b603178f-e90b-4e08-ad82-0d15ddc32844\") " pod="openstack/ceilometer-0" Mar 20 08:45:59 crc kubenswrapper[4903]: I0320 08:45:59.835808 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b603178f-e90b-4e08-ad82-0d15ddc32844-scripts\") pod \"ceilometer-0\" (UID: \"b603178f-e90b-4e08-ad82-0d15ddc32844\") " pod="openstack/ceilometer-0" Mar 20 08:45:59 crc kubenswrapper[4903]: I0320 08:45:59.835864 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g5nm7\" (UniqueName: \"kubernetes.io/projected/b603178f-e90b-4e08-ad82-0d15ddc32844-kube-api-access-g5nm7\") pod \"ceilometer-0\" (UID: \"b603178f-e90b-4e08-ad82-0d15ddc32844\") " pod="openstack/ceilometer-0" Mar 20 08:45:59 crc kubenswrapper[4903]: I0320 08:45:59.835933 4903 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b603178f-e90b-4e08-ad82-0d15ddc32844-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b603178f-e90b-4e08-ad82-0d15ddc32844\") " pod="openstack/ceilometer-0" Mar 20 08:45:59 crc kubenswrapper[4903]: I0320 08:45:59.835992 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b603178f-e90b-4e08-ad82-0d15ddc32844-log-httpd\") pod \"ceilometer-0\" (UID: \"b603178f-e90b-4e08-ad82-0d15ddc32844\") " pod="openstack/ceilometer-0" Mar 20 08:45:59 crc kubenswrapper[4903]: I0320 08:45:59.836014 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b603178f-e90b-4e08-ad82-0d15ddc32844-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b603178f-e90b-4e08-ad82-0d15ddc32844\") " pod="openstack/ceilometer-0" Mar 20 08:45:59 crc kubenswrapper[4903]: I0320 08:45:59.836062 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b603178f-e90b-4e08-ad82-0d15ddc32844-config-data\") pod \"ceilometer-0\" (UID: \"b603178f-e90b-4e08-ad82-0d15ddc32844\") " pod="openstack/ceilometer-0" Mar 20 08:45:59 crc kubenswrapper[4903]: I0320 08:45:59.938382 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g5nm7\" (UniqueName: \"kubernetes.io/projected/b603178f-e90b-4e08-ad82-0d15ddc32844-kube-api-access-g5nm7\") pod \"ceilometer-0\" (UID: \"b603178f-e90b-4e08-ad82-0d15ddc32844\") " pod="openstack/ceilometer-0" Mar 20 08:45:59 crc kubenswrapper[4903]: I0320 08:45:59.938524 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b603178f-e90b-4e08-ad82-0d15ddc32844-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b603178f-e90b-4e08-ad82-0d15ddc32844\") " pod="openstack/ceilometer-0" Mar 20 08:45:59 crc kubenswrapper[4903]: I0320 08:45:59.938630 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b603178f-e90b-4e08-ad82-0d15ddc32844-log-httpd\") pod \"ceilometer-0\" (UID: \"b603178f-e90b-4e08-ad82-0d15ddc32844\") " pod="openstack/ceilometer-0" Mar 20 08:45:59 crc kubenswrapper[4903]: I0320 08:45:59.938663 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b603178f-e90b-4e08-ad82-0d15ddc32844-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b603178f-e90b-4e08-ad82-0d15ddc32844\") " pod="openstack/ceilometer-0" Mar 20 08:45:59 crc kubenswrapper[4903]: I0320 08:45:59.938685 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b603178f-e90b-4e08-ad82-0d15ddc32844-config-data\") pod \"ceilometer-0\" (UID: \"b603178f-e90b-4e08-ad82-0d15ddc32844\") " pod="openstack/ceilometer-0" Mar 20 08:45:59 crc kubenswrapper[4903]: I0320 08:45:59.938755 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b603178f-e90b-4e08-ad82-0d15ddc32844-run-httpd\") pod \"ceilometer-0\" (UID: \"b603178f-e90b-4e08-ad82-0d15ddc32844\") " pod="openstack/ceilometer-0" Mar 20 08:45:59 crc 
kubenswrapper[4903]: I0320 08:45:59.938809 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b603178f-e90b-4e08-ad82-0d15ddc32844-scripts\") pod \"ceilometer-0\" (UID: \"b603178f-e90b-4e08-ad82-0d15ddc32844\") " pod="openstack/ceilometer-0" Mar 20 08:45:59 crc kubenswrapper[4903]: I0320 08:45:59.939302 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b603178f-e90b-4e08-ad82-0d15ddc32844-run-httpd\") pod \"ceilometer-0\" (UID: \"b603178f-e90b-4e08-ad82-0d15ddc32844\") " pod="openstack/ceilometer-0" Mar 20 08:45:59 crc kubenswrapper[4903]: I0320 08:45:59.939304 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b603178f-e90b-4e08-ad82-0d15ddc32844-log-httpd\") pod \"ceilometer-0\" (UID: \"b603178f-e90b-4e08-ad82-0d15ddc32844\") " pod="openstack/ceilometer-0" Mar 20 08:45:59 crc kubenswrapper[4903]: I0320 08:45:59.943164 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b603178f-e90b-4e08-ad82-0d15ddc32844-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b603178f-e90b-4e08-ad82-0d15ddc32844\") " pod="openstack/ceilometer-0" Mar 20 08:45:59 crc kubenswrapper[4903]: I0320 08:45:59.944139 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b603178f-e90b-4e08-ad82-0d15ddc32844-scripts\") pod \"ceilometer-0\" (UID: \"b603178f-e90b-4e08-ad82-0d15ddc32844\") " pod="openstack/ceilometer-0" Mar 20 08:45:59 crc kubenswrapper[4903]: I0320 08:45:59.948586 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b603178f-e90b-4e08-ad82-0d15ddc32844-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b603178f-e90b-4e08-ad82-0d15ddc32844\") " pod="openstack/ceilometer-0" Mar 20 08:45:59 crc kubenswrapper[4903]: I0320 08:45:59.948761 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b603178f-e90b-4e08-ad82-0d15ddc32844-config-data\") pod \"ceilometer-0\" (UID: \"b603178f-e90b-4e08-ad82-0d15ddc32844\") " pod="openstack/ceilometer-0" Mar 20 08:45:59 crc kubenswrapper[4903]: I0320 08:45:59.959962 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g5nm7\" (UniqueName: \"kubernetes.io/projected/b603178f-e90b-4e08-ad82-0d15ddc32844-kube-api-access-g5nm7\") pod \"ceilometer-0\" (UID: \"b603178f-e90b-4e08-ad82-0d15ddc32844\") " pod="openstack/ceilometer-0" Mar 20 08:46:00 crc kubenswrapper[4903]: I0320 08:46:00.085722 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 08:46:00 crc kubenswrapper[4903]: I0320 08:46:00.132147 4903 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-785d8bcb8c-nl9ms" podUID="b95e8341-9f65-461f-891b-ec6512be57f7" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.146:5353: i/o timeout" Mar 20 08:46:00 crc kubenswrapper[4903]: I0320 08:46:00.169681 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566606-6n2pc"] Mar 20 08:46:00 crc kubenswrapper[4903]: I0320 08:46:00.171334 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566606-6n2pc" Mar 20 08:46:00 crc kubenswrapper[4903]: I0320 08:46:00.180492 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 08:46:00 crc kubenswrapper[4903]: I0320 08:46:00.180721 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-smc2q" Mar 20 08:46:00 crc kubenswrapper[4903]: I0320 08:46:00.187276 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 08:46:00 crc kubenswrapper[4903]: I0320 08:46:00.197353 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566606-6n2pc"] Mar 20 08:46:00 crc kubenswrapper[4903]: I0320 08:46:00.250016 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rvnpm\" (UniqueName: \"kubernetes.io/projected/70a601b8-5c2b-4b68-a9b1-d1434eab6965-kube-api-access-rvnpm\") pod \"auto-csr-approver-29566606-6n2pc\" (UID: \"70a601b8-5c2b-4b68-a9b1-d1434eab6965\") " pod="openshift-infra/auto-csr-approver-29566606-6n2pc" Mar 20 08:46:00 crc kubenswrapper[4903]: I0320 08:46:00.353246 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rvnpm\" (UniqueName: \"kubernetes.io/projected/70a601b8-5c2b-4b68-a9b1-d1434eab6965-kube-api-access-rvnpm\") pod \"auto-csr-approver-29566606-6n2pc\" (UID: \"70a601b8-5c2b-4b68-a9b1-d1434eab6965\") " pod="openshift-infra/auto-csr-approver-29566606-6n2pc" Mar 20 08:46:00 crc kubenswrapper[4903]: I0320 08:46:00.367543 4903 generic.go:334] "Generic (PLEG): container finished" podID="079887bb-d4be-4d23-bf74-3332bfd2f7cb" containerID="7353dcb5dc133727414ee6fffb1afdd81aca3a54cbf6761fe3463c12bbf3d7b4" exitCode=143 Mar 20 08:46:00 crc kubenswrapper[4903]: I0320 08:46:00.367625 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-75dfbf8d4b-8vdlx" event={"ID":"079887bb-d4be-4d23-bf74-3332bfd2f7cb","Type":"ContainerDied","Data":"7353dcb5dc133727414ee6fffb1afdd81aca3a54cbf6761fe3463c12bbf3d7b4"} Mar 20 08:46:00 crc kubenswrapper[4903]: I0320 08:46:00.381659 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rvnpm\" (UniqueName: \"kubernetes.io/projected/70a601b8-5c2b-4b68-a9b1-d1434eab6965-kube-api-access-rvnpm\") pod \"auto-csr-approver-29566606-6n2pc\" (UID: \"70a601b8-5c2b-4b68-a9b1-d1434eab6965\") " pod="openshift-infra/auto-csr-approver-29566606-6n2pc" Mar 20 08:46:00 crc kubenswrapper[4903]: I0320 08:46:00.384707 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="482e4efe-5d05-4aa5-b77c-a366193c4b50" containerName="cinder-api-log" containerID="cri-o://9a147042167b710612f7777620b5e17c2910bcdfd5ffefba62d263d0705b7f75" gracePeriod=30 Mar 20 08:46:00 crc kubenswrapper[4903]: I0320 08:46:00.384666 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"482e4efe-5d05-4aa5-b77c-a366193c4b50","Type":"ContainerStarted","Data":"ced8b8431173df6572e1f0f77c42ab6aacc77fdafc1ce2371b657c8d01276701"} Mar 20 08:46:00 crc kubenswrapper[4903]: I0320 08:46:00.385263 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Mar 20 08:46:00 crc kubenswrapper[4903]: I0320 08:46:00.389188 4903 kuberuntime_container.go:808] "Killing container with a grace 
period" pod="openstack/cinder-api-0" podUID="482e4efe-5d05-4aa5-b77c-a366193c4b50" containerName="cinder-api" containerID="cri-o://ced8b8431173df6572e1f0f77c42ab6aacc77fdafc1ce2371b657c8d01276701" gracePeriod=30 Mar 20 08:46:00 crc kubenswrapper[4903]: I0320 08:46:00.407054 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-9g5kv" event={"ID":"6f94af42-d7a9-437c-a74b-5d63fcd63a50","Type":"ContainerStarted","Data":"4a1e2756c85ed1d55a1644e0e4b58a25a84cb2a386791c2b75b8030cee67d6ac"} Mar 20 08:46:00 crc kubenswrapper[4903]: I0320 08:46:00.416488 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5c9776ccc5-9g5kv" Mar 20 08:46:00 crc kubenswrapper[4903]: I0320 08:46:00.430605 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=4.4305708280000005 podStartE2EDuration="4.430570828s" podCreationTimestamp="2026-03-20 08:45:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:46:00.421585859 +0000 UTC m=+1385.638486174" watchObservedRunningTime="2026-03-20 08:46:00.430570828 +0000 UTC m=+1385.647471153" Mar 20 08:46:00 crc kubenswrapper[4903]: I0320 08:46:00.449991 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5c9776ccc5-9g5kv" podStartSLOduration=4.4499702 podStartE2EDuration="4.4499702s" podCreationTimestamp="2026-03-20 08:45:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:46:00.447331838 +0000 UTC m=+1385.664232153" watchObservedRunningTime="2026-03-20 08:46:00.4499702 +0000 UTC m=+1385.666870515" Mar 20 08:46:00 crc kubenswrapper[4903]: I0320 08:46:00.499327 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566606-6n2pc" Mar 20 08:46:00 crc kubenswrapper[4903]: I0320 08:46:00.751826 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 08:46:01 crc kubenswrapper[4903]: I0320 08:46:01.150589 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566606-6n2pc"] Mar 20 08:46:01 crc kubenswrapper[4903]: I0320 08:46:01.190861 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 20 08:46:01 crc kubenswrapper[4903]: I0320 08:46:01.282475 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q4w8p\" (UniqueName: \"kubernetes.io/projected/482e4efe-5d05-4aa5-b77c-a366193c4b50-kube-api-access-q4w8p\") pod \"482e4efe-5d05-4aa5-b77c-a366193c4b50\" (UID: \"482e4efe-5d05-4aa5-b77c-a366193c4b50\") " Mar 20 08:46:01 crc kubenswrapper[4903]: I0320 08:46:01.282880 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/482e4efe-5d05-4aa5-b77c-a366193c4b50-scripts\") pod \"482e4efe-5d05-4aa5-b77c-a366193c4b50\" (UID: \"482e4efe-5d05-4aa5-b77c-a366193c4b50\") " Mar 20 08:46:01 crc kubenswrapper[4903]: I0320 08:46:01.282924 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/482e4efe-5d05-4aa5-b77c-a366193c4b50-logs\") pod \"482e4efe-5d05-4aa5-b77c-a366193c4b50\" (UID: \"482e4efe-5d05-4aa5-b77c-a366193c4b50\") " Mar 20 08:46:01 crc kubenswrapper[4903]: I0320 08:46:01.282976 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/482e4efe-5d05-4aa5-b77c-a366193c4b50-config-data\") pod \"482e4efe-5d05-4aa5-b77c-a366193c4b50\" (UID: \"482e4efe-5d05-4aa5-b77c-a366193c4b50\") " Mar 20 08:46:01 crc kubenswrapper[4903]: I0320 08:46:01.283086 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/482e4efe-5d05-4aa5-b77c-a366193c4b50-combined-ca-bundle\") pod \"482e4efe-5d05-4aa5-b77c-a366193c4b50\" (UID: \"482e4efe-5d05-4aa5-b77c-a366193c4b50\") " Mar 20 08:46:01 crc kubenswrapper[4903]: I0320 08:46:01.283159 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/482e4efe-5d05-4aa5-b77c-a366193c4b50-config-data-custom\") pod \"482e4efe-5d05-4aa5-b77c-a366193c4b50\" (UID: \"482e4efe-5d05-4aa5-b77c-a366193c4b50\") " Mar 20 08:46:01 crc kubenswrapper[4903]: I0320 08:46:01.283227 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/482e4efe-5d05-4aa5-b77c-a366193c4b50-etc-machine-id\") pod \"482e4efe-5d05-4aa5-b77c-a366193c4b50\" (UID: \"482e4efe-5d05-4aa5-b77c-a366193c4b50\") " Mar 20 08:46:01 crc kubenswrapper[4903]: I0320 08:46:01.283791 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/482e4efe-5d05-4aa5-b77c-a366193c4b50-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "482e4efe-5d05-4aa5-b77c-a366193c4b50" (UID: "482e4efe-5d05-4aa5-b77c-a366193c4b50"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 08:46:01 crc kubenswrapper[4903]: I0320 08:46:01.284185 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/482e4efe-5d05-4aa5-b77c-a366193c4b50-logs" (OuterVolumeSpecName: "logs") pod "482e4efe-5d05-4aa5-b77c-a366193c4b50" (UID: "482e4efe-5d05-4aa5-b77c-a366193c4b50"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 08:46:01 crc kubenswrapper[4903]: I0320 08:46:01.288931 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/482e4efe-5d05-4aa5-b77c-a366193c4b50-kube-api-access-q4w8p" (OuterVolumeSpecName: "kube-api-access-q4w8p") pod "482e4efe-5d05-4aa5-b77c-a366193c4b50" (UID: "482e4efe-5d05-4aa5-b77c-a366193c4b50"). InnerVolumeSpecName "kube-api-access-q4w8p". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:46:01 crc kubenswrapper[4903]: I0320 08:46:01.293202 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/482e4efe-5d05-4aa5-b77c-a366193c4b50-scripts" (OuterVolumeSpecName: "scripts") pod "482e4efe-5d05-4aa5-b77c-a366193c4b50" (UID: "482e4efe-5d05-4aa5-b77c-a366193c4b50"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:46:01 crc kubenswrapper[4903]: I0320 08:46:01.298129 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/482e4efe-5d05-4aa5-b77c-a366193c4b50-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "482e4efe-5d05-4aa5-b77c-a366193c4b50" (UID: "482e4efe-5d05-4aa5-b77c-a366193c4b50"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:46:01 crc kubenswrapper[4903]: I0320 08:46:01.327166 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/482e4efe-5d05-4aa5-b77c-a366193c4b50-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "482e4efe-5d05-4aa5-b77c-a366193c4b50" (UID: "482e4efe-5d05-4aa5-b77c-a366193c4b50"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:46:01 crc kubenswrapper[4903]: I0320 08:46:01.373172 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/482e4efe-5d05-4aa5-b77c-a366193c4b50-config-data" (OuterVolumeSpecName: "config-data") pod "482e4efe-5d05-4aa5-b77c-a366193c4b50" (UID: "482e4efe-5d05-4aa5-b77c-a366193c4b50"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:46:01 crc kubenswrapper[4903]: I0320 08:46:01.387194 4903 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/482e4efe-5d05-4aa5-b77c-a366193c4b50-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 20 08:46:01 crc kubenswrapper[4903]: I0320 08:46:01.387246 4903 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/482e4efe-5d05-4aa5-b77c-a366193c4b50-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 20 08:46:01 crc kubenswrapper[4903]: I0320 08:46:01.387258 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q4w8p\" (UniqueName: \"kubernetes.io/projected/482e4efe-5d05-4aa5-b77c-a366193c4b50-kube-api-access-q4w8p\") on node \"crc\" DevicePath \"\"" Mar 20 08:46:01 crc kubenswrapper[4903]: I0320 08:46:01.387273 4903 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/482e4efe-5d05-4aa5-b77c-a366193c4b50-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 08:46:01 crc kubenswrapper[4903]: I0320 08:46:01.387287 4903 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/482e4efe-5d05-4aa5-b77c-a366193c4b50-logs\") on node \"crc\" DevicePath \"\"" Mar 20 08:46:01 crc kubenswrapper[4903]: I0320 08:46:01.387295 4903 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/482e4efe-5d05-4aa5-b77c-a366193c4b50-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 08:46:01 crc kubenswrapper[4903]: I0320 08:46:01.387306 4903 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/482e4efe-5d05-4aa5-b77c-a366193c4b50-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 08:46:01 crc kubenswrapper[4903]: I0320 08:46:01.422624 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b603178f-e90b-4e08-ad82-0d15ddc32844","Type":"ContainerStarted","Data":"b61defaaf30760bc9cc66413aab61f92d86cba994535ce412604ccc05c9c6424"} Mar 20 08:46:01 crc kubenswrapper[4903]: I0320 08:46:01.425604 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"c84a69c5-84f8-42d3-a383-4f1c5eb17b5e","Type":"ContainerStarted","Data":"03ad1247b852e4e4690a886d8f35bed0d3d3270db41df1c55347d0c9b234a7fd"} Mar 20 08:46:01 crc kubenswrapper[4903]: I0320 08:46:01.428061 4903 generic.go:334] "Generic (PLEG): container finished" podID="482e4efe-5d05-4aa5-b77c-a366193c4b50" containerID="ced8b8431173df6572e1f0f77c42ab6aacc77fdafc1ce2371b657c8d01276701" exitCode=0 Mar 20 08:46:01 crc kubenswrapper[4903]: I0320 08:46:01.428092 4903 generic.go:334] "Generic (PLEG): container finished" podID="482e4efe-5d05-4aa5-b77c-a366193c4b50" containerID="9a147042167b710612f7777620b5e17c2910bcdfd5ffefba62d263d0705b7f75" exitCode=143 Mar 20 08:46:01 crc kubenswrapper[4903]: I0320 08:46:01.428132 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"482e4efe-5d05-4aa5-b77c-a366193c4b50","Type":"ContainerDied","Data":"ced8b8431173df6572e1f0f77c42ab6aacc77fdafc1ce2371b657c8d01276701"} Mar 20 08:46:01 crc kubenswrapper[4903]: I0320 08:46:01.428155 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" 
event={"ID":"482e4efe-5d05-4aa5-b77c-a366193c4b50","Type":"ContainerDied","Data":"9a147042167b710612f7777620b5e17c2910bcdfd5ffefba62d263d0705b7f75"} Mar 20 08:46:01 crc kubenswrapper[4903]: I0320 08:46:01.428168 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"482e4efe-5d05-4aa5-b77c-a366193c4b50","Type":"ContainerDied","Data":"2a71cfad819a9664900d508ed1b8a40864520b0ab9dbe7da37ac890a3b97a6c6"} Mar 20 08:46:01 crc kubenswrapper[4903]: I0320 08:46:01.428190 4903 scope.go:117] "RemoveContainer" containerID="ced8b8431173df6572e1f0f77c42ab6aacc77fdafc1ce2371b657c8d01276701" Mar 20 08:46:01 crc kubenswrapper[4903]: I0320 08:46:01.428395 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 20 08:46:01 crc kubenswrapper[4903]: I0320 08:46:01.433958 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v6rfn" event={"ID":"d46db123-b34a-40fe-b42d-846f9788745b","Type":"ContainerStarted","Data":"bb3312fa476cc84d99e27b5551633a1e7331928c98574cfc7407ff05ac2436ae"} Mar 20 08:46:01 crc kubenswrapper[4903]: I0320 08:46:01.436853 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566606-6n2pc" event={"ID":"70a601b8-5c2b-4b68-a9b1-d1434eab6965","Type":"ContainerStarted","Data":"80cc914951c73153c76b2048acdd1cf15c6ab8c3747b9acc00f0046794552a86"} Mar 20 08:46:01 crc kubenswrapper[4903]: I0320 08:46:01.503180 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="003c0ace-6aef-4bc2-bc02-358cf140d4ce" path="/var/lib/kubelet/pods/003c0ace-6aef-4bc2-bc02-358cf140d4ce/volumes" Mar 20 08:46:01 crc kubenswrapper[4903]: I0320 08:46:01.562705 4903 scope.go:117] "RemoveContainer" containerID="9a147042167b710612f7777620b5e17c2910bcdfd5ffefba62d263d0705b7f75" Mar 20 08:46:01 crc kubenswrapper[4903]: I0320 08:46:01.592207 4903 scope.go:117] "RemoveContainer" containerID="ced8b8431173df6572e1f0f77c42ab6aacc77fdafc1ce2371b657c8d01276701" Mar 20 08:46:01 crc kubenswrapper[4903]: E0320 08:46:01.592690 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ced8b8431173df6572e1f0f77c42ab6aacc77fdafc1ce2371b657c8d01276701\": container with ID starting with ced8b8431173df6572e1f0f77c42ab6aacc77fdafc1ce2371b657c8d01276701 not found: ID does not exist" containerID="ced8b8431173df6572e1f0f77c42ab6aacc77fdafc1ce2371b657c8d01276701" Mar 20 08:46:01 crc kubenswrapper[4903]: I0320 08:46:01.592738 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ced8b8431173df6572e1f0f77c42ab6aacc77fdafc1ce2371b657c8d01276701"} err="failed to get container status \"ced8b8431173df6572e1f0f77c42ab6aacc77fdafc1ce2371b657c8d01276701\": rpc error: code = NotFound desc = could not find container \"ced8b8431173df6572e1f0f77c42ab6aacc77fdafc1ce2371b657c8d01276701\": container with ID starting with ced8b8431173df6572e1f0f77c42ab6aacc77fdafc1ce2371b657c8d01276701 not found: ID does not exist" Mar 20 08:46:01 crc kubenswrapper[4903]: I0320 08:46:01.592767 4903 scope.go:117] "RemoveContainer" containerID="9a147042167b710612f7777620b5e17c2910bcdfd5ffefba62d263d0705b7f75" Mar 20 08:46:01 crc kubenswrapper[4903]: E0320 08:46:01.593052 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9a147042167b710612f7777620b5e17c2910bcdfd5ffefba62d263d0705b7f75\": 
container with ID starting with 9a147042167b710612f7777620b5e17c2910bcdfd5ffefba62d263d0705b7f75 not found: ID does not exist" containerID="9a147042167b710612f7777620b5e17c2910bcdfd5ffefba62d263d0705b7f75" Mar 20 08:46:01 crc kubenswrapper[4903]: I0320 08:46:01.593085 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a147042167b710612f7777620b5e17c2910bcdfd5ffefba62d263d0705b7f75"} err="failed to get container status \"9a147042167b710612f7777620b5e17c2910bcdfd5ffefba62d263d0705b7f75\": rpc error: code = NotFound desc = could not find container \"9a147042167b710612f7777620b5e17c2910bcdfd5ffefba62d263d0705b7f75\": container with ID starting with 9a147042167b710612f7777620b5e17c2910bcdfd5ffefba62d263d0705b7f75 not found: ID does not exist" Mar 20 08:46:01 crc kubenswrapper[4903]: I0320 08:46:01.593108 4903 scope.go:117] "RemoveContainer" containerID="ced8b8431173df6572e1f0f77c42ab6aacc77fdafc1ce2371b657c8d01276701" Mar 20 08:46:01 crc kubenswrapper[4903]: I0320 08:46:01.593477 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ced8b8431173df6572e1f0f77c42ab6aacc77fdafc1ce2371b657c8d01276701"} err="failed to get container status \"ced8b8431173df6572e1f0f77c42ab6aacc77fdafc1ce2371b657c8d01276701\": rpc error: code = NotFound desc = could not find container \"ced8b8431173df6572e1f0f77c42ab6aacc77fdafc1ce2371b657c8d01276701\": container with ID starting with ced8b8431173df6572e1f0f77c42ab6aacc77fdafc1ce2371b657c8d01276701 not found: ID does not exist" Mar 20 08:46:01 crc kubenswrapper[4903]: I0320 08:46:01.593533 4903 scope.go:117] "RemoveContainer" containerID="9a147042167b710612f7777620b5e17c2910bcdfd5ffefba62d263d0705b7f75" Mar 20 08:46:01 crc kubenswrapper[4903]: I0320 08:46:01.594084 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a147042167b710612f7777620b5e17c2910bcdfd5ffefba62d263d0705b7f75"} err="failed to get container status \"9a147042167b710612f7777620b5e17c2910bcdfd5ffefba62d263d0705b7f75\": rpc error: code = NotFound desc = could not find container \"9a147042167b710612f7777620b5e17c2910bcdfd5ffefba62d263d0705b7f75\": container with ID starting with 9a147042167b710612f7777620b5e17c2910bcdfd5ffefba62d263d0705b7f75 not found: ID does not exist" Mar 20 08:46:02 crc kubenswrapper[4903]: I0320 08:46:02.453934 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b603178f-e90b-4e08-ad82-0d15ddc32844","Type":"ContainerStarted","Data":"dc3f13562bfffe676ba3cb75ad7d3e2e26c1a2b70e23596ae455c73f5af5d8bf"} Mar 20 08:46:02 crc kubenswrapper[4903]: I0320 08:46:02.456496 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"c84a69c5-84f8-42d3-a383-4f1c5eb17b5e","Type":"ContainerStarted","Data":"933978e2906da038007c8ad14bd46d4be5b953876f55ae360bb56b911f566bc7"} Mar 20 08:46:02 crc kubenswrapper[4903]: I0320 08:46:02.463722 4903 generic.go:334] "Generic (PLEG): container finished" podID="d46db123-b34a-40fe-b42d-846f9788745b" containerID="bb3312fa476cc84d99e27b5551633a1e7331928c98574cfc7407ff05ac2436ae" exitCode=0 Mar 20 08:46:02 crc kubenswrapper[4903]: I0320 08:46:02.463974 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v6rfn" event={"ID":"d46db123-b34a-40fe-b42d-846f9788745b","Type":"ContainerDied","Data":"bb3312fa476cc84d99e27b5551633a1e7331928c98574cfc7407ff05ac2436ae"} Mar 20 08:46:02 crc 
kubenswrapper[4903]: I0320 08:46:02.483652 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=4.858573998 podStartE2EDuration="6.483624606s" podCreationTimestamp="2026-03-20 08:45:56 +0000 UTC" firstStartedPulling="2026-03-20 08:45:58.381444461 +0000 UTC m=+1383.598344776" lastFinishedPulling="2026-03-20 08:46:00.006495069 +0000 UTC m=+1385.223395384" observedRunningTime="2026-03-20 08:46:02.481492157 +0000 UTC m=+1387.698392482" watchObservedRunningTime="2026-03-20 08:46:02.483624606 +0000 UTC m=+1387.700524931" Mar 20 08:46:03 crc kubenswrapper[4903]: I0320 08:46:03.451479 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-75dfbf8d4b-8vdlx" Mar 20 08:46:03 crc kubenswrapper[4903]: I0320 08:46:03.477729 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b603178f-e90b-4e08-ad82-0d15ddc32844","Type":"ContainerStarted","Data":"23de93f09c8f56d1d5ad7e078619ee8d784563bdc9c71699161a829e7408c456"} Mar 20 08:46:03 crc kubenswrapper[4903]: I0320 08:46:03.477779 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b603178f-e90b-4e08-ad82-0d15ddc32844","Type":"ContainerStarted","Data":"b14c8d93b7883541bc880c20a4c3da5ee1c79e090d432fcaa5c61c14083dee3d"} Mar 20 08:46:03 crc kubenswrapper[4903]: I0320 08:46:03.483770 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v6rfn" event={"ID":"d46db123-b34a-40fe-b42d-846f9788745b","Type":"ContainerStarted","Data":"ae46b4580a36885cdcf1dc77ba729ae54e06d4d09e730e14c8aa313785aeb4d1"} Mar 20 08:46:03 crc kubenswrapper[4903]: I0320 08:46:03.486014 4903 generic.go:334] "Generic (PLEG): container finished" podID="079887bb-d4be-4d23-bf74-3332bfd2f7cb" containerID="e595fcfa366bef36702fb2a4c3a07c0c0a2c9adab6726b58179ebe11c36c4cc7" exitCode=0 Mar 20 08:46:03 crc kubenswrapper[4903]: I0320 08:46:03.486129 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-75dfbf8d4b-8vdlx" event={"ID":"079887bb-d4be-4d23-bf74-3332bfd2f7cb","Type":"ContainerDied","Data":"e595fcfa366bef36702fb2a4c3a07c0c0a2c9adab6726b58179ebe11c36c4cc7"} Mar 20 08:46:03 crc kubenswrapper[4903]: I0320 08:46:03.486157 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-75dfbf8d4b-8vdlx" event={"ID":"079887bb-d4be-4d23-bf74-3332bfd2f7cb","Type":"ContainerDied","Data":"bc9fc1a199781868d3417d58ed6a80f04d5ca374cb5aae24187974996eb51bd1"} Mar 20 08:46:03 crc kubenswrapper[4903]: I0320 08:46:03.486175 4903 scope.go:117] "RemoveContainer" containerID="e595fcfa366bef36702fb2a4c3a07c0c0a2c9adab6726b58179ebe11c36c4cc7" Mar 20 08:46:03 crc kubenswrapper[4903]: I0320 08:46:03.486317 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-75dfbf8d4b-8vdlx" Mar 20 08:46:03 crc kubenswrapper[4903]: I0320 08:46:03.509108 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-v6rfn" podStartSLOduration=3.830111512 podStartE2EDuration="8.5090194s" podCreationTimestamp="2026-03-20 08:45:55 +0000 UTC" firstStartedPulling="2026-03-20 08:45:58.28731267 +0000 UTC m=+1383.504212985" lastFinishedPulling="2026-03-20 08:46:02.966220558 +0000 UTC m=+1388.183120873" observedRunningTime="2026-03-20 08:46:03.506099342 +0000 UTC m=+1388.722999677" watchObservedRunningTime="2026-03-20 08:46:03.5090194 +0000 UTC m=+1388.725919715" Mar 20 08:46:03 crc kubenswrapper[4903]: I0320 08:46:03.511500 4903 generic.go:334] "Generic (PLEG): container finished" podID="70a601b8-5c2b-4b68-a9b1-d1434eab6965" containerID="228bbc6c0201c5b8f848a584ece52afe6a644153fe624d7108fce309a34c9341" exitCode=0 Mar 20 08:46:03 crc kubenswrapper[4903]: I0320 08:46:03.548453 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566606-6n2pc" event={"ID":"70a601b8-5c2b-4b68-a9b1-d1434eab6965","Type":"ContainerDied","Data":"228bbc6c0201c5b8f848a584ece52afe6a644153fe624d7108fce309a34c9341"} Mar 20 08:46:03 crc kubenswrapper[4903]: I0320 08:46:03.551909 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nfbt2\" (UniqueName: \"kubernetes.io/projected/079887bb-d4be-4d23-bf74-3332bfd2f7cb-kube-api-access-nfbt2\") pod \"079887bb-d4be-4d23-bf74-3332bfd2f7cb\" (UID: \"079887bb-d4be-4d23-bf74-3332bfd2f7cb\") " Mar 20 08:46:03 crc kubenswrapper[4903]: I0320 08:46:03.552027 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/079887bb-d4be-4d23-bf74-3332bfd2f7cb-combined-ca-bundle\") pod \"079887bb-d4be-4d23-bf74-3332bfd2f7cb\" (UID: \"079887bb-d4be-4d23-bf74-3332bfd2f7cb\") " Mar 20 08:46:03 crc kubenswrapper[4903]: I0320 08:46:03.552118 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/079887bb-d4be-4d23-bf74-3332bfd2f7cb-config-data\") pod \"079887bb-d4be-4d23-bf74-3332bfd2f7cb\" (UID: \"079887bb-d4be-4d23-bf74-3332bfd2f7cb\") " Mar 20 08:46:03 crc kubenswrapper[4903]: I0320 08:46:03.552156 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/079887bb-d4be-4d23-bf74-3332bfd2f7cb-config-data-custom\") pod \"079887bb-d4be-4d23-bf74-3332bfd2f7cb\" (UID: \"079887bb-d4be-4d23-bf74-3332bfd2f7cb\") " Mar 20 08:46:03 crc kubenswrapper[4903]: I0320 08:46:03.552195 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/079887bb-d4be-4d23-bf74-3332bfd2f7cb-logs\") pod \"079887bb-d4be-4d23-bf74-3332bfd2f7cb\" (UID: \"079887bb-d4be-4d23-bf74-3332bfd2f7cb\") " Mar 20 08:46:03 crc kubenswrapper[4903]: I0320 08:46:03.552907 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/079887bb-d4be-4d23-bf74-3332bfd2f7cb-logs" (OuterVolumeSpecName: "logs") pod "079887bb-d4be-4d23-bf74-3332bfd2f7cb" (UID: "079887bb-d4be-4d23-bf74-3332bfd2f7cb"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 08:46:03 crc kubenswrapper[4903]: I0320 08:46:03.556642 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/079887bb-d4be-4d23-bf74-3332bfd2f7cb-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "079887bb-d4be-4d23-bf74-3332bfd2f7cb" (UID: "079887bb-d4be-4d23-bf74-3332bfd2f7cb"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:46:03 crc kubenswrapper[4903]: I0320 08:46:03.557591 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/079887bb-d4be-4d23-bf74-3332bfd2f7cb-kube-api-access-nfbt2" (OuterVolumeSpecName: "kube-api-access-nfbt2") pod "079887bb-d4be-4d23-bf74-3332bfd2f7cb" (UID: "079887bb-d4be-4d23-bf74-3332bfd2f7cb"). InnerVolumeSpecName "kube-api-access-nfbt2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:46:03 crc kubenswrapper[4903]: I0320 08:46:03.567389 4903 scope.go:117] "RemoveContainer" containerID="7353dcb5dc133727414ee6fffb1afdd81aca3a54cbf6761fe3463c12bbf3d7b4" Mar 20 08:46:03 crc kubenswrapper[4903]: I0320 08:46:03.582902 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/079887bb-d4be-4d23-bf74-3332bfd2f7cb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "079887bb-d4be-4d23-bf74-3332bfd2f7cb" (UID: "079887bb-d4be-4d23-bf74-3332bfd2f7cb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:46:03 crc kubenswrapper[4903]: I0320 08:46:03.604170 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/079887bb-d4be-4d23-bf74-3332bfd2f7cb-config-data" (OuterVolumeSpecName: "config-data") pod "079887bb-d4be-4d23-bf74-3332bfd2f7cb" (UID: "079887bb-d4be-4d23-bf74-3332bfd2f7cb"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:46:03 crc kubenswrapper[4903]: I0320 08:46:03.654760 4903 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/079887bb-d4be-4d23-bf74-3332bfd2f7cb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 08:46:03 crc kubenswrapper[4903]: I0320 08:46:03.655895 4903 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/079887bb-d4be-4d23-bf74-3332bfd2f7cb-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 08:46:03 crc kubenswrapper[4903]: I0320 08:46:03.655985 4903 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/079887bb-d4be-4d23-bf74-3332bfd2f7cb-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 20 08:46:03 crc kubenswrapper[4903]: I0320 08:46:03.656081 4903 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/079887bb-d4be-4d23-bf74-3332bfd2f7cb-logs\") on node \"crc\" DevicePath \"\"" Mar 20 08:46:03 crc kubenswrapper[4903]: I0320 08:46:03.656175 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nfbt2\" (UniqueName: \"kubernetes.io/projected/079887bb-d4be-4d23-bf74-3332bfd2f7cb-kube-api-access-nfbt2\") on node \"crc\" DevicePath \"\"" Mar 20 08:46:03 crc kubenswrapper[4903]: I0320 08:46:03.690087 4903 scope.go:117] "RemoveContainer" containerID="e595fcfa366bef36702fb2a4c3a07c0c0a2c9adab6726b58179ebe11c36c4cc7" Mar 20 08:46:03 crc kubenswrapper[4903]: E0320 08:46:03.690690 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e595fcfa366bef36702fb2a4c3a07c0c0a2c9adab6726b58179ebe11c36c4cc7\": container with ID starting with e595fcfa366bef36702fb2a4c3a07c0c0a2c9adab6726b58179ebe11c36c4cc7 not found: ID does not exist" containerID="e595fcfa366bef36702fb2a4c3a07c0c0a2c9adab6726b58179ebe11c36c4cc7" Mar 20 08:46:03 crc kubenswrapper[4903]: I0320 08:46:03.690749 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e595fcfa366bef36702fb2a4c3a07c0c0a2c9adab6726b58179ebe11c36c4cc7"} err="failed to get container status \"e595fcfa366bef36702fb2a4c3a07c0c0a2c9adab6726b58179ebe11c36c4cc7\": rpc error: code = NotFound desc = could not find container \"e595fcfa366bef36702fb2a4c3a07c0c0a2c9adab6726b58179ebe11c36c4cc7\": container with ID starting with e595fcfa366bef36702fb2a4c3a07c0c0a2c9adab6726b58179ebe11c36c4cc7 not found: ID does not exist" Mar 20 08:46:03 crc kubenswrapper[4903]: I0320 08:46:03.690787 4903 scope.go:117] "RemoveContainer" containerID="7353dcb5dc133727414ee6fffb1afdd81aca3a54cbf6761fe3463c12bbf3d7b4" Mar 20 08:46:03 crc kubenswrapper[4903]: E0320 08:46:03.691161 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7353dcb5dc133727414ee6fffb1afdd81aca3a54cbf6761fe3463c12bbf3d7b4\": container with ID starting with 7353dcb5dc133727414ee6fffb1afdd81aca3a54cbf6761fe3463c12bbf3d7b4 not found: ID does not exist" containerID="7353dcb5dc133727414ee6fffb1afdd81aca3a54cbf6761fe3463c12bbf3d7b4" Mar 20 08:46:03 crc kubenswrapper[4903]: I0320 08:46:03.691271 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7353dcb5dc133727414ee6fffb1afdd81aca3a54cbf6761fe3463c12bbf3d7b4"} err="failed to get container status 
\"7353dcb5dc133727414ee6fffb1afdd81aca3a54cbf6761fe3463c12bbf3d7b4\": rpc error: code = NotFound desc = could not find container \"7353dcb5dc133727414ee6fffb1afdd81aca3a54cbf6761fe3463c12bbf3d7b4\": container with ID starting with 7353dcb5dc133727414ee6fffb1afdd81aca3a54cbf6761fe3463c12bbf3d7b4 not found: ID does not exist" Mar 20 08:46:03 crc kubenswrapper[4903]: I0320 08:46:03.836405 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-75dfbf8d4b-8vdlx"] Mar 20 08:46:03 crc kubenswrapper[4903]: I0320 08:46:03.857950 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-75dfbf8d4b-8vdlx"] Mar 20 08:46:04 crc kubenswrapper[4903]: I0320 08:46:04.570207 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-7bdb6dfbd4-xpx45" Mar 20 08:46:04 crc kubenswrapper[4903]: I0320 08:46:04.872463 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-7b9df974b5-rb8w6"] Mar 20 08:46:04 crc kubenswrapper[4903]: I0320 08:46:04.872946 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-7b9df974b5-rb8w6" podUID="3e64d519-16d2-48d3-8683-9da61bd19e2d" containerName="neutron-api" containerID="cri-o://637459508e1cd8681366e71e955fbcb1a32426e408ddaa26278208437c4f5836" gracePeriod=30 Mar 20 08:46:04 crc kubenswrapper[4903]: I0320 08:46:04.873133 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-7b9df974b5-rb8w6" podUID="3e64d519-16d2-48d3-8683-9da61bd19e2d" containerName="neutron-httpd" containerID="cri-o://a80ef4791d38f5c70e3b66329fee2898f3f9305859e05349a8ed7e0e8454b362" gracePeriod=30 Mar 20 08:46:04 crc kubenswrapper[4903]: I0320 08:46:04.914103 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-fbdb5-dn8w8"] Mar 20 08:46:04 crc kubenswrapper[4903]: E0320 08:46:04.914574 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="079887bb-d4be-4d23-bf74-3332bfd2f7cb" containerName="barbican-api" Mar 20 08:46:04 crc kubenswrapper[4903]: I0320 08:46:04.914598 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="079887bb-d4be-4d23-bf74-3332bfd2f7cb" containerName="barbican-api" Mar 20 08:46:04 crc kubenswrapper[4903]: E0320 08:46:04.914636 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="482e4efe-5d05-4aa5-b77c-a366193c4b50" containerName="cinder-api" Mar 20 08:46:04 crc kubenswrapper[4903]: I0320 08:46:04.914644 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="482e4efe-5d05-4aa5-b77c-a366193c4b50" containerName="cinder-api" Mar 20 08:46:04 crc kubenswrapper[4903]: E0320 08:46:04.914659 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="079887bb-d4be-4d23-bf74-3332bfd2f7cb" containerName="barbican-api-log" Mar 20 08:46:04 crc kubenswrapper[4903]: I0320 08:46:04.914665 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="079887bb-d4be-4d23-bf74-3332bfd2f7cb" containerName="barbican-api-log" Mar 20 08:46:04 crc kubenswrapper[4903]: E0320 08:46:04.914679 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="482e4efe-5d05-4aa5-b77c-a366193c4b50" containerName="cinder-api-log" Mar 20 08:46:04 crc kubenswrapper[4903]: I0320 08:46:04.914685 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="482e4efe-5d05-4aa5-b77c-a366193c4b50" containerName="cinder-api-log" Mar 20 08:46:04 crc kubenswrapper[4903]: I0320 08:46:04.914870 4903 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="079887bb-d4be-4d23-bf74-3332bfd2f7cb" containerName="barbican-api" Mar 20 08:46:04 crc kubenswrapper[4903]: I0320 08:46:04.914886 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="482e4efe-5d05-4aa5-b77c-a366193c4b50" containerName="cinder-api-log" Mar 20 08:46:04 crc kubenswrapper[4903]: I0320 08:46:04.914897 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="482e4efe-5d05-4aa5-b77c-a366193c4b50" containerName="cinder-api" Mar 20 08:46:04 crc kubenswrapper[4903]: I0320 08:46:04.914910 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="079887bb-d4be-4d23-bf74-3332bfd2f7cb" containerName="barbican-api-log" Mar 20 08:46:04 crc kubenswrapper[4903]: I0320 08:46:04.915891 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-fbdb5-dn8w8" Mar 20 08:46:04 crc kubenswrapper[4903]: I0320 08:46:04.936522 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-fbdb5-dn8w8"] Mar 20 08:46:04 crc kubenswrapper[4903]: I0320 08:46:04.994806 4903 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-7b9df974b5-rb8w6" podUID="3e64d519-16d2-48d3-8683-9da61bd19e2d" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.155:9696/\": read tcp 10.217.0.2:35814->10.217.0.155:9696: read: connection reset by peer" Mar 20 08:46:05 crc kubenswrapper[4903]: I0320 08:46:05.034306 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566606-6n2pc" Mar 20 08:46:05 crc kubenswrapper[4903]: I0320 08:46:05.098914 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0790ef46-b8b6-4d5e-98a8-06319c232264-public-tls-certs\") pod \"neutron-fbdb5-dn8w8\" (UID: \"0790ef46-b8b6-4d5e-98a8-06319c232264\") " pod="openstack/neutron-fbdb5-dn8w8" Mar 20 08:46:05 crc kubenswrapper[4903]: I0320 08:46:05.102099 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0790ef46-b8b6-4d5e-98a8-06319c232264-combined-ca-bundle\") pod \"neutron-fbdb5-dn8w8\" (UID: \"0790ef46-b8b6-4d5e-98a8-06319c232264\") " pod="openstack/neutron-fbdb5-dn8w8" Mar 20 08:46:05 crc kubenswrapper[4903]: I0320 08:46:05.102173 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/0790ef46-b8b6-4d5e-98a8-06319c232264-config\") pod \"neutron-fbdb5-dn8w8\" (UID: \"0790ef46-b8b6-4d5e-98a8-06319c232264\") " pod="openstack/neutron-fbdb5-dn8w8" Mar 20 08:46:05 crc kubenswrapper[4903]: I0320 08:46:05.102206 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/0790ef46-b8b6-4d5e-98a8-06319c232264-httpd-config\") pod \"neutron-fbdb5-dn8w8\" (UID: \"0790ef46-b8b6-4d5e-98a8-06319c232264\") " pod="openstack/neutron-fbdb5-dn8w8" Mar 20 08:46:05 crc kubenswrapper[4903]: I0320 08:46:05.102244 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-smrlj\" (UniqueName: \"kubernetes.io/projected/0790ef46-b8b6-4d5e-98a8-06319c232264-kube-api-access-smrlj\") pod \"neutron-fbdb5-dn8w8\" (UID: \"0790ef46-b8b6-4d5e-98a8-06319c232264\") " pod="openstack/neutron-fbdb5-dn8w8" Mar 20 08:46:05 crc 
kubenswrapper[4903]: I0320 08:46:05.102385 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0790ef46-b8b6-4d5e-98a8-06319c232264-internal-tls-certs\") pod \"neutron-fbdb5-dn8w8\" (UID: \"0790ef46-b8b6-4d5e-98a8-06319c232264\") " pod="openstack/neutron-fbdb5-dn8w8" Mar 20 08:46:05 crc kubenswrapper[4903]: I0320 08:46:05.102445 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0790ef46-b8b6-4d5e-98a8-06319c232264-ovndb-tls-certs\") pod \"neutron-fbdb5-dn8w8\" (UID: \"0790ef46-b8b6-4d5e-98a8-06319c232264\") " pod="openstack/neutron-fbdb5-dn8w8" Mar 20 08:46:05 crc kubenswrapper[4903]: I0320 08:46:05.204122 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rvnpm\" (UniqueName: \"kubernetes.io/projected/70a601b8-5c2b-4b68-a9b1-d1434eab6965-kube-api-access-rvnpm\") pod \"70a601b8-5c2b-4b68-a9b1-d1434eab6965\" (UID: \"70a601b8-5c2b-4b68-a9b1-d1434eab6965\") " Mar 20 08:46:05 crc kubenswrapper[4903]: I0320 08:46:05.204451 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/0790ef46-b8b6-4d5e-98a8-06319c232264-httpd-config\") pod \"neutron-fbdb5-dn8w8\" (UID: \"0790ef46-b8b6-4d5e-98a8-06319c232264\") " pod="openstack/neutron-fbdb5-dn8w8" Mar 20 08:46:05 crc kubenswrapper[4903]: I0320 08:46:05.204486 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-smrlj\" (UniqueName: \"kubernetes.io/projected/0790ef46-b8b6-4d5e-98a8-06319c232264-kube-api-access-smrlj\") pod \"neutron-fbdb5-dn8w8\" (UID: \"0790ef46-b8b6-4d5e-98a8-06319c232264\") " pod="openstack/neutron-fbdb5-dn8w8" Mar 20 08:46:05 crc kubenswrapper[4903]: I0320 08:46:05.204596 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0790ef46-b8b6-4d5e-98a8-06319c232264-internal-tls-certs\") pod \"neutron-fbdb5-dn8w8\" (UID: \"0790ef46-b8b6-4d5e-98a8-06319c232264\") " pod="openstack/neutron-fbdb5-dn8w8" Mar 20 08:46:05 crc kubenswrapper[4903]: I0320 08:46:05.204641 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0790ef46-b8b6-4d5e-98a8-06319c232264-ovndb-tls-certs\") pod \"neutron-fbdb5-dn8w8\" (UID: \"0790ef46-b8b6-4d5e-98a8-06319c232264\") " pod="openstack/neutron-fbdb5-dn8w8" Mar 20 08:46:05 crc kubenswrapper[4903]: I0320 08:46:05.204676 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0790ef46-b8b6-4d5e-98a8-06319c232264-public-tls-certs\") pod \"neutron-fbdb5-dn8w8\" (UID: \"0790ef46-b8b6-4d5e-98a8-06319c232264\") " pod="openstack/neutron-fbdb5-dn8w8" Mar 20 08:46:05 crc kubenswrapper[4903]: I0320 08:46:05.204732 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0790ef46-b8b6-4d5e-98a8-06319c232264-combined-ca-bundle\") pod \"neutron-fbdb5-dn8w8\" (UID: \"0790ef46-b8b6-4d5e-98a8-06319c232264\") " pod="openstack/neutron-fbdb5-dn8w8" Mar 20 08:46:05 crc kubenswrapper[4903]: I0320 08:46:05.204755 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/secret/0790ef46-b8b6-4d5e-98a8-06319c232264-config\") pod \"neutron-fbdb5-dn8w8\" (UID: \"0790ef46-b8b6-4d5e-98a8-06319c232264\") " pod="openstack/neutron-fbdb5-dn8w8" Mar 20 08:46:05 crc kubenswrapper[4903]: I0320 08:46:05.211834 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/70a601b8-5c2b-4b68-a9b1-d1434eab6965-kube-api-access-rvnpm" (OuterVolumeSpecName: "kube-api-access-rvnpm") pod "70a601b8-5c2b-4b68-a9b1-d1434eab6965" (UID: "70a601b8-5c2b-4b68-a9b1-d1434eab6965"). InnerVolumeSpecName "kube-api-access-rvnpm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:46:05 crc kubenswrapper[4903]: I0320 08:46:05.213088 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0790ef46-b8b6-4d5e-98a8-06319c232264-ovndb-tls-certs\") pod \"neutron-fbdb5-dn8w8\" (UID: \"0790ef46-b8b6-4d5e-98a8-06319c232264\") " pod="openstack/neutron-fbdb5-dn8w8" Mar 20 08:46:05 crc kubenswrapper[4903]: I0320 08:46:05.213615 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/0790ef46-b8b6-4d5e-98a8-06319c232264-config\") pod \"neutron-fbdb5-dn8w8\" (UID: \"0790ef46-b8b6-4d5e-98a8-06319c232264\") " pod="openstack/neutron-fbdb5-dn8w8" Mar 20 08:46:05 crc kubenswrapper[4903]: I0320 08:46:05.213754 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0790ef46-b8b6-4d5e-98a8-06319c232264-public-tls-certs\") pod \"neutron-fbdb5-dn8w8\" (UID: \"0790ef46-b8b6-4d5e-98a8-06319c232264\") " pod="openstack/neutron-fbdb5-dn8w8" Mar 20 08:46:05 crc kubenswrapper[4903]: I0320 08:46:05.214484 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/0790ef46-b8b6-4d5e-98a8-06319c232264-httpd-config\") pod \"neutron-fbdb5-dn8w8\" (UID: \"0790ef46-b8b6-4d5e-98a8-06319c232264\") " pod="openstack/neutron-fbdb5-dn8w8" Mar 20 08:46:05 crc kubenswrapper[4903]: I0320 08:46:05.216940 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0790ef46-b8b6-4d5e-98a8-06319c232264-combined-ca-bundle\") pod \"neutron-fbdb5-dn8w8\" (UID: \"0790ef46-b8b6-4d5e-98a8-06319c232264\") " pod="openstack/neutron-fbdb5-dn8w8" Mar 20 08:46:05 crc kubenswrapper[4903]: I0320 08:46:05.222276 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0790ef46-b8b6-4d5e-98a8-06319c232264-internal-tls-certs\") pod \"neutron-fbdb5-dn8w8\" (UID: \"0790ef46-b8b6-4d5e-98a8-06319c232264\") " pod="openstack/neutron-fbdb5-dn8w8" Mar 20 08:46:05 crc kubenswrapper[4903]: I0320 08:46:05.235551 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-smrlj\" (UniqueName: \"kubernetes.io/projected/0790ef46-b8b6-4d5e-98a8-06319c232264-kube-api-access-smrlj\") pod \"neutron-fbdb5-dn8w8\" (UID: \"0790ef46-b8b6-4d5e-98a8-06319c232264\") " pod="openstack/neutron-fbdb5-dn8w8" Mar 20 08:46:05 crc kubenswrapper[4903]: I0320 08:46:05.240232 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-fbdb5-dn8w8" Mar 20 08:46:05 crc kubenswrapper[4903]: I0320 08:46:05.306798 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rvnpm\" (UniqueName: \"kubernetes.io/projected/70a601b8-5c2b-4b68-a9b1-d1434eab6965-kube-api-access-rvnpm\") on node \"crc\" DevicePath \"\"" Mar 20 08:46:05 crc kubenswrapper[4903]: I0320 08:46:05.522705 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="079887bb-d4be-4d23-bf74-3332bfd2f7cb" path="/var/lib/kubelet/pods/079887bb-d4be-4d23-bf74-3332bfd2f7cb/volumes" Mar 20 08:46:05 crc kubenswrapper[4903]: I0320 08:46:05.556910 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566606-6n2pc" event={"ID":"70a601b8-5c2b-4b68-a9b1-d1434eab6965","Type":"ContainerDied","Data":"80cc914951c73153c76b2048acdd1cf15c6ab8c3747b9acc00f0046794552a86"} Mar 20 08:46:05 crc kubenswrapper[4903]: I0320 08:46:05.556962 4903 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="80cc914951c73153c76b2048acdd1cf15c6ab8c3747b9acc00f0046794552a86" Mar 20 08:46:05 crc kubenswrapper[4903]: I0320 08:46:05.557072 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566606-6n2pc" Mar 20 08:46:05 crc kubenswrapper[4903]: I0320 08:46:05.567444 4903 generic.go:334] "Generic (PLEG): container finished" podID="3e64d519-16d2-48d3-8683-9da61bd19e2d" containerID="a80ef4791d38f5c70e3b66329fee2898f3f9305859e05349a8ed7e0e8454b362" exitCode=0 Mar 20 08:46:05 crc kubenswrapper[4903]: I0320 08:46:05.567557 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7b9df974b5-rb8w6" event={"ID":"3e64d519-16d2-48d3-8683-9da61bd19e2d","Type":"ContainerDied","Data":"a80ef4791d38f5c70e3b66329fee2898f3f9305859e05349a8ed7e0e8454b362"} Mar 20 08:46:05 crc kubenswrapper[4903]: I0320 08:46:05.796172 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-fbdb5-dn8w8"] Mar 20 08:46:05 crc kubenswrapper[4903]: W0320 08:46:05.802308 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0790ef46_b8b6_4d5e_98a8_06319c232264.slice/crio-8e4f58e3b4585e11586634c29cf539f6d18961de4f85aa15be6d736846a8f499 WatchSource:0}: Error finding container 8e4f58e3b4585e11586634c29cf539f6d18961de4f85aa15be6d736846a8f499: Status 404 returned error can't find the container with id 8e4f58e3b4585e11586634c29cf539f6d18961de4f85aa15be6d736846a8f499 Mar 20 08:46:06 crc kubenswrapper[4903]: I0320 08:46:06.109869 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566600-hz2sg"] Mar 20 08:46:06 crc kubenswrapper[4903]: I0320 08:46:06.126023 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566600-hz2sg"] Mar 20 08:46:06 crc kubenswrapper[4903]: I0320 08:46:06.270127 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-v6rfn" Mar 20 08:46:06 crc kubenswrapper[4903]: I0320 08:46:06.270196 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-v6rfn" Mar 20 08:46:06 crc kubenswrapper[4903]: I0320 08:46:06.579898 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"b603178f-e90b-4e08-ad82-0d15ddc32844","Type":"ContainerStarted","Data":"3de4e0ca36bed2a2e1297655dc8fccc869c47de254efd321c197a432fa840a10"} Mar 20 08:46:06 crc kubenswrapper[4903]: I0320 08:46:06.580411 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 20 08:46:06 crc kubenswrapper[4903]: I0320 08:46:06.582558 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-fbdb5-dn8w8" event={"ID":"0790ef46-b8b6-4d5e-98a8-06319c232264","Type":"ContainerStarted","Data":"e720dfe8b4aae682033533f903e6ba534df6bbb629a2136de2d974617f1cbb66"} Mar 20 08:46:06 crc kubenswrapper[4903]: I0320 08:46:06.582584 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-fbdb5-dn8w8" event={"ID":"0790ef46-b8b6-4d5e-98a8-06319c232264","Type":"ContainerStarted","Data":"492d0160f992341b5c4f630fecea542dfc91408228e35e56c2fb187fb62dd5de"} Mar 20 08:46:06 crc kubenswrapper[4903]: I0320 08:46:06.582619 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-fbdb5-dn8w8" event={"ID":"0790ef46-b8b6-4d5e-98a8-06319c232264","Type":"ContainerStarted","Data":"8e4f58e3b4585e11586634c29cf539f6d18961de4f85aa15be6d736846a8f499"} Mar 20 08:46:06 crc kubenswrapper[4903]: I0320 08:46:06.582722 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-fbdb5-dn8w8" Mar 20 08:46:06 crc kubenswrapper[4903]: I0320 08:46:06.615713 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.984432371 podStartE2EDuration="7.615691499s" podCreationTimestamp="2026-03-20 08:45:59 +0000 UTC" firstStartedPulling="2026-03-20 08:46:00.747110435 +0000 UTC m=+1385.964010750" lastFinishedPulling="2026-03-20 08:46:05.378369563 +0000 UTC m=+1390.595269878" observedRunningTime="2026-03-20 08:46:06.611553002 +0000 UTC m=+1391.828453327" watchObservedRunningTime="2026-03-20 08:46:06.615691499 +0000 UTC m=+1391.832591814" Mar 20 08:46:06 crc kubenswrapper[4903]: I0320 08:46:06.642218 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-fbdb5-dn8w8" podStartSLOduration=2.642196565 podStartE2EDuration="2.642196565s" podCreationTimestamp="2026-03-20 08:46:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:46:06.631640079 +0000 UTC m=+1391.848540404" watchObservedRunningTime="2026-03-20 08:46:06.642196565 +0000 UTC m=+1391.859096880" Mar 20 08:46:07 crc kubenswrapper[4903]: I0320 08:46:07.150302 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5c9776ccc5-9g5kv" Mar 20 08:46:07 crc kubenswrapper[4903]: I0320 08:46:07.236840 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-qfmx5"] Mar 20 08:46:07 crc kubenswrapper[4903]: I0320 08:46:07.237116 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-85ff748b95-qfmx5" podUID="8946b4f4-2d1b-44b8-9f54-ec31fe82ac58" containerName="dnsmasq-dns" containerID="cri-o://4917ba93045126a4576095ac35dd6c5a6ea23ae385611d7fc0c26007e0dde166" gracePeriod=10 Mar 20 08:46:07 crc kubenswrapper[4903]: I0320 08:46:07.252761 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Mar 20 08:46:07 crc kubenswrapper[4903]: I0320 08:46:07.332300 4903 prober.go:107] "Probe failed" 
probeType="Startup" pod="openshift-marketplace/certified-operators-v6rfn" podUID="d46db123-b34a-40fe-b42d-846f9788745b" containerName="registry-server" probeResult="failure" output=< Mar 20 08:46:07 crc kubenswrapper[4903]: timeout: failed to connect service ":50051" within 1s Mar 20 08:46:07 crc kubenswrapper[4903]: > Mar 20 08:46:07 crc kubenswrapper[4903]: I0320 08:46:07.505043 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5bac3c78-5b74-45b3-99a8-72d6f8b75d6c" path="/var/lib/kubelet/pods/5bac3c78-5b74-45b3-99a8-72d6f8b75d6c/volumes" Mar 20 08:46:07 crc kubenswrapper[4903]: I0320 08:46:07.550574 4903 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-7b9df974b5-rb8w6" podUID="3e64d519-16d2-48d3-8683-9da61bd19e2d" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.155:9696/\": dial tcp 10.217.0.155:9696: connect: connection refused" Mar 20 08:46:07 crc kubenswrapper[4903]: I0320 08:46:07.562308 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Mar 20 08:46:07 crc kubenswrapper[4903]: I0320 08:46:07.618516 4903 generic.go:334] "Generic (PLEG): container finished" podID="8946b4f4-2d1b-44b8-9f54-ec31fe82ac58" containerID="4917ba93045126a4576095ac35dd6c5a6ea23ae385611d7fc0c26007e0dde166" exitCode=0 Mar 20 08:46:07 crc kubenswrapper[4903]: I0320 08:46:07.620335 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-qfmx5" event={"ID":"8946b4f4-2d1b-44b8-9f54-ec31fe82ac58","Type":"ContainerDied","Data":"4917ba93045126a4576095ac35dd6c5a6ea23ae385611d7fc0c26007e0dde166"} Mar 20 08:46:07 crc kubenswrapper[4903]: I0320 08:46:07.711778 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 20 08:46:07 crc kubenswrapper[4903]: I0320 08:46:07.926253 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-qfmx5" Mar 20 08:46:08 crc kubenswrapper[4903]: I0320 08:46:08.092757 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8946b4f4-2d1b-44b8-9f54-ec31fe82ac58-dns-swift-storage-0\") pod \"8946b4f4-2d1b-44b8-9f54-ec31fe82ac58\" (UID: \"8946b4f4-2d1b-44b8-9f54-ec31fe82ac58\") " Mar 20 08:46:08 crc kubenswrapper[4903]: I0320 08:46:08.092924 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8946b4f4-2d1b-44b8-9f54-ec31fe82ac58-ovsdbserver-sb\") pod \"8946b4f4-2d1b-44b8-9f54-ec31fe82ac58\" (UID: \"8946b4f4-2d1b-44b8-9f54-ec31fe82ac58\") " Mar 20 08:46:08 crc kubenswrapper[4903]: I0320 08:46:08.093012 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f26x5\" (UniqueName: \"kubernetes.io/projected/8946b4f4-2d1b-44b8-9f54-ec31fe82ac58-kube-api-access-f26x5\") pod \"8946b4f4-2d1b-44b8-9f54-ec31fe82ac58\" (UID: \"8946b4f4-2d1b-44b8-9f54-ec31fe82ac58\") " Mar 20 08:46:08 crc kubenswrapper[4903]: I0320 08:46:08.093056 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8946b4f4-2d1b-44b8-9f54-ec31fe82ac58-ovsdbserver-nb\") pod \"8946b4f4-2d1b-44b8-9f54-ec31fe82ac58\" (UID: \"8946b4f4-2d1b-44b8-9f54-ec31fe82ac58\") " Mar 20 08:46:08 crc kubenswrapper[4903]: I0320 08:46:08.093080 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8946b4f4-2d1b-44b8-9f54-ec31fe82ac58-dns-svc\") pod \"8946b4f4-2d1b-44b8-9f54-ec31fe82ac58\" (UID: \"8946b4f4-2d1b-44b8-9f54-ec31fe82ac58\") " Mar 20 08:46:08 crc kubenswrapper[4903]: I0320 08:46:08.093372 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8946b4f4-2d1b-44b8-9f54-ec31fe82ac58-config\") pod \"8946b4f4-2d1b-44b8-9f54-ec31fe82ac58\" (UID: \"8946b4f4-2d1b-44b8-9f54-ec31fe82ac58\") " Mar 20 08:46:08 crc kubenswrapper[4903]: I0320 08:46:08.099938 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8946b4f4-2d1b-44b8-9f54-ec31fe82ac58-kube-api-access-f26x5" (OuterVolumeSpecName: "kube-api-access-f26x5") pod "8946b4f4-2d1b-44b8-9f54-ec31fe82ac58" (UID: "8946b4f4-2d1b-44b8-9f54-ec31fe82ac58"). InnerVolumeSpecName "kube-api-access-f26x5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:46:08 crc kubenswrapper[4903]: I0320 08:46:08.142856 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8946b4f4-2d1b-44b8-9f54-ec31fe82ac58-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "8946b4f4-2d1b-44b8-9f54-ec31fe82ac58" (UID: "8946b4f4-2d1b-44b8-9f54-ec31fe82ac58"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:46:08 crc kubenswrapper[4903]: I0320 08:46:08.146568 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8946b4f4-2d1b-44b8-9f54-ec31fe82ac58-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "8946b4f4-2d1b-44b8-9f54-ec31fe82ac58" (UID: "8946b4f4-2d1b-44b8-9f54-ec31fe82ac58"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:46:08 crc kubenswrapper[4903]: I0320 08:46:08.155124 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8946b4f4-2d1b-44b8-9f54-ec31fe82ac58-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "8946b4f4-2d1b-44b8-9f54-ec31fe82ac58" (UID: "8946b4f4-2d1b-44b8-9f54-ec31fe82ac58"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:46:08 crc kubenswrapper[4903]: I0320 08:46:08.159242 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8946b4f4-2d1b-44b8-9f54-ec31fe82ac58-config" (OuterVolumeSpecName: "config") pod "8946b4f4-2d1b-44b8-9f54-ec31fe82ac58" (UID: "8946b4f4-2d1b-44b8-9f54-ec31fe82ac58"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:46:08 crc kubenswrapper[4903]: I0320 08:46:08.163921 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8946b4f4-2d1b-44b8-9f54-ec31fe82ac58-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "8946b4f4-2d1b-44b8-9f54-ec31fe82ac58" (UID: "8946b4f4-2d1b-44b8-9f54-ec31fe82ac58"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:46:08 crc kubenswrapper[4903]: I0320 08:46:08.196987 4903 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8946b4f4-2d1b-44b8-9f54-ec31fe82ac58-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 20 08:46:08 crc kubenswrapper[4903]: I0320 08:46:08.197062 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f26x5\" (UniqueName: \"kubernetes.io/projected/8946b4f4-2d1b-44b8-9f54-ec31fe82ac58-kube-api-access-f26x5\") on node \"crc\" DevicePath \"\"" Mar 20 08:46:08 crc kubenswrapper[4903]: I0320 08:46:08.197080 4903 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8946b4f4-2d1b-44b8-9f54-ec31fe82ac58-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 20 08:46:08 crc kubenswrapper[4903]: I0320 08:46:08.197091 4903 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8946b4f4-2d1b-44b8-9f54-ec31fe82ac58-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 08:46:08 crc kubenswrapper[4903]: I0320 08:46:08.197103 4903 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8946b4f4-2d1b-44b8-9f54-ec31fe82ac58-config\") on node \"crc\" DevicePath \"\"" Mar 20 08:46:08 crc kubenswrapper[4903]: I0320 08:46:08.197114 4903 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8946b4f4-2d1b-44b8-9f54-ec31fe82ac58-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 20 08:46:08 crc kubenswrapper[4903]: I0320 08:46:08.330212 4903 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-75dfbf8d4b-8vdlx" podUID="079887bb-d4be-4d23-bf74-3332bfd2f7cb" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.163:9311/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 20 08:46:08 crc kubenswrapper[4903]: I0320 08:46:08.331299 4903 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-75dfbf8d4b-8vdlx" 
podUID="079887bb-d4be-4d23-bf74-3332bfd2f7cb" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.163:9311/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 20 08:46:08 crc kubenswrapper[4903]: I0320 08:46:08.628897 4903 generic.go:334] "Generic (PLEG): container finished" podID="3e64d519-16d2-48d3-8683-9da61bd19e2d" containerID="637459508e1cd8681366e71e955fbcb1a32426e408ddaa26278208437c4f5836" exitCode=0 Mar 20 08:46:08 crc kubenswrapper[4903]: I0320 08:46:08.629007 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7b9df974b5-rb8w6" event={"ID":"3e64d519-16d2-48d3-8683-9da61bd19e2d","Type":"ContainerDied","Data":"637459508e1cd8681366e71e955fbcb1a32426e408ddaa26278208437c4f5836"} Mar 20 08:46:08 crc kubenswrapper[4903]: I0320 08:46:08.632241 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-qfmx5" event={"ID":"8946b4f4-2d1b-44b8-9f54-ec31fe82ac58","Type":"ContainerDied","Data":"d73153e2593a92dafe655325ce26f0dde10c73923a32d2d03f31a40267ba46a5"} Mar 20 08:46:08 crc kubenswrapper[4903]: I0320 08:46:08.632322 4903 scope.go:117] "RemoveContainer" containerID="4917ba93045126a4576095ac35dd6c5a6ea23ae385611d7fc0c26007e0dde166" Mar 20 08:46:08 crc kubenswrapper[4903]: I0320 08:46:08.632263 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-qfmx5" Mar 20 08:46:08 crc kubenswrapper[4903]: I0320 08:46:08.632402 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="c84a69c5-84f8-42d3-a383-4f1c5eb17b5e" containerName="probe" containerID="cri-o://933978e2906da038007c8ad14bd46d4be5b953876f55ae360bb56b911f566bc7" gracePeriod=30 Mar 20 08:46:08 crc kubenswrapper[4903]: I0320 08:46:08.632331 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="c84a69c5-84f8-42d3-a383-4f1c5eb17b5e" containerName="cinder-scheduler" containerID="cri-o://03ad1247b852e4e4690a886d8f35bed0d3d3270db41df1c55347d0c9b234a7fd" gracePeriod=30 Mar 20 08:46:08 crc kubenswrapper[4903]: I0320 08:46:08.677562 4903 scope.go:117] "RemoveContainer" containerID="a3bb6d417d676ee314560e777802fd468275b6c788eae6b960a25872b3a735de" Mar 20 08:46:08 crc kubenswrapper[4903]: I0320 08:46:08.682998 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-qfmx5"] Mar 20 08:46:08 crc kubenswrapper[4903]: I0320 08:46:08.699815 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-qfmx5"] Mar 20 08:46:08 crc kubenswrapper[4903]: I0320 08:46:08.964063 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-7b9df974b5-rb8w6" Mar 20 08:46:09 crc kubenswrapper[4903]: I0320 08:46:09.116678 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/3e64d519-16d2-48d3-8683-9da61bd19e2d-config\") pod \"3e64d519-16d2-48d3-8683-9da61bd19e2d\" (UID: \"3e64d519-16d2-48d3-8683-9da61bd19e2d\") " Mar 20 08:46:09 crc kubenswrapper[4903]: I0320 08:46:09.117260 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xgvxj\" (UniqueName: \"kubernetes.io/projected/3e64d519-16d2-48d3-8683-9da61bd19e2d-kube-api-access-xgvxj\") pod \"3e64d519-16d2-48d3-8683-9da61bd19e2d\" (UID: \"3e64d519-16d2-48d3-8683-9da61bd19e2d\") " Mar 20 08:46:09 crc kubenswrapper[4903]: I0320 08:46:09.118093 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e64d519-16d2-48d3-8683-9da61bd19e2d-public-tls-certs\") pod \"3e64d519-16d2-48d3-8683-9da61bd19e2d\" (UID: \"3e64d519-16d2-48d3-8683-9da61bd19e2d\") " Mar 20 08:46:09 crc kubenswrapper[4903]: I0320 08:46:09.118147 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e64d519-16d2-48d3-8683-9da61bd19e2d-ovndb-tls-certs\") pod \"3e64d519-16d2-48d3-8683-9da61bd19e2d\" (UID: \"3e64d519-16d2-48d3-8683-9da61bd19e2d\") " Mar 20 08:46:09 crc kubenswrapper[4903]: I0320 08:46:09.118202 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/3e64d519-16d2-48d3-8683-9da61bd19e2d-httpd-config\") pod \"3e64d519-16d2-48d3-8683-9da61bd19e2d\" (UID: \"3e64d519-16d2-48d3-8683-9da61bd19e2d\") " Mar 20 08:46:09 crc kubenswrapper[4903]: I0320 08:46:09.118321 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e64d519-16d2-48d3-8683-9da61bd19e2d-combined-ca-bundle\") pod \"3e64d519-16d2-48d3-8683-9da61bd19e2d\" (UID: \"3e64d519-16d2-48d3-8683-9da61bd19e2d\") " Mar 20 08:46:09 crc kubenswrapper[4903]: I0320 08:46:09.118393 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e64d519-16d2-48d3-8683-9da61bd19e2d-internal-tls-certs\") pod \"3e64d519-16d2-48d3-8683-9da61bd19e2d\" (UID: \"3e64d519-16d2-48d3-8683-9da61bd19e2d\") " Mar 20 08:46:09 crc kubenswrapper[4903]: I0320 08:46:09.134678 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e64d519-16d2-48d3-8683-9da61bd19e2d-kube-api-access-xgvxj" (OuterVolumeSpecName: "kube-api-access-xgvxj") pod "3e64d519-16d2-48d3-8683-9da61bd19e2d" (UID: "3e64d519-16d2-48d3-8683-9da61bd19e2d"). InnerVolumeSpecName "kube-api-access-xgvxj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:46:09 crc kubenswrapper[4903]: I0320 08:46:09.137026 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e64d519-16d2-48d3-8683-9da61bd19e2d-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "3e64d519-16d2-48d3-8683-9da61bd19e2d" (UID: "3e64d519-16d2-48d3-8683-9da61bd19e2d"). InnerVolumeSpecName "httpd-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:46:09 crc kubenswrapper[4903]: I0320 08:46:09.181125 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e64d519-16d2-48d3-8683-9da61bd19e2d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3e64d519-16d2-48d3-8683-9da61bd19e2d" (UID: "3e64d519-16d2-48d3-8683-9da61bd19e2d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:46:09 crc kubenswrapper[4903]: I0320 08:46:09.183779 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e64d519-16d2-48d3-8683-9da61bd19e2d-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "3e64d519-16d2-48d3-8683-9da61bd19e2d" (UID: "3e64d519-16d2-48d3-8683-9da61bd19e2d"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:46:09 crc kubenswrapper[4903]: I0320 08:46:09.190702 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e64d519-16d2-48d3-8683-9da61bd19e2d-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "3e64d519-16d2-48d3-8683-9da61bd19e2d" (UID: "3e64d519-16d2-48d3-8683-9da61bd19e2d"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:46:09 crc kubenswrapper[4903]: I0320 08:46:09.198881 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e64d519-16d2-48d3-8683-9da61bd19e2d-config" (OuterVolumeSpecName: "config") pod "3e64d519-16d2-48d3-8683-9da61bd19e2d" (UID: "3e64d519-16d2-48d3-8683-9da61bd19e2d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:46:09 crc kubenswrapper[4903]: I0320 08:46:09.209588 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e64d519-16d2-48d3-8683-9da61bd19e2d-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "3e64d519-16d2-48d3-8683-9da61bd19e2d" (UID: "3e64d519-16d2-48d3-8683-9da61bd19e2d"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:46:09 crc kubenswrapper[4903]: I0320 08:46:09.220327 4903 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e64d519-16d2-48d3-8683-9da61bd19e2d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 08:46:09 crc kubenswrapper[4903]: I0320 08:46:09.220384 4903 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e64d519-16d2-48d3-8683-9da61bd19e2d-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 08:46:09 crc kubenswrapper[4903]: I0320 08:46:09.220396 4903 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/3e64d519-16d2-48d3-8683-9da61bd19e2d-config\") on node \"crc\" DevicePath \"\"" Mar 20 08:46:09 crc kubenswrapper[4903]: I0320 08:46:09.220410 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xgvxj\" (UniqueName: \"kubernetes.io/projected/3e64d519-16d2-48d3-8683-9da61bd19e2d-kube-api-access-xgvxj\") on node \"crc\" DevicePath \"\"" Mar 20 08:46:09 crc kubenswrapper[4903]: I0320 08:46:09.220421 4903 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e64d519-16d2-48d3-8683-9da61bd19e2d-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 08:46:09 crc kubenswrapper[4903]: I0320 08:46:09.220432 4903 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e64d519-16d2-48d3-8683-9da61bd19e2d-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 08:46:09 crc kubenswrapper[4903]: I0320 08:46:09.220443 4903 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/3e64d519-16d2-48d3-8683-9da61bd19e2d-httpd-config\") on node \"crc\" DevicePath \"\"" Mar 20 08:46:09 crc kubenswrapper[4903]: I0320 08:46:09.504827 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8946b4f4-2d1b-44b8-9f54-ec31fe82ac58" path="/var/lib/kubelet/pods/8946b4f4-2d1b-44b8-9f54-ec31fe82ac58/volumes" Mar 20 08:46:09 crc kubenswrapper[4903]: I0320 08:46:09.642326 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7b9df974b5-rb8w6" event={"ID":"3e64d519-16d2-48d3-8683-9da61bd19e2d","Type":"ContainerDied","Data":"34d8004fc6e7c7aa2bef610216ba05203ea0e25d6015c94bd65e30c74e0bc95e"} Mar 20 08:46:09 crc kubenswrapper[4903]: I0320 08:46:09.642386 4903 scope.go:117] "RemoveContainer" containerID="a80ef4791d38f5c70e3b66329fee2898f3f9305859e05349a8ed7e0e8454b362" Mar 20 08:46:09 crc kubenswrapper[4903]: I0320 08:46:09.642521 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-7b9df974b5-rb8w6" Mar 20 08:46:09 crc kubenswrapper[4903]: I0320 08:46:09.651475 4903 generic.go:334] "Generic (PLEG): container finished" podID="c84a69c5-84f8-42d3-a383-4f1c5eb17b5e" containerID="933978e2906da038007c8ad14bd46d4be5b953876f55ae360bb56b911f566bc7" exitCode=0 Mar 20 08:46:09 crc kubenswrapper[4903]: I0320 08:46:09.651520 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"c84a69c5-84f8-42d3-a383-4f1c5eb17b5e","Type":"ContainerDied","Data":"933978e2906da038007c8ad14bd46d4be5b953876f55ae360bb56b911f566bc7"} Mar 20 08:46:09 crc kubenswrapper[4903]: I0320 08:46:09.675648 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-7b9df974b5-rb8w6"] Mar 20 08:46:09 crc kubenswrapper[4903]: I0320 08:46:09.682554 4903 scope.go:117] "RemoveContainer" containerID="637459508e1cd8681366e71e955fbcb1a32426e408ddaa26278208437c4f5836" Mar 20 08:46:09 crc kubenswrapper[4903]: I0320 08:46:09.686169 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-7b9df974b5-rb8w6"] Mar 20 08:46:10 crc kubenswrapper[4903]: I0320 08:46:10.318116 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-78d46df76d-rh79h" Mar 20 08:46:10 crc kubenswrapper[4903]: I0320 08:46:10.366454 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-78d46df76d-rh79h" Mar 20 08:46:10 crc kubenswrapper[4903]: I0320 08:46:10.709486 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-669bcbb856-w87fq"] Mar 20 08:46:10 crc kubenswrapper[4903]: E0320 08:46:10.710444 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70a601b8-5c2b-4b68-a9b1-d1434eab6965" containerName="oc" Mar 20 08:46:10 crc kubenswrapper[4903]: I0320 08:46:10.710461 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="70a601b8-5c2b-4b68-a9b1-d1434eab6965" containerName="oc" Mar 20 08:46:10 crc kubenswrapper[4903]: E0320 08:46:10.710519 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e64d519-16d2-48d3-8683-9da61bd19e2d" containerName="neutron-api" Mar 20 08:46:10 crc kubenswrapper[4903]: I0320 08:46:10.710533 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e64d519-16d2-48d3-8683-9da61bd19e2d" containerName="neutron-api" Mar 20 08:46:10 crc kubenswrapper[4903]: E0320 08:46:10.710556 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e64d519-16d2-48d3-8683-9da61bd19e2d" containerName="neutron-httpd" Mar 20 08:46:10 crc kubenswrapper[4903]: I0320 08:46:10.710565 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e64d519-16d2-48d3-8683-9da61bd19e2d" containerName="neutron-httpd" Mar 20 08:46:10 crc kubenswrapper[4903]: E0320 08:46:10.710586 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8946b4f4-2d1b-44b8-9f54-ec31fe82ac58" containerName="init" Mar 20 08:46:10 crc kubenswrapper[4903]: I0320 08:46:10.710592 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="8946b4f4-2d1b-44b8-9f54-ec31fe82ac58" containerName="init" Mar 20 08:46:10 crc kubenswrapper[4903]: E0320 08:46:10.710605 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8946b4f4-2d1b-44b8-9f54-ec31fe82ac58" containerName="dnsmasq-dns" Mar 20 08:46:10 crc kubenswrapper[4903]: I0320 08:46:10.710611 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="8946b4f4-2d1b-44b8-9f54-ec31fe82ac58" 
containerName="dnsmasq-dns" Mar 20 08:46:10 crc kubenswrapper[4903]: I0320 08:46:10.711177 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="70a601b8-5c2b-4b68-a9b1-d1434eab6965" containerName="oc" Mar 20 08:46:10 crc kubenswrapper[4903]: I0320 08:46:10.711239 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="8946b4f4-2d1b-44b8-9f54-ec31fe82ac58" containerName="dnsmasq-dns" Mar 20 08:46:10 crc kubenswrapper[4903]: I0320 08:46:10.711257 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e64d519-16d2-48d3-8683-9da61bd19e2d" containerName="neutron-api" Mar 20 08:46:10 crc kubenswrapper[4903]: I0320 08:46:10.711279 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e64d519-16d2-48d3-8683-9da61bd19e2d" containerName="neutron-httpd" Mar 20 08:46:10 crc kubenswrapper[4903]: I0320 08:46:10.713332 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-669bcbb856-w87fq" Mar 20 08:46:10 crc kubenswrapper[4903]: I0320 08:46:10.744174 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-669bcbb856-w87fq"] Mar 20 08:46:10 crc kubenswrapper[4903]: I0320 08:46:10.748991 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wp88c\" (UniqueName: \"kubernetes.io/projected/1dcd96a1-71bb-480c-8387-0fca4d17bf33-kube-api-access-wp88c\") pod \"placement-669bcbb856-w87fq\" (UID: \"1dcd96a1-71bb-480c-8387-0fca4d17bf33\") " pod="openstack/placement-669bcbb856-w87fq" Mar 20 08:46:10 crc kubenswrapper[4903]: I0320 08:46:10.749085 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1dcd96a1-71bb-480c-8387-0fca4d17bf33-public-tls-certs\") pod \"placement-669bcbb856-w87fq\" (UID: \"1dcd96a1-71bb-480c-8387-0fca4d17bf33\") " pod="openstack/placement-669bcbb856-w87fq" Mar 20 08:46:10 crc kubenswrapper[4903]: I0320 08:46:10.749106 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1dcd96a1-71bb-480c-8387-0fca4d17bf33-config-data\") pod \"placement-669bcbb856-w87fq\" (UID: \"1dcd96a1-71bb-480c-8387-0fca4d17bf33\") " pod="openstack/placement-669bcbb856-w87fq" Mar 20 08:46:10 crc kubenswrapper[4903]: I0320 08:46:10.749157 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1dcd96a1-71bb-480c-8387-0fca4d17bf33-logs\") pod \"placement-669bcbb856-w87fq\" (UID: \"1dcd96a1-71bb-480c-8387-0fca4d17bf33\") " pod="openstack/placement-669bcbb856-w87fq" Mar 20 08:46:10 crc kubenswrapper[4903]: I0320 08:46:10.749188 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1dcd96a1-71bb-480c-8387-0fca4d17bf33-scripts\") pod \"placement-669bcbb856-w87fq\" (UID: \"1dcd96a1-71bb-480c-8387-0fca4d17bf33\") " pod="openstack/placement-669bcbb856-w87fq" Mar 20 08:46:10 crc kubenswrapper[4903]: I0320 08:46:10.749231 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1dcd96a1-71bb-480c-8387-0fca4d17bf33-combined-ca-bundle\") pod \"placement-669bcbb856-w87fq\" (UID: \"1dcd96a1-71bb-480c-8387-0fca4d17bf33\") " 
pod="openstack/placement-669bcbb856-w87fq" Mar 20 08:46:10 crc kubenswrapper[4903]: I0320 08:46:10.749327 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1dcd96a1-71bb-480c-8387-0fca4d17bf33-internal-tls-certs\") pod \"placement-669bcbb856-w87fq\" (UID: \"1dcd96a1-71bb-480c-8387-0fca4d17bf33\") " pod="openstack/placement-669bcbb856-w87fq" Mar 20 08:46:10 crc kubenswrapper[4903]: I0320 08:46:10.851564 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1dcd96a1-71bb-480c-8387-0fca4d17bf33-scripts\") pod \"placement-669bcbb856-w87fq\" (UID: \"1dcd96a1-71bb-480c-8387-0fca4d17bf33\") " pod="openstack/placement-669bcbb856-w87fq" Mar 20 08:46:10 crc kubenswrapper[4903]: I0320 08:46:10.851645 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1dcd96a1-71bb-480c-8387-0fca4d17bf33-combined-ca-bundle\") pod \"placement-669bcbb856-w87fq\" (UID: \"1dcd96a1-71bb-480c-8387-0fca4d17bf33\") " pod="openstack/placement-669bcbb856-w87fq" Mar 20 08:46:10 crc kubenswrapper[4903]: I0320 08:46:10.851751 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1dcd96a1-71bb-480c-8387-0fca4d17bf33-internal-tls-certs\") pod \"placement-669bcbb856-w87fq\" (UID: \"1dcd96a1-71bb-480c-8387-0fca4d17bf33\") " pod="openstack/placement-669bcbb856-w87fq" Mar 20 08:46:10 crc kubenswrapper[4903]: I0320 08:46:10.851786 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wp88c\" (UniqueName: \"kubernetes.io/projected/1dcd96a1-71bb-480c-8387-0fca4d17bf33-kube-api-access-wp88c\") pod \"placement-669bcbb856-w87fq\" (UID: \"1dcd96a1-71bb-480c-8387-0fca4d17bf33\") " pod="openstack/placement-669bcbb856-w87fq" Mar 20 08:46:10 crc kubenswrapper[4903]: I0320 08:46:10.851837 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1dcd96a1-71bb-480c-8387-0fca4d17bf33-public-tls-certs\") pod \"placement-669bcbb856-w87fq\" (UID: \"1dcd96a1-71bb-480c-8387-0fca4d17bf33\") " pod="openstack/placement-669bcbb856-w87fq" Mar 20 08:46:10 crc kubenswrapper[4903]: I0320 08:46:10.851861 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1dcd96a1-71bb-480c-8387-0fca4d17bf33-config-data\") pod \"placement-669bcbb856-w87fq\" (UID: \"1dcd96a1-71bb-480c-8387-0fca4d17bf33\") " pod="openstack/placement-669bcbb856-w87fq" Mar 20 08:46:10 crc kubenswrapper[4903]: I0320 08:46:10.851908 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1dcd96a1-71bb-480c-8387-0fca4d17bf33-logs\") pod \"placement-669bcbb856-w87fq\" (UID: \"1dcd96a1-71bb-480c-8387-0fca4d17bf33\") " pod="openstack/placement-669bcbb856-w87fq" Mar 20 08:46:10 crc kubenswrapper[4903]: I0320 08:46:10.852443 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1dcd96a1-71bb-480c-8387-0fca4d17bf33-logs\") pod \"placement-669bcbb856-w87fq\" (UID: \"1dcd96a1-71bb-480c-8387-0fca4d17bf33\") " pod="openstack/placement-669bcbb856-w87fq" Mar 20 08:46:10 crc kubenswrapper[4903]: I0320 08:46:10.856302 4903 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1dcd96a1-71bb-480c-8387-0fca4d17bf33-combined-ca-bundle\") pod \"placement-669bcbb856-w87fq\" (UID: \"1dcd96a1-71bb-480c-8387-0fca4d17bf33\") " pod="openstack/placement-669bcbb856-w87fq" Mar 20 08:46:10 crc kubenswrapper[4903]: I0320 08:46:10.856766 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1dcd96a1-71bb-480c-8387-0fca4d17bf33-scripts\") pod \"placement-669bcbb856-w87fq\" (UID: \"1dcd96a1-71bb-480c-8387-0fca4d17bf33\") " pod="openstack/placement-669bcbb856-w87fq" Mar 20 08:46:10 crc kubenswrapper[4903]: I0320 08:46:10.860083 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1dcd96a1-71bb-480c-8387-0fca4d17bf33-config-data\") pod \"placement-669bcbb856-w87fq\" (UID: \"1dcd96a1-71bb-480c-8387-0fca4d17bf33\") " pod="openstack/placement-669bcbb856-w87fq" Mar 20 08:46:10 crc kubenswrapper[4903]: I0320 08:46:10.861717 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1dcd96a1-71bb-480c-8387-0fca4d17bf33-internal-tls-certs\") pod \"placement-669bcbb856-w87fq\" (UID: \"1dcd96a1-71bb-480c-8387-0fca4d17bf33\") " pod="openstack/placement-669bcbb856-w87fq" Mar 20 08:46:10 crc kubenswrapper[4903]: I0320 08:46:10.876351 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1dcd96a1-71bb-480c-8387-0fca4d17bf33-public-tls-certs\") pod \"placement-669bcbb856-w87fq\" (UID: \"1dcd96a1-71bb-480c-8387-0fca4d17bf33\") " pod="openstack/placement-669bcbb856-w87fq" Mar 20 08:46:10 crc kubenswrapper[4903]: I0320 08:46:10.897710 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wp88c\" (UniqueName: \"kubernetes.io/projected/1dcd96a1-71bb-480c-8387-0fca4d17bf33-kube-api-access-wp88c\") pod \"placement-669bcbb856-w87fq\" (UID: \"1dcd96a1-71bb-480c-8387-0fca4d17bf33\") " pod="openstack/placement-669bcbb856-w87fq" Mar 20 08:46:11 crc kubenswrapper[4903]: I0320 08:46:11.055331 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-669bcbb856-w87fq" Mar 20 08:46:11 crc kubenswrapper[4903]: I0320 08:46:11.507777 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3e64d519-16d2-48d3-8683-9da61bd19e2d" path="/var/lib/kubelet/pods/3e64d519-16d2-48d3-8683-9da61bd19e2d/volumes" Mar 20 08:46:11 crc kubenswrapper[4903]: I0320 08:46:11.561106 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-669bcbb856-w87fq"] Mar 20 08:46:11 crc kubenswrapper[4903]: I0320 08:46:11.673306 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-669bcbb856-w87fq" event={"ID":"1dcd96a1-71bb-480c-8387-0fca4d17bf33","Type":"ContainerStarted","Data":"265057ba6e83fa8c78566dd69b864204ce6db94495ba7162cf5ee90e9aac31a6"} Mar 20 08:46:12 crc kubenswrapper[4903]: I0320 08:46:12.287163 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 20 08:46:12 crc kubenswrapper[4903]: I0320 08:46:12.484941 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c84a69c5-84f8-42d3-a383-4f1c5eb17b5e-scripts\") pod \"c84a69c5-84f8-42d3-a383-4f1c5eb17b5e\" (UID: \"c84a69c5-84f8-42d3-a383-4f1c5eb17b5e\") " Mar 20 08:46:12 crc kubenswrapper[4903]: I0320 08:46:12.485018 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c84a69c5-84f8-42d3-a383-4f1c5eb17b5e-etc-machine-id\") pod \"c84a69c5-84f8-42d3-a383-4f1c5eb17b5e\" (UID: \"c84a69c5-84f8-42d3-a383-4f1c5eb17b5e\") " Mar 20 08:46:12 crc kubenswrapper[4903]: I0320 08:46:12.485117 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8x8sf\" (UniqueName: \"kubernetes.io/projected/c84a69c5-84f8-42d3-a383-4f1c5eb17b5e-kube-api-access-8x8sf\") pod \"c84a69c5-84f8-42d3-a383-4f1c5eb17b5e\" (UID: \"c84a69c5-84f8-42d3-a383-4f1c5eb17b5e\") " Mar 20 08:46:12 crc kubenswrapper[4903]: I0320 08:46:12.485264 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c84a69c5-84f8-42d3-a383-4f1c5eb17b5e-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "c84a69c5-84f8-42d3-a383-4f1c5eb17b5e" (UID: "c84a69c5-84f8-42d3-a383-4f1c5eb17b5e"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 08:46:12 crc kubenswrapper[4903]: I0320 08:46:12.486189 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c84a69c5-84f8-42d3-a383-4f1c5eb17b5e-config-data\") pod \"c84a69c5-84f8-42d3-a383-4f1c5eb17b5e\" (UID: \"c84a69c5-84f8-42d3-a383-4f1c5eb17b5e\") " Mar 20 08:46:12 crc kubenswrapper[4903]: I0320 08:46:12.486227 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c84a69c5-84f8-42d3-a383-4f1c5eb17b5e-config-data-custom\") pod \"c84a69c5-84f8-42d3-a383-4f1c5eb17b5e\" (UID: \"c84a69c5-84f8-42d3-a383-4f1c5eb17b5e\") " Mar 20 08:46:12 crc kubenswrapper[4903]: I0320 08:46:12.486264 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c84a69c5-84f8-42d3-a383-4f1c5eb17b5e-combined-ca-bundle\") pod \"c84a69c5-84f8-42d3-a383-4f1c5eb17b5e\" (UID: \"c84a69c5-84f8-42d3-a383-4f1c5eb17b5e\") " Mar 20 08:46:12 crc kubenswrapper[4903]: I0320 08:46:12.486606 4903 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c84a69c5-84f8-42d3-a383-4f1c5eb17b5e-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 20 08:46:12 crc kubenswrapper[4903]: I0320 08:46:12.493511 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c84a69c5-84f8-42d3-a383-4f1c5eb17b5e-kube-api-access-8x8sf" (OuterVolumeSpecName: "kube-api-access-8x8sf") pod "c84a69c5-84f8-42d3-a383-4f1c5eb17b5e" (UID: "c84a69c5-84f8-42d3-a383-4f1c5eb17b5e"). InnerVolumeSpecName "kube-api-access-8x8sf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:46:12 crc kubenswrapper[4903]: I0320 08:46:12.493572 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c84a69c5-84f8-42d3-a383-4f1c5eb17b5e-scripts" (OuterVolumeSpecName: "scripts") pod "c84a69c5-84f8-42d3-a383-4f1c5eb17b5e" (UID: "c84a69c5-84f8-42d3-a383-4f1c5eb17b5e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:46:12 crc kubenswrapper[4903]: I0320 08:46:12.500244 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c84a69c5-84f8-42d3-a383-4f1c5eb17b5e-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "c84a69c5-84f8-42d3-a383-4f1c5eb17b5e" (UID: "c84a69c5-84f8-42d3-a383-4f1c5eb17b5e"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:46:12 crc kubenswrapper[4903]: I0320 08:46:12.544802 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c84a69c5-84f8-42d3-a383-4f1c5eb17b5e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c84a69c5-84f8-42d3-a383-4f1c5eb17b5e" (UID: "c84a69c5-84f8-42d3-a383-4f1c5eb17b5e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:46:12 crc kubenswrapper[4903]: I0320 08:46:12.589749 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8x8sf\" (UniqueName: \"kubernetes.io/projected/c84a69c5-84f8-42d3-a383-4f1c5eb17b5e-kube-api-access-8x8sf\") on node \"crc\" DevicePath \"\"" Mar 20 08:46:12 crc kubenswrapper[4903]: I0320 08:46:12.589802 4903 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c84a69c5-84f8-42d3-a383-4f1c5eb17b5e-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 20 08:46:12 crc kubenswrapper[4903]: I0320 08:46:12.589812 4903 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c84a69c5-84f8-42d3-a383-4f1c5eb17b5e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 08:46:12 crc kubenswrapper[4903]: I0320 08:46:12.589822 4903 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c84a69c5-84f8-42d3-a383-4f1c5eb17b5e-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 08:46:12 crc kubenswrapper[4903]: I0320 08:46:12.593604 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c84a69c5-84f8-42d3-a383-4f1c5eb17b5e-config-data" (OuterVolumeSpecName: "config-data") pod "c84a69c5-84f8-42d3-a383-4f1c5eb17b5e" (UID: "c84a69c5-84f8-42d3-a383-4f1c5eb17b5e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:46:12 crc kubenswrapper[4903]: I0320 08:46:12.685341 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-669bcbb856-w87fq" event={"ID":"1dcd96a1-71bb-480c-8387-0fca4d17bf33","Type":"ContainerStarted","Data":"10f2b77e4d99df665d6349b311dc9b4cfe076067c636391f3d7e6e34202c3750"} Mar 20 08:46:12 crc kubenswrapper[4903]: I0320 08:46:12.685421 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-669bcbb856-w87fq" event={"ID":"1dcd96a1-71bb-480c-8387-0fca4d17bf33","Type":"ContainerStarted","Data":"54b0f1f5dfa2a405752216bff60fa798790887221bd217f21ee213ca02e2318b"} Mar 20 08:46:12 crc kubenswrapper[4903]: I0320 08:46:12.685542 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-669bcbb856-w87fq" Mar 20 08:46:12 crc kubenswrapper[4903]: I0320 08:46:12.692413 4903 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c84a69c5-84f8-42d3-a383-4f1c5eb17b5e-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 08:46:12 crc kubenswrapper[4903]: I0320 08:46:12.696913 4903 generic.go:334] "Generic (PLEG): container finished" podID="c84a69c5-84f8-42d3-a383-4f1c5eb17b5e" containerID="03ad1247b852e4e4690a886d8f35bed0d3d3270db41df1c55347d0c9b234a7fd" exitCode=0 Mar 20 08:46:12 crc kubenswrapper[4903]: I0320 08:46:12.696972 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"c84a69c5-84f8-42d3-a383-4f1c5eb17b5e","Type":"ContainerDied","Data":"03ad1247b852e4e4690a886d8f35bed0d3d3270db41df1c55347d0c9b234a7fd"} Mar 20 08:46:12 crc kubenswrapper[4903]: I0320 08:46:12.697012 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 20 08:46:12 crc kubenswrapper[4903]: I0320 08:46:12.697025 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"c84a69c5-84f8-42d3-a383-4f1c5eb17b5e","Type":"ContainerDied","Data":"a0c47778c1d6928c802f192c1be4632e4a3fd7b53e8bb58217093e5add3d252f"} Mar 20 08:46:12 crc kubenswrapper[4903]: I0320 08:46:12.697080 4903 scope.go:117] "RemoveContainer" containerID="933978e2906da038007c8ad14bd46d4be5b953876f55ae360bb56b911f566bc7" Mar 20 08:46:12 crc kubenswrapper[4903]: I0320 08:46:12.717497 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-669bcbb856-w87fq" podStartSLOduration=2.717476491 podStartE2EDuration="2.717476491s" podCreationTimestamp="2026-03-20 08:46:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:46:12.711437089 +0000 UTC m=+1397.928337414" watchObservedRunningTime="2026-03-20 08:46:12.717476491 +0000 UTC m=+1397.934376806" Mar 20 08:46:12 crc kubenswrapper[4903]: I0320 08:46:12.755129 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 20 08:46:12 crc kubenswrapper[4903]: I0320 08:46:12.763356 4903 scope.go:117] "RemoveContainer" containerID="03ad1247b852e4e4690a886d8f35bed0d3d3270db41df1c55347d0c9b234a7fd" Mar 20 08:46:12 crc kubenswrapper[4903]: I0320 08:46:12.772984 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 20 08:46:12 crc kubenswrapper[4903]: I0320 08:46:12.782532 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Mar 20 08:46:12 crc kubenswrapper[4903]: E0320 08:46:12.783168 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c84a69c5-84f8-42d3-a383-4f1c5eb17b5e" containerName="probe" Mar 20 08:46:12 crc kubenswrapper[4903]: I0320 08:46:12.783192 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="c84a69c5-84f8-42d3-a383-4f1c5eb17b5e" containerName="probe" Mar 20 08:46:12 crc kubenswrapper[4903]: E0320 08:46:12.783219 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c84a69c5-84f8-42d3-a383-4f1c5eb17b5e" containerName="cinder-scheduler" Mar 20 08:46:12 crc kubenswrapper[4903]: I0320 08:46:12.783226 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="c84a69c5-84f8-42d3-a383-4f1c5eb17b5e" containerName="cinder-scheduler" Mar 20 08:46:12 crc kubenswrapper[4903]: I0320 08:46:12.783432 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="c84a69c5-84f8-42d3-a383-4f1c5eb17b5e" containerName="probe" Mar 20 08:46:12 crc kubenswrapper[4903]: I0320 08:46:12.783452 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="c84a69c5-84f8-42d3-a383-4f1c5eb17b5e" containerName="cinder-scheduler" Mar 20 08:46:12 crc kubenswrapper[4903]: I0320 08:46:12.784728 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 20 08:46:12 crc kubenswrapper[4903]: I0320 08:46:12.788753 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Mar 20 08:46:12 crc kubenswrapper[4903]: I0320 08:46:12.788980 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Mar 20 08:46:12 crc kubenswrapper[4903]: I0320 08:46:12.790382 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Mar 20 08:46:12 crc kubenswrapper[4903]: I0320 08:46:12.790555 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-fm82s" Mar 20 08:46:12 crc kubenswrapper[4903]: I0320 08:46:12.792613 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 20 08:46:12 crc kubenswrapper[4903]: I0320 08:46:12.793947 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jcqnr\" (UniqueName: \"kubernetes.io/projected/9c2cbf77-9511-47a0-9bc7-dd5eba7d8d8a-kube-api-access-jcqnr\") pod \"cinder-scheduler-0\" (UID: \"9c2cbf77-9511-47a0-9bc7-dd5eba7d8d8a\") " pod="openstack/cinder-scheduler-0" Mar 20 08:46:12 crc kubenswrapper[4903]: I0320 08:46:12.793995 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c2cbf77-9511-47a0-9bc7-dd5eba7d8d8a-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"9c2cbf77-9511-47a0-9bc7-dd5eba7d8d8a\") " pod="openstack/cinder-scheduler-0" Mar 20 08:46:12 crc kubenswrapper[4903]: I0320 08:46:12.794176 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9c2cbf77-9511-47a0-9bc7-dd5eba7d8d8a-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"9c2cbf77-9511-47a0-9bc7-dd5eba7d8d8a\") " pod="openstack/cinder-scheduler-0" Mar 20 08:46:12 crc kubenswrapper[4903]: I0320 08:46:12.794209 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9c2cbf77-9511-47a0-9bc7-dd5eba7d8d8a-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"9c2cbf77-9511-47a0-9bc7-dd5eba7d8d8a\") " pod="openstack/cinder-scheduler-0" Mar 20 08:46:12 crc kubenswrapper[4903]: I0320 08:46:12.794232 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9c2cbf77-9511-47a0-9bc7-dd5eba7d8d8a-scripts\") pod \"cinder-scheduler-0\" (UID: \"9c2cbf77-9511-47a0-9bc7-dd5eba7d8d8a\") " pod="openstack/cinder-scheduler-0" Mar 20 08:46:12 crc kubenswrapper[4903]: I0320 08:46:12.794262 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9c2cbf77-9511-47a0-9bc7-dd5eba7d8d8a-config-data\") pod \"cinder-scheduler-0\" (UID: \"9c2cbf77-9511-47a0-9bc7-dd5eba7d8d8a\") " pod="openstack/cinder-scheduler-0" Mar 20 08:46:12 crc kubenswrapper[4903]: I0320 08:46:12.801145 4903 scope.go:117] "RemoveContainer" containerID="933978e2906da038007c8ad14bd46d4be5b953876f55ae360bb56b911f566bc7" Mar 20 08:46:12 crc kubenswrapper[4903]: E0320 08:46:12.802165 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = 
could not find container \"933978e2906da038007c8ad14bd46d4be5b953876f55ae360bb56b911f566bc7\": container with ID starting with 933978e2906da038007c8ad14bd46d4be5b953876f55ae360bb56b911f566bc7 not found: ID does not exist" containerID="933978e2906da038007c8ad14bd46d4be5b953876f55ae360bb56b911f566bc7" Mar 20 08:46:12 crc kubenswrapper[4903]: I0320 08:46:12.802197 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"933978e2906da038007c8ad14bd46d4be5b953876f55ae360bb56b911f566bc7"} err="failed to get container status \"933978e2906da038007c8ad14bd46d4be5b953876f55ae360bb56b911f566bc7\": rpc error: code = NotFound desc = could not find container \"933978e2906da038007c8ad14bd46d4be5b953876f55ae360bb56b911f566bc7\": container with ID starting with 933978e2906da038007c8ad14bd46d4be5b953876f55ae360bb56b911f566bc7 not found: ID does not exist" Mar 20 08:46:12 crc kubenswrapper[4903]: I0320 08:46:12.802218 4903 scope.go:117] "RemoveContainer" containerID="03ad1247b852e4e4690a886d8f35bed0d3d3270db41df1c55347d0c9b234a7fd" Mar 20 08:46:12 crc kubenswrapper[4903]: E0320 08:46:12.802732 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"03ad1247b852e4e4690a886d8f35bed0d3d3270db41df1c55347d0c9b234a7fd\": container with ID starting with 03ad1247b852e4e4690a886d8f35bed0d3d3270db41df1c55347d0c9b234a7fd not found: ID does not exist" containerID="03ad1247b852e4e4690a886d8f35bed0d3d3270db41df1c55347d0c9b234a7fd" Mar 20 08:46:12 crc kubenswrapper[4903]: I0320 08:46:12.802786 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"03ad1247b852e4e4690a886d8f35bed0d3d3270db41df1c55347d0c9b234a7fd"} err="failed to get container status \"03ad1247b852e4e4690a886d8f35bed0d3d3270db41df1c55347d0c9b234a7fd\": rpc error: code = NotFound desc = could not find container \"03ad1247b852e4e4690a886d8f35bed0d3d3270db41df1c55347d0c9b234a7fd\": container with ID starting with 03ad1247b852e4e4690a886d8f35bed0d3d3270db41df1c55347d0c9b234a7fd not found: ID does not exist" Mar 20 08:46:12 crc kubenswrapper[4903]: I0320 08:46:12.896420 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c2cbf77-9511-47a0-9bc7-dd5eba7d8d8a-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"9c2cbf77-9511-47a0-9bc7-dd5eba7d8d8a\") " pod="openstack/cinder-scheduler-0" Mar 20 08:46:12 crc kubenswrapper[4903]: I0320 08:46:12.896761 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9c2cbf77-9511-47a0-9bc7-dd5eba7d8d8a-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"9c2cbf77-9511-47a0-9bc7-dd5eba7d8d8a\") " pod="openstack/cinder-scheduler-0" Mar 20 08:46:12 crc kubenswrapper[4903]: I0320 08:46:12.896790 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9c2cbf77-9511-47a0-9bc7-dd5eba7d8d8a-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"9c2cbf77-9511-47a0-9bc7-dd5eba7d8d8a\") " pod="openstack/cinder-scheduler-0" Mar 20 08:46:12 crc kubenswrapper[4903]: I0320 08:46:12.896817 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9c2cbf77-9511-47a0-9bc7-dd5eba7d8d8a-scripts\") pod \"cinder-scheduler-0\" (UID: 
\"9c2cbf77-9511-47a0-9bc7-dd5eba7d8d8a\") " pod="openstack/cinder-scheduler-0" Mar 20 08:46:12 crc kubenswrapper[4903]: I0320 08:46:12.896854 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9c2cbf77-9511-47a0-9bc7-dd5eba7d8d8a-config-data\") pod \"cinder-scheduler-0\" (UID: \"9c2cbf77-9511-47a0-9bc7-dd5eba7d8d8a\") " pod="openstack/cinder-scheduler-0" Mar 20 08:46:12 crc kubenswrapper[4903]: I0320 08:46:12.896877 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9c2cbf77-9511-47a0-9bc7-dd5eba7d8d8a-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"9c2cbf77-9511-47a0-9bc7-dd5eba7d8d8a\") " pod="openstack/cinder-scheduler-0" Mar 20 08:46:12 crc kubenswrapper[4903]: I0320 08:46:12.896917 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jcqnr\" (UniqueName: \"kubernetes.io/projected/9c2cbf77-9511-47a0-9bc7-dd5eba7d8d8a-kube-api-access-jcqnr\") pod \"cinder-scheduler-0\" (UID: \"9c2cbf77-9511-47a0-9bc7-dd5eba7d8d8a\") " pod="openstack/cinder-scheduler-0" Mar 20 08:46:12 crc kubenswrapper[4903]: I0320 08:46:12.900656 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9c2cbf77-9511-47a0-9bc7-dd5eba7d8d8a-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"9c2cbf77-9511-47a0-9bc7-dd5eba7d8d8a\") " pod="openstack/cinder-scheduler-0" Mar 20 08:46:12 crc kubenswrapper[4903]: I0320 08:46:12.901566 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9c2cbf77-9511-47a0-9bc7-dd5eba7d8d8a-scripts\") pod \"cinder-scheduler-0\" (UID: \"9c2cbf77-9511-47a0-9bc7-dd5eba7d8d8a\") " pod="openstack/cinder-scheduler-0" Mar 20 08:46:12 crc kubenswrapper[4903]: I0320 08:46:12.903835 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9c2cbf77-9511-47a0-9bc7-dd5eba7d8d8a-config-data\") pod \"cinder-scheduler-0\" (UID: \"9c2cbf77-9511-47a0-9bc7-dd5eba7d8d8a\") " pod="openstack/cinder-scheduler-0" Mar 20 08:46:12 crc kubenswrapper[4903]: I0320 08:46:12.904384 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c2cbf77-9511-47a0-9bc7-dd5eba7d8d8a-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"9c2cbf77-9511-47a0-9bc7-dd5eba7d8d8a\") " pod="openstack/cinder-scheduler-0" Mar 20 08:46:12 crc kubenswrapper[4903]: I0320 08:46:12.916432 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jcqnr\" (UniqueName: \"kubernetes.io/projected/9c2cbf77-9511-47a0-9bc7-dd5eba7d8d8a-kube-api-access-jcqnr\") pod \"cinder-scheduler-0\" (UID: \"9c2cbf77-9511-47a0-9bc7-dd5eba7d8d8a\") " pod="openstack/cinder-scheduler-0" Mar 20 08:46:13 crc kubenswrapper[4903]: I0320 08:46:13.104715 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 20 08:46:13 crc kubenswrapper[4903]: I0320 08:46:13.503412 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c84a69c5-84f8-42d3-a383-4f1c5eb17b5e" path="/var/lib/kubelet/pods/c84a69c5-84f8-42d3-a383-4f1c5eb17b5e/volumes" Mar 20 08:46:13 crc kubenswrapper[4903]: I0320 08:46:13.625379 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 20 08:46:13 crc kubenswrapper[4903]: I0320 08:46:13.710815 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"9c2cbf77-9511-47a0-9bc7-dd5eba7d8d8a","Type":"ContainerStarted","Data":"63c548ca09d2a2f21c2d5520463bb11e088324de770769e3a18e10fd91b04979"} Mar 20 08:46:13 crc kubenswrapper[4903]: I0320 08:46:13.717246 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-669bcbb856-w87fq" Mar 20 08:46:14 crc kubenswrapper[4903]: I0320 08:46:14.299883 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-7f784d4489-rxkmk" Mar 20 08:46:14 crc kubenswrapper[4903]: I0320 08:46:14.739646 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"9c2cbf77-9511-47a0-9bc7-dd5eba7d8d8a","Type":"ContainerStarted","Data":"cb7bb133830dc4bdac2cbea0a778366365297bc4c7bc623ae8334776f5496711"} Mar 20 08:46:16 crc kubenswrapper[4903]: I0320 08:46:16.450069 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Mar 20 08:46:16 crc kubenswrapper[4903]: I0320 08:46:16.452836 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Mar 20 08:46:16 crc kubenswrapper[4903]: I0320 08:46:16.453888 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"9c2cbf77-9511-47a0-9bc7-dd5eba7d8d8a","Type":"ContainerStarted","Data":"4cb95c6e5180c8a94f8474533f104c8ca5edf99bbf830ed9f71c73b944a44ab1"} Mar 20 08:46:16 crc kubenswrapper[4903]: I0320 08:46:16.458178 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Mar 20 08:46:16 crc kubenswrapper[4903]: I0320 08:46:16.458299 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Mar 20 08:46:16 crc kubenswrapper[4903]: I0320 08:46:16.461072 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 20 08:46:16 crc kubenswrapper[4903]: I0320 08:46:16.462097 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-5dlw5" Mar 20 08:46:16 crc kubenswrapper[4903]: I0320 08:46:16.489401 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/4819c8dc-535a-4fb2-93ed-16eccdf8cd6c-openstack-config-secret\") pod \"openstackclient\" (UID: \"4819c8dc-535a-4fb2-93ed-16eccdf8cd6c\") " pod="openstack/openstackclient" Mar 20 08:46:16 crc kubenswrapper[4903]: I0320 08:46:16.489499 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xc7vw\" (UniqueName: \"kubernetes.io/projected/4819c8dc-535a-4fb2-93ed-16eccdf8cd6c-kube-api-access-xc7vw\") pod \"openstackclient\" (UID: \"4819c8dc-535a-4fb2-93ed-16eccdf8cd6c\") " pod="openstack/openstackclient" Mar 20 08:46:16 crc 
kubenswrapper[4903]: I0320 08:46:16.490205 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4819c8dc-535a-4fb2-93ed-16eccdf8cd6c-combined-ca-bundle\") pod \"openstackclient\" (UID: \"4819c8dc-535a-4fb2-93ed-16eccdf8cd6c\") " pod="openstack/openstackclient" Mar 20 08:46:16 crc kubenswrapper[4903]: I0320 08:46:16.491310 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/4819c8dc-535a-4fb2-93ed-16eccdf8cd6c-openstack-config\") pod \"openstackclient\" (UID: \"4819c8dc-535a-4fb2-93ed-16eccdf8cd6c\") " pod="openstack/openstackclient" Mar 20 08:46:16 crc kubenswrapper[4903]: I0320 08:46:16.495835 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-v6rfn" Mar 20 08:46:16 crc kubenswrapper[4903]: I0320 08:46:16.525836 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=4.525812188 podStartE2EDuration="4.525812188s" podCreationTimestamp="2026-03-20 08:46:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:46:16.507692677 +0000 UTC m=+1401.724593012" watchObservedRunningTime="2026-03-20 08:46:16.525812188 +0000 UTC m=+1401.742712503" Mar 20 08:46:16 crc kubenswrapper[4903]: I0320 08:46:16.561751 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-v6rfn" Mar 20 08:46:16 crc kubenswrapper[4903]: I0320 08:46:16.593628 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4819c8dc-535a-4fb2-93ed-16eccdf8cd6c-combined-ca-bundle\") pod \"openstackclient\" (UID: \"4819c8dc-535a-4fb2-93ed-16eccdf8cd6c\") " pod="openstack/openstackclient" Mar 20 08:46:16 crc kubenswrapper[4903]: I0320 08:46:16.593831 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/4819c8dc-535a-4fb2-93ed-16eccdf8cd6c-openstack-config\") pod \"openstackclient\" (UID: \"4819c8dc-535a-4fb2-93ed-16eccdf8cd6c\") " pod="openstack/openstackclient" Mar 20 08:46:16 crc kubenswrapper[4903]: I0320 08:46:16.593864 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/4819c8dc-535a-4fb2-93ed-16eccdf8cd6c-openstack-config-secret\") pod \"openstackclient\" (UID: \"4819c8dc-535a-4fb2-93ed-16eccdf8cd6c\") " pod="openstack/openstackclient" Mar 20 08:46:16 crc kubenswrapper[4903]: I0320 08:46:16.593882 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xc7vw\" (UniqueName: \"kubernetes.io/projected/4819c8dc-535a-4fb2-93ed-16eccdf8cd6c-kube-api-access-xc7vw\") pod \"openstackclient\" (UID: \"4819c8dc-535a-4fb2-93ed-16eccdf8cd6c\") " pod="openstack/openstackclient" Mar 20 08:46:16 crc kubenswrapper[4903]: I0320 08:46:16.596567 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/4819c8dc-535a-4fb2-93ed-16eccdf8cd6c-openstack-config\") pod \"openstackclient\" (UID: \"4819c8dc-535a-4fb2-93ed-16eccdf8cd6c\") " pod="openstack/openstackclient" Mar 20 
08:46:16 crc kubenswrapper[4903]: I0320 08:46:16.602339 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/4819c8dc-535a-4fb2-93ed-16eccdf8cd6c-openstack-config-secret\") pod \"openstackclient\" (UID: \"4819c8dc-535a-4fb2-93ed-16eccdf8cd6c\") " pod="openstack/openstackclient" Mar 20 08:46:16 crc kubenswrapper[4903]: I0320 08:46:16.606862 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4819c8dc-535a-4fb2-93ed-16eccdf8cd6c-combined-ca-bundle\") pod \"openstackclient\" (UID: \"4819c8dc-535a-4fb2-93ed-16eccdf8cd6c\") " pod="openstack/openstackclient" Mar 20 08:46:16 crc kubenswrapper[4903]: I0320 08:46:16.615129 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xc7vw\" (UniqueName: \"kubernetes.io/projected/4819c8dc-535a-4fb2-93ed-16eccdf8cd6c-kube-api-access-xc7vw\") pod \"openstackclient\" (UID: \"4819c8dc-535a-4fb2-93ed-16eccdf8cd6c\") " pod="openstack/openstackclient" Mar 20 08:46:16 crc kubenswrapper[4903]: I0320 08:46:16.768813 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-v6rfn"] Mar 20 08:46:16 crc kubenswrapper[4903]: I0320 08:46:16.784736 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Mar 20 08:46:17 crc kubenswrapper[4903]: I0320 08:46:17.244057 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 20 08:46:17 crc kubenswrapper[4903]: W0320 08:46:17.254991 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4819c8dc_535a_4fb2_93ed_16eccdf8cd6c.slice/crio-91658b353000f5e497bf8fb0a0656b97511356935a53e3fa9abacff9dd41b4d1 WatchSource:0}: Error finding container 91658b353000f5e497bf8fb0a0656b97511356935a53e3fa9abacff9dd41b4d1: Status 404 returned error can't find the container with id 91658b353000f5e497bf8fb0a0656b97511356935a53e3fa9abacff9dd41b4d1 Mar 20 08:46:17 crc kubenswrapper[4903]: I0320 08:46:17.466117 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"4819c8dc-535a-4fb2-93ed-16eccdf8cd6c","Type":"ContainerStarted","Data":"91658b353000f5e497bf8fb0a0656b97511356935a53e3fa9abacff9dd41b4d1"} Mar 20 08:46:18 crc kubenswrapper[4903]: I0320 08:46:18.105683 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Mar 20 08:46:18 crc kubenswrapper[4903]: I0320 08:46:18.474011 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-v6rfn" podUID="d46db123-b34a-40fe-b42d-846f9788745b" containerName="registry-server" containerID="cri-o://ae46b4580a36885cdcf1dc77ba729ae54e06d4d09e730e14c8aa313785aeb4d1" gracePeriod=2 Mar 20 08:46:19 crc kubenswrapper[4903]: I0320 08:46:19.030177 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-v6rfn" Mar 20 08:46:19 crc kubenswrapper[4903]: I0320 08:46:19.146681 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d46db123-b34a-40fe-b42d-846f9788745b-utilities\") pod \"d46db123-b34a-40fe-b42d-846f9788745b\" (UID: \"d46db123-b34a-40fe-b42d-846f9788745b\") " Mar 20 08:46:19 crc kubenswrapper[4903]: I0320 08:46:19.146743 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d46db123-b34a-40fe-b42d-846f9788745b-catalog-content\") pod \"d46db123-b34a-40fe-b42d-846f9788745b\" (UID: \"d46db123-b34a-40fe-b42d-846f9788745b\") " Mar 20 08:46:19 crc kubenswrapper[4903]: I0320 08:46:19.146785 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zmhht\" (UniqueName: \"kubernetes.io/projected/d46db123-b34a-40fe-b42d-846f9788745b-kube-api-access-zmhht\") pod \"d46db123-b34a-40fe-b42d-846f9788745b\" (UID: \"d46db123-b34a-40fe-b42d-846f9788745b\") " Mar 20 08:46:19 crc kubenswrapper[4903]: I0320 08:46:19.147573 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d46db123-b34a-40fe-b42d-846f9788745b-utilities" (OuterVolumeSpecName: "utilities") pod "d46db123-b34a-40fe-b42d-846f9788745b" (UID: "d46db123-b34a-40fe-b42d-846f9788745b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 08:46:19 crc kubenswrapper[4903]: I0320 08:46:19.159194 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d46db123-b34a-40fe-b42d-846f9788745b-kube-api-access-zmhht" (OuterVolumeSpecName: "kube-api-access-zmhht") pod "d46db123-b34a-40fe-b42d-846f9788745b" (UID: "d46db123-b34a-40fe-b42d-846f9788745b"). InnerVolumeSpecName "kube-api-access-zmhht". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:46:19 crc kubenswrapper[4903]: I0320 08:46:19.202383 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d46db123-b34a-40fe-b42d-846f9788745b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d46db123-b34a-40fe-b42d-846f9788745b" (UID: "d46db123-b34a-40fe-b42d-846f9788745b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 08:46:19 crc kubenswrapper[4903]: I0320 08:46:19.249142 4903 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d46db123-b34a-40fe-b42d-846f9788745b-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 08:46:19 crc kubenswrapper[4903]: I0320 08:46:19.249179 4903 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d46db123-b34a-40fe-b42d-846f9788745b-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 08:46:19 crc kubenswrapper[4903]: I0320 08:46:19.249190 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zmhht\" (UniqueName: \"kubernetes.io/projected/d46db123-b34a-40fe-b42d-846f9788745b-kube-api-access-zmhht\") on node \"crc\" DevicePath \"\"" Mar 20 08:46:19 crc kubenswrapper[4903]: I0320 08:46:19.526274 4903 generic.go:334] "Generic (PLEG): container finished" podID="d46db123-b34a-40fe-b42d-846f9788745b" containerID="ae46b4580a36885cdcf1dc77ba729ae54e06d4d09e730e14c8aa313785aeb4d1" exitCode=0 Mar 20 08:46:19 crc kubenswrapper[4903]: I0320 08:46:19.526339 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v6rfn" event={"ID":"d46db123-b34a-40fe-b42d-846f9788745b","Type":"ContainerDied","Data":"ae46b4580a36885cdcf1dc77ba729ae54e06d4d09e730e14c8aa313785aeb4d1"} Mar 20 08:46:19 crc kubenswrapper[4903]: I0320 08:46:19.526375 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v6rfn" event={"ID":"d46db123-b34a-40fe-b42d-846f9788745b","Type":"ContainerDied","Data":"1ee24c6bf81756a537defbc2f18475f633ba90bcbf60b8f5bc349a20cb2df27f"} Mar 20 08:46:19 crc kubenswrapper[4903]: I0320 08:46:19.526398 4903 scope.go:117] "RemoveContainer" containerID="ae46b4580a36885cdcf1dc77ba729ae54e06d4d09e730e14c8aa313785aeb4d1" Mar 20 08:46:19 crc kubenswrapper[4903]: I0320 08:46:19.526434 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-v6rfn" Mar 20 08:46:19 crc kubenswrapper[4903]: I0320 08:46:19.553697 4903 scope.go:117] "RemoveContainer" containerID="bb3312fa476cc84d99e27b5551633a1e7331928c98574cfc7407ff05ac2436ae" Mar 20 08:46:19 crc kubenswrapper[4903]: I0320 08:46:19.576249 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-v6rfn"] Mar 20 08:46:19 crc kubenswrapper[4903]: I0320 08:46:19.592850 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-v6rfn"] Mar 20 08:46:19 crc kubenswrapper[4903]: I0320 08:46:19.606311 4903 scope.go:117] "RemoveContainer" containerID="75395f893b48be59eb3ba0cd9eb0f5972519ef936a08c8edeb0b04a0d9bbbce9" Mar 20 08:46:19 crc kubenswrapper[4903]: I0320 08:46:19.658216 4903 scope.go:117] "RemoveContainer" containerID="ae46b4580a36885cdcf1dc77ba729ae54e06d4d09e730e14c8aa313785aeb4d1" Mar 20 08:46:19 crc kubenswrapper[4903]: E0320 08:46:19.659205 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ae46b4580a36885cdcf1dc77ba729ae54e06d4d09e730e14c8aa313785aeb4d1\": container with ID starting with ae46b4580a36885cdcf1dc77ba729ae54e06d4d09e730e14c8aa313785aeb4d1 not found: ID does not exist" containerID="ae46b4580a36885cdcf1dc77ba729ae54e06d4d09e730e14c8aa313785aeb4d1" Mar 20 08:46:19 crc kubenswrapper[4903]: I0320 08:46:19.659366 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae46b4580a36885cdcf1dc77ba729ae54e06d4d09e730e14c8aa313785aeb4d1"} err="failed to get container status \"ae46b4580a36885cdcf1dc77ba729ae54e06d4d09e730e14c8aa313785aeb4d1\": rpc error: code = NotFound desc = could not find container \"ae46b4580a36885cdcf1dc77ba729ae54e06d4d09e730e14c8aa313785aeb4d1\": container with ID starting with ae46b4580a36885cdcf1dc77ba729ae54e06d4d09e730e14c8aa313785aeb4d1 not found: ID does not exist" Mar 20 08:46:19 crc kubenswrapper[4903]: I0320 08:46:19.659517 4903 scope.go:117] "RemoveContainer" containerID="bb3312fa476cc84d99e27b5551633a1e7331928c98574cfc7407ff05ac2436ae" Mar 20 08:46:19 crc kubenswrapper[4903]: E0320 08:46:19.660093 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb3312fa476cc84d99e27b5551633a1e7331928c98574cfc7407ff05ac2436ae\": container with ID starting with bb3312fa476cc84d99e27b5551633a1e7331928c98574cfc7407ff05ac2436ae not found: ID does not exist" containerID="bb3312fa476cc84d99e27b5551633a1e7331928c98574cfc7407ff05ac2436ae" Mar 20 08:46:19 crc kubenswrapper[4903]: I0320 08:46:19.660204 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb3312fa476cc84d99e27b5551633a1e7331928c98574cfc7407ff05ac2436ae"} err="failed to get container status \"bb3312fa476cc84d99e27b5551633a1e7331928c98574cfc7407ff05ac2436ae\": rpc error: code = NotFound desc = could not find container \"bb3312fa476cc84d99e27b5551633a1e7331928c98574cfc7407ff05ac2436ae\": container with ID starting with bb3312fa476cc84d99e27b5551633a1e7331928c98574cfc7407ff05ac2436ae not found: ID does not exist" Mar 20 08:46:19 crc kubenswrapper[4903]: I0320 08:46:19.660303 4903 scope.go:117] "RemoveContainer" containerID="75395f893b48be59eb3ba0cd9eb0f5972519ef936a08c8edeb0b04a0d9bbbce9" Mar 20 08:46:19 crc kubenswrapper[4903]: E0320 08:46:19.660668 4903 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"75395f893b48be59eb3ba0cd9eb0f5972519ef936a08c8edeb0b04a0d9bbbce9\": container with ID starting with 75395f893b48be59eb3ba0cd9eb0f5972519ef936a08c8edeb0b04a0d9bbbce9 not found: ID does not exist" containerID="75395f893b48be59eb3ba0cd9eb0f5972519ef936a08c8edeb0b04a0d9bbbce9" Mar 20 08:46:19 crc kubenswrapper[4903]: I0320 08:46:19.660716 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75395f893b48be59eb3ba0cd9eb0f5972519ef936a08c8edeb0b04a0d9bbbce9"} err="failed to get container status \"75395f893b48be59eb3ba0cd9eb0f5972519ef936a08c8edeb0b04a0d9bbbce9\": rpc error: code = NotFound desc = could not find container \"75395f893b48be59eb3ba0cd9eb0f5972519ef936a08c8edeb0b04a0d9bbbce9\": container with ID starting with 75395f893b48be59eb3ba0cd9eb0f5972519ef936a08c8edeb0b04a0d9bbbce9 not found: ID does not exist" Mar 20 08:46:20 crc kubenswrapper[4903]: I0320 08:46:20.316791 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-69dc7db475-m968g"] Mar 20 08:46:20 crc kubenswrapper[4903]: E0320 08:46:20.317721 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d46db123-b34a-40fe-b42d-846f9788745b" containerName="extract-utilities" Mar 20 08:46:20 crc kubenswrapper[4903]: I0320 08:46:20.317809 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="d46db123-b34a-40fe-b42d-846f9788745b" containerName="extract-utilities" Mar 20 08:46:20 crc kubenswrapper[4903]: E0320 08:46:20.317916 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d46db123-b34a-40fe-b42d-846f9788745b" containerName="extract-content" Mar 20 08:46:20 crc kubenswrapper[4903]: I0320 08:46:20.318137 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="d46db123-b34a-40fe-b42d-846f9788745b" containerName="extract-content" Mar 20 08:46:20 crc kubenswrapper[4903]: E0320 08:46:20.318247 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d46db123-b34a-40fe-b42d-846f9788745b" containerName="registry-server" Mar 20 08:46:20 crc kubenswrapper[4903]: I0320 08:46:20.318321 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="d46db123-b34a-40fe-b42d-846f9788745b" containerName="registry-server" Mar 20 08:46:20 crc kubenswrapper[4903]: I0320 08:46:20.318683 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="d46db123-b34a-40fe-b42d-846f9788745b" containerName="registry-server" Mar 20 08:46:20 crc kubenswrapper[4903]: I0320 08:46:20.319953 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-69dc7db475-m968g" Mar 20 08:46:20 crc kubenswrapper[4903]: I0320 08:46:20.324351 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Mar 20 08:46:20 crc kubenswrapper[4903]: I0320 08:46:20.324727 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Mar 20 08:46:20 crc kubenswrapper[4903]: I0320 08:46:20.325091 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Mar 20 08:46:20 crc kubenswrapper[4903]: I0320 08:46:20.344098 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-69dc7db475-m968g"] Mar 20 08:46:20 crc kubenswrapper[4903]: I0320 08:46:20.379266 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e960802-5c0e-4800-853f-e23466958aec-config-data\") pod \"swift-proxy-69dc7db475-m968g\" (UID: \"5e960802-5c0e-4800-853f-e23466958aec\") " pod="openstack/swift-proxy-69dc7db475-m968g" Mar 20 08:46:20 crc kubenswrapper[4903]: I0320 08:46:20.379308 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5e960802-5c0e-4800-853f-e23466958aec-log-httpd\") pod \"swift-proxy-69dc7db475-m968g\" (UID: \"5e960802-5c0e-4800-853f-e23466958aec\") " pod="openstack/swift-proxy-69dc7db475-m968g" Mar 20 08:46:20 crc kubenswrapper[4903]: I0320 08:46:20.379416 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5e960802-5c0e-4800-853f-e23466958aec-internal-tls-certs\") pod \"swift-proxy-69dc7db475-m968g\" (UID: \"5e960802-5c0e-4800-853f-e23466958aec\") " pod="openstack/swift-proxy-69dc7db475-m968g" Mar 20 08:46:20 crc kubenswrapper[4903]: I0320 08:46:20.379441 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9p7vz\" (UniqueName: \"kubernetes.io/projected/5e960802-5c0e-4800-853f-e23466958aec-kube-api-access-9p7vz\") pod \"swift-proxy-69dc7db475-m968g\" (UID: \"5e960802-5c0e-4800-853f-e23466958aec\") " pod="openstack/swift-proxy-69dc7db475-m968g" Mar 20 08:46:20 crc kubenswrapper[4903]: I0320 08:46:20.379504 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e960802-5c0e-4800-853f-e23466958aec-combined-ca-bundle\") pod \"swift-proxy-69dc7db475-m968g\" (UID: \"5e960802-5c0e-4800-853f-e23466958aec\") " pod="openstack/swift-proxy-69dc7db475-m968g" Mar 20 08:46:20 crc kubenswrapper[4903]: I0320 08:46:20.379574 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/5e960802-5c0e-4800-853f-e23466958aec-etc-swift\") pod \"swift-proxy-69dc7db475-m968g\" (UID: \"5e960802-5c0e-4800-853f-e23466958aec\") " pod="openstack/swift-proxy-69dc7db475-m968g" Mar 20 08:46:20 crc kubenswrapper[4903]: I0320 08:46:20.379608 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5e960802-5c0e-4800-853f-e23466958aec-public-tls-certs\") pod \"swift-proxy-69dc7db475-m968g\" (UID: \"5e960802-5c0e-4800-853f-e23466958aec\") " 
pod="openstack/swift-proxy-69dc7db475-m968g" Mar 20 08:46:20 crc kubenswrapper[4903]: I0320 08:46:20.379628 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5e960802-5c0e-4800-853f-e23466958aec-run-httpd\") pod \"swift-proxy-69dc7db475-m968g\" (UID: \"5e960802-5c0e-4800-853f-e23466958aec\") " pod="openstack/swift-proxy-69dc7db475-m968g" Mar 20 08:46:20 crc kubenswrapper[4903]: I0320 08:46:20.481139 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5e960802-5c0e-4800-853f-e23466958aec-internal-tls-certs\") pod \"swift-proxy-69dc7db475-m968g\" (UID: \"5e960802-5c0e-4800-853f-e23466958aec\") " pod="openstack/swift-proxy-69dc7db475-m968g" Mar 20 08:46:20 crc kubenswrapper[4903]: I0320 08:46:20.481381 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9p7vz\" (UniqueName: \"kubernetes.io/projected/5e960802-5c0e-4800-853f-e23466958aec-kube-api-access-9p7vz\") pod \"swift-proxy-69dc7db475-m968g\" (UID: \"5e960802-5c0e-4800-853f-e23466958aec\") " pod="openstack/swift-proxy-69dc7db475-m968g" Mar 20 08:46:20 crc kubenswrapper[4903]: I0320 08:46:20.481516 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e960802-5c0e-4800-853f-e23466958aec-combined-ca-bundle\") pod \"swift-proxy-69dc7db475-m968g\" (UID: \"5e960802-5c0e-4800-853f-e23466958aec\") " pod="openstack/swift-proxy-69dc7db475-m968g" Mar 20 08:46:20 crc kubenswrapper[4903]: I0320 08:46:20.481627 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/5e960802-5c0e-4800-853f-e23466958aec-etc-swift\") pod \"swift-proxy-69dc7db475-m968g\" (UID: \"5e960802-5c0e-4800-853f-e23466958aec\") " pod="openstack/swift-proxy-69dc7db475-m968g" Mar 20 08:46:20 crc kubenswrapper[4903]: I0320 08:46:20.481729 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5e960802-5c0e-4800-853f-e23466958aec-public-tls-certs\") pod \"swift-proxy-69dc7db475-m968g\" (UID: \"5e960802-5c0e-4800-853f-e23466958aec\") " pod="openstack/swift-proxy-69dc7db475-m968g" Mar 20 08:46:20 crc kubenswrapper[4903]: I0320 08:46:20.481820 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5e960802-5c0e-4800-853f-e23466958aec-run-httpd\") pod \"swift-proxy-69dc7db475-m968g\" (UID: \"5e960802-5c0e-4800-853f-e23466958aec\") " pod="openstack/swift-proxy-69dc7db475-m968g" Mar 20 08:46:20 crc kubenswrapper[4903]: I0320 08:46:20.481926 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e960802-5c0e-4800-853f-e23466958aec-config-data\") pod \"swift-proxy-69dc7db475-m968g\" (UID: \"5e960802-5c0e-4800-853f-e23466958aec\") " pod="openstack/swift-proxy-69dc7db475-m968g" Mar 20 08:46:20 crc kubenswrapper[4903]: I0320 08:46:20.482006 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5e960802-5c0e-4800-853f-e23466958aec-log-httpd\") pod \"swift-proxy-69dc7db475-m968g\" (UID: \"5e960802-5c0e-4800-853f-e23466958aec\") " pod="openstack/swift-proxy-69dc7db475-m968g" Mar 20 
08:46:20 crc kubenswrapper[4903]: I0320 08:46:20.482592 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5e960802-5c0e-4800-853f-e23466958aec-log-httpd\") pod \"swift-proxy-69dc7db475-m968g\" (UID: \"5e960802-5c0e-4800-853f-e23466958aec\") " pod="openstack/swift-proxy-69dc7db475-m968g" Mar 20 08:46:20 crc kubenswrapper[4903]: I0320 08:46:20.485661 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5e960802-5c0e-4800-853f-e23466958aec-run-httpd\") pod \"swift-proxy-69dc7db475-m968g\" (UID: \"5e960802-5c0e-4800-853f-e23466958aec\") " pod="openstack/swift-proxy-69dc7db475-m968g" Mar 20 08:46:20 crc kubenswrapper[4903]: I0320 08:46:20.487704 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5e960802-5c0e-4800-853f-e23466958aec-public-tls-certs\") pod \"swift-proxy-69dc7db475-m968g\" (UID: \"5e960802-5c0e-4800-853f-e23466958aec\") " pod="openstack/swift-proxy-69dc7db475-m968g" Mar 20 08:46:20 crc kubenswrapper[4903]: I0320 08:46:20.487927 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5e960802-5c0e-4800-853f-e23466958aec-internal-tls-certs\") pod \"swift-proxy-69dc7db475-m968g\" (UID: \"5e960802-5c0e-4800-853f-e23466958aec\") " pod="openstack/swift-proxy-69dc7db475-m968g" Mar 20 08:46:20 crc kubenswrapper[4903]: I0320 08:46:20.488650 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e960802-5c0e-4800-853f-e23466958aec-combined-ca-bundle\") pod \"swift-proxy-69dc7db475-m968g\" (UID: \"5e960802-5c0e-4800-853f-e23466958aec\") " pod="openstack/swift-proxy-69dc7db475-m968g" Mar 20 08:46:20 crc kubenswrapper[4903]: I0320 08:46:20.489982 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e960802-5c0e-4800-853f-e23466958aec-config-data\") pod \"swift-proxy-69dc7db475-m968g\" (UID: \"5e960802-5c0e-4800-853f-e23466958aec\") " pod="openstack/swift-proxy-69dc7db475-m968g" Mar 20 08:46:20 crc kubenswrapper[4903]: I0320 08:46:20.491248 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/5e960802-5c0e-4800-853f-e23466958aec-etc-swift\") pod \"swift-proxy-69dc7db475-m968g\" (UID: \"5e960802-5c0e-4800-853f-e23466958aec\") " pod="openstack/swift-proxy-69dc7db475-m968g" Mar 20 08:46:20 crc kubenswrapper[4903]: I0320 08:46:20.513898 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9p7vz\" (UniqueName: \"kubernetes.io/projected/5e960802-5c0e-4800-853f-e23466958aec-kube-api-access-9p7vz\") pod \"swift-proxy-69dc7db475-m968g\" (UID: \"5e960802-5c0e-4800-853f-e23466958aec\") " pod="openstack/swift-proxy-69dc7db475-m968g" Mar 20 08:46:20 crc kubenswrapper[4903]: I0320 08:46:20.642255 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-69dc7db475-m968g" Mar 20 08:46:21 crc kubenswrapper[4903]: W0320 08:46:21.499228 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5e960802_5c0e_4800_853f_e23466958aec.slice/crio-2a70810d257fa13ad9dfd79a6bcf06a2ef6a3ede083f1f92c82d401d3e420d31 WatchSource:0}: Error finding container 2a70810d257fa13ad9dfd79a6bcf06a2ef6a3ede083f1f92c82d401d3e420d31: Status 404 returned error can't find the container with id 2a70810d257fa13ad9dfd79a6bcf06a2ef6a3ede083f1f92c82d401d3e420d31 Mar 20 08:46:21 crc kubenswrapper[4903]: I0320 08:46:21.506074 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d46db123-b34a-40fe-b42d-846f9788745b" path="/var/lib/kubelet/pods/d46db123-b34a-40fe-b42d-846f9788745b/volumes" Mar 20 08:46:21 crc kubenswrapper[4903]: I0320 08:46:21.506728 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-69dc7db475-m968g"] Mar 20 08:46:21 crc kubenswrapper[4903]: I0320 08:46:21.550014 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-69dc7db475-m968g" event={"ID":"5e960802-5c0e-4800-853f-e23466958aec","Type":"ContainerStarted","Data":"2a70810d257fa13ad9dfd79a6bcf06a2ef6a3ede083f1f92c82d401d3e420d31"} Mar 20 08:46:22 crc kubenswrapper[4903]: I0320 08:46:22.569366 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-69dc7db475-m968g" event={"ID":"5e960802-5c0e-4800-853f-e23466958aec","Type":"ContainerStarted","Data":"2ef5305e3831d627a2b55f9057395c70bf2be1d808f10700fa8c205c332a281a"} Mar 20 08:46:22 crc kubenswrapper[4903]: I0320 08:46:22.571436 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-69dc7db475-m968g" Mar 20 08:46:22 crc kubenswrapper[4903]: I0320 08:46:22.571454 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-69dc7db475-m968g" event={"ID":"5e960802-5c0e-4800-853f-e23466958aec","Type":"ContainerStarted","Data":"55fee6c504c86ad76cd2158cdbd3b9a64409472514793f469bb47804fa917188"} Mar 20 08:46:22 crc kubenswrapper[4903]: I0320 08:46:22.601631 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-69dc7db475-m968g" podStartSLOduration=2.6016076740000003 podStartE2EDuration="2.601607674s" podCreationTimestamp="2026-03-20 08:46:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:46:22.589704537 +0000 UTC m=+1407.806604852" watchObservedRunningTime="2026-03-20 08:46:22.601607674 +0000 UTC m=+1407.818507989" Mar 20 08:46:23 crc kubenswrapper[4903]: I0320 08:46:23.530186 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Mar 20 08:46:23 crc kubenswrapper[4903]: I0320 08:46:23.583250 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-69dc7db475-m968g" Mar 20 08:46:23 crc kubenswrapper[4903]: I0320 08:46:23.993757 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 08:46:23 crc kubenswrapper[4903]: I0320 08:46:23.994225 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b603178f-e90b-4e08-ad82-0d15ddc32844" containerName="ceilometer-central-agent" 
containerID="cri-o://dc3f13562bfffe676ba3cb75ad7d3e2e26c1a2b70e23596ae455c73f5af5d8bf" gracePeriod=30 Mar 20 08:46:23 crc kubenswrapper[4903]: I0320 08:46:23.994358 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b603178f-e90b-4e08-ad82-0d15ddc32844" containerName="ceilometer-notification-agent" containerID="cri-o://b14c8d93b7883541bc880c20a4c3da5ee1c79e090d432fcaa5c61c14083dee3d" gracePeriod=30 Mar 20 08:46:23 crc kubenswrapper[4903]: I0320 08:46:23.994361 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b603178f-e90b-4e08-ad82-0d15ddc32844" containerName="proxy-httpd" containerID="cri-o://3de4e0ca36bed2a2e1297655dc8fccc869c47de254efd321c197a432fa840a10" gracePeriod=30 Mar 20 08:46:23 crc kubenswrapper[4903]: I0320 08:46:23.994616 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b603178f-e90b-4e08-ad82-0d15ddc32844" containerName="sg-core" containerID="cri-o://23de93f09c8f56d1d5ad7e078619ee8d784563bdc9c71699161a829e7408c456" gracePeriod=30 Mar 20 08:46:23 crc kubenswrapper[4903]: I0320 08:46:23.999405 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Mar 20 08:46:24 crc kubenswrapper[4903]: I0320 08:46:24.602078 4903 generic.go:334] "Generic (PLEG): container finished" podID="b603178f-e90b-4e08-ad82-0d15ddc32844" containerID="3de4e0ca36bed2a2e1297655dc8fccc869c47de254efd321c197a432fa840a10" exitCode=0 Mar 20 08:46:24 crc kubenswrapper[4903]: I0320 08:46:24.602116 4903 generic.go:334] "Generic (PLEG): container finished" podID="b603178f-e90b-4e08-ad82-0d15ddc32844" containerID="23de93f09c8f56d1d5ad7e078619ee8d784563bdc9c71699161a829e7408c456" exitCode=2 Mar 20 08:46:24 crc kubenswrapper[4903]: I0320 08:46:24.602125 4903 generic.go:334] "Generic (PLEG): container finished" podID="b603178f-e90b-4e08-ad82-0d15ddc32844" containerID="b14c8d93b7883541bc880c20a4c3da5ee1c79e090d432fcaa5c61c14083dee3d" exitCode=0 Mar 20 08:46:24 crc kubenswrapper[4903]: I0320 08:46:24.602134 4903 generic.go:334] "Generic (PLEG): container finished" podID="b603178f-e90b-4e08-ad82-0d15ddc32844" containerID="dc3f13562bfffe676ba3cb75ad7d3e2e26c1a2b70e23596ae455c73f5af5d8bf" exitCode=0 Mar 20 08:46:24 crc kubenswrapper[4903]: I0320 08:46:24.602161 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b603178f-e90b-4e08-ad82-0d15ddc32844","Type":"ContainerDied","Data":"3de4e0ca36bed2a2e1297655dc8fccc869c47de254efd321c197a432fa840a10"} Mar 20 08:46:24 crc kubenswrapper[4903]: I0320 08:46:24.602213 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b603178f-e90b-4e08-ad82-0d15ddc32844","Type":"ContainerDied","Data":"23de93f09c8f56d1d5ad7e078619ee8d784563bdc9c71699161a829e7408c456"} Mar 20 08:46:24 crc kubenswrapper[4903]: I0320 08:46:24.602226 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b603178f-e90b-4e08-ad82-0d15ddc32844","Type":"ContainerDied","Data":"b14c8d93b7883541bc880c20a4c3da5ee1c79e090d432fcaa5c61c14083dee3d"} Mar 20 08:46:24 crc kubenswrapper[4903]: I0320 08:46:24.602236 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b603178f-e90b-4e08-ad82-0d15ddc32844","Type":"ContainerDied","Data":"dc3f13562bfffe676ba3cb75ad7d3e2e26c1a2b70e23596ae455c73f5af5d8bf"} Mar 20 08:46:25 crc 
kubenswrapper[4903]: I0320 08:46:25.549245 4903 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","podb26d1664-feac-422f-a058-7e4f798a7b45"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort podb26d1664-feac-422f-a058-7e4f798a7b45] : Timed out while waiting for systemd to remove kubepods-besteffort-podb26d1664_feac_422f_a058_7e4f798a7b45.slice" Mar 20 08:46:25 crc kubenswrapper[4903]: E0320 08:46:25.549598 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to delete cgroup paths for [kubepods besteffort podb26d1664-feac-422f-a058-7e4f798a7b45] : unable to destroy cgroup paths for cgroup [kubepods besteffort podb26d1664-feac-422f-a058-7e4f798a7b45] : Timed out while waiting for systemd to remove kubepods-besteffort-podb26d1664_feac_422f_a058_7e4f798a7b45.slice" pod="openstack/barbican-worker-85dc675685-rvw97" podUID="b26d1664-feac-422f-a058-7e4f798a7b45" Mar 20 08:46:25 crc kubenswrapper[4903]: I0320 08:46:25.609700 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-85dc675685-rvw97" Mar 20 08:46:25 crc kubenswrapper[4903]: I0320 08:46:25.658095 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-85dc675685-rvw97"] Mar 20 08:46:25 crc kubenswrapper[4903]: I0320 08:46:25.662693 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-worker-85dc675685-rvw97"] Mar 20 08:46:27 crc kubenswrapper[4903]: I0320 08:46:27.511099 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b26d1664-feac-422f-a058-7e4f798a7b45" path="/var/lib/kubelet/pods/b26d1664-feac-422f-a058-7e4f798a7b45/volumes" Mar 20 08:46:28 crc kubenswrapper[4903]: I0320 08:46:28.583666 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 08:46:28 crc kubenswrapper[4903]: I0320 08:46:28.660922 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"4819c8dc-535a-4fb2-93ed-16eccdf8cd6c","Type":"ContainerStarted","Data":"476fc647c47bf21b8cc8bf445e61751a911dafeb475bbe3668beb1384e865c9f"} Mar 20 08:46:28 crc kubenswrapper[4903]: I0320 08:46:28.665177 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b603178f-e90b-4e08-ad82-0d15ddc32844","Type":"ContainerDied","Data":"b61defaaf30760bc9cc66413aab61f92d86cba994535ce412604ccc05c9c6424"} Mar 20 08:46:28 crc kubenswrapper[4903]: I0320 08:46:28.665224 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 08:46:28 crc kubenswrapper[4903]: I0320 08:46:28.665271 4903 scope.go:117] "RemoveContainer" containerID="3de4e0ca36bed2a2e1297655dc8fccc869c47de254efd321c197a432fa840a10" Mar 20 08:46:28 crc kubenswrapper[4903]: I0320 08:46:28.673350 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g5nm7\" (UniqueName: \"kubernetes.io/projected/b603178f-e90b-4e08-ad82-0d15ddc32844-kube-api-access-g5nm7\") pod \"b603178f-e90b-4e08-ad82-0d15ddc32844\" (UID: \"b603178f-e90b-4e08-ad82-0d15ddc32844\") " Mar 20 08:46:28 crc kubenswrapper[4903]: I0320 08:46:28.673464 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b603178f-e90b-4e08-ad82-0d15ddc32844-combined-ca-bundle\") pod \"b603178f-e90b-4e08-ad82-0d15ddc32844\" (UID: \"b603178f-e90b-4e08-ad82-0d15ddc32844\") " Mar 20 08:46:28 crc kubenswrapper[4903]: I0320 08:46:28.673599 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b603178f-e90b-4e08-ad82-0d15ddc32844-run-httpd\") pod \"b603178f-e90b-4e08-ad82-0d15ddc32844\" (UID: \"b603178f-e90b-4e08-ad82-0d15ddc32844\") " Mar 20 08:46:28 crc kubenswrapper[4903]: I0320 08:46:28.673675 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b603178f-e90b-4e08-ad82-0d15ddc32844-log-httpd\") pod \"b603178f-e90b-4e08-ad82-0d15ddc32844\" (UID: \"b603178f-e90b-4e08-ad82-0d15ddc32844\") " Mar 20 08:46:28 crc kubenswrapper[4903]: I0320 08:46:28.673724 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b603178f-e90b-4e08-ad82-0d15ddc32844-sg-core-conf-yaml\") pod \"b603178f-e90b-4e08-ad82-0d15ddc32844\" (UID: \"b603178f-e90b-4e08-ad82-0d15ddc32844\") " Mar 20 08:46:28 crc kubenswrapper[4903]: I0320 08:46:28.673931 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b603178f-e90b-4e08-ad82-0d15ddc32844-scripts\") pod \"b603178f-e90b-4e08-ad82-0d15ddc32844\" (UID: \"b603178f-e90b-4e08-ad82-0d15ddc32844\") " Mar 20 08:46:28 crc kubenswrapper[4903]: I0320 08:46:28.673969 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b603178f-e90b-4e08-ad82-0d15ddc32844-config-data\") pod \"b603178f-e90b-4e08-ad82-0d15ddc32844\" (UID: \"b603178f-e90b-4e08-ad82-0d15ddc32844\") " Mar 20 08:46:28 crc kubenswrapper[4903]: I0320 08:46:28.676225 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b603178f-e90b-4e08-ad82-0d15ddc32844-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "b603178f-e90b-4e08-ad82-0d15ddc32844" (UID: "b603178f-e90b-4e08-ad82-0d15ddc32844"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 08:46:28 crc kubenswrapper[4903]: I0320 08:46:28.676550 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b603178f-e90b-4e08-ad82-0d15ddc32844-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "b603178f-e90b-4e08-ad82-0d15ddc32844" (UID: "b603178f-e90b-4e08-ad82-0d15ddc32844"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 08:46:28 crc kubenswrapper[4903]: I0320 08:46:28.680443 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b603178f-e90b-4e08-ad82-0d15ddc32844-scripts" (OuterVolumeSpecName: "scripts") pod "b603178f-e90b-4e08-ad82-0d15ddc32844" (UID: "b603178f-e90b-4e08-ad82-0d15ddc32844"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:46:28 crc kubenswrapper[4903]: I0320 08:46:28.682241 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=1.661520539 podStartE2EDuration="12.682216292s" podCreationTimestamp="2026-03-20 08:46:16 +0000 UTC" firstStartedPulling="2026-03-20 08:46:17.258393967 +0000 UTC m=+1402.475294282" lastFinishedPulling="2026-03-20 08:46:28.27908972 +0000 UTC m=+1413.495990035" observedRunningTime="2026-03-20 08:46:28.675643709 +0000 UTC m=+1413.892544024" watchObservedRunningTime="2026-03-20 08:46:28.682216292 +0000 UTC m=+1413.899116607" Mar 20 08:46:28 crc kubenswrapper[4903]: I0320 08:46:28.685197 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b603178f-e90b-4e08-ad82-0d15ddc32844-kube-api-access-g5nm7" (OuterVolumeSpecName: "kube-api-access-g5nm7") pod "b603178f-e90b-4e08-ad82-0d15ddc32844" (UID: "b603178f-e90b-4e08-ad82-0d15ddc32844"). InnerVolumeSpecName "kube-api-access-g5nm7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:46:28 crc kubenswrapper[4903]: I0320 08:46:28.696032 4903 scope.go:117] "RemoveContainer" containerID="23de93f09c8f56d1d5ad7e078619ee8d784563bdc9c71699161a829e7408c456" Mar 20 08:46:28 crc kubenswrapper[4903]: I0320 08:46:28.722594 4903 scope.go:117] "RemoveContainer" containerID="b14c8d93b7883541bc880c20a4c3da5ee1c79e090d432fcaa5c61c14083dee3d" Mar 20 08:46:28 crc kubenswrapper[4903]: I0320 08:46:28.733750 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b603178f-e90b-4e08-ad82-0d15ddc32844-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "b603178f-e90b-4e08-ad82-0d15ddc32844" (UID: "b603178f-e90b-4e08-ad82-0d15ddc32844"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:46:28 crc kubenswrapper[4903]: I0320 08:46:28.748661 4903 scope.go:117] "RemoveContainer" containerID="dc3f13562bfffe676ba3cb75ad7d3e2e26c1a2b70e23596ae455c73f5af5d8bf" Mar 20 08:46:28 crc kubenswrapper[4903]: I0320 08:46:28.773087 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b603178f-e90b-4e08-ad82-0d15ddc32844-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b603178f-e90b-4e08-ad82-0d15ddc32844" (UID: "b603178f-e90b-4e08-ad82-0d15ddc32844"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:46:28 crc kubenswrapper[4903]: I0320 08:46:28.776769 4903 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b603178f-e90b-4e08-ad82-0d15ddc32844-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 08:46:28 crc kubenswrapper[4903]: I0320 08:46:28.776800 4903 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b603178f-e90b-4e08-ad82-0d15ddc32844-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 08:46:28 crc kubenswrapper[4903]: I0320 08:46:28.776811 4903 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b603178f-e90b-4e08-ad82-0d15ddc32844-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 20 08:46:28 crc kubenswrapper[4903]: I0320 08:46:28.776829 4903 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b603178f-e90b-4e08-ad82-0d15ddc32844-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 08:46:28 crc kubenswrapper[4903]: I0320 08:46:28.776839 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g5nm7\" (UniqueName: \"kubernetes.io/projected/b603178f-e90b-4e08-ad82-0d15ddc32844-kube-api-access-g5nm7\") on node \"crc\" DevicePath \"\"" Mar 20 08:46:28 crc kubenswrapper[4903]: I0320 08:46:28.776851 4903 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b603178f-e90b-4e08-ad82-0d15ddc32844-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 08:46:28 crc kubenswrapper[4903]: I0320 08:46:28.794725 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b603178f-e90b-4e08-ad82-0d15ddc32844-config-data" (OuterVolumeSpecName: "config-data") pod "b603178f-e90b-4e08-ad82-0d15ddc32844" (UID: "b603178f-e90b-4e08-ad82-0d15ddc32844"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:46:28 crc kubenswrapper[4903]: I0320 08:46:28.878660 4903 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b603178f-e90b-4e08-ad82-0d15ddc32844-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 08:46:28 crc kubenswrapper[4903]: I0320 08:46:28.996250 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-mdxf5"] Mar 20 08:46:28 crc kubenswrapper[4903]: E0320 08:46:28.996994 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b603178f-e90b-4e08-ad82-0d15ddc32844" containerName="sg-core" Mar 20 08:46:28 crc kubenswrapper[4903]: I0320 08:46:28.997021 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="b603178f-e90b-4e08-ad82-0d15ddc32844" containerName="sg-core" Mar 20 08:46:28 crc kubenswrapper[4903]: E0320 08:46:28.997034 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b603178f-e90b-4e08-ad82-0d15ddc32844" containerName="ceilometer-notification-agent" Mar 20 08:46:28 crc kubenswrapper[4903]: I0320 08:46:28.997042 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="b603178f-e90b-4e08-ad82-0d15ddc32844" containerName="ceilometer-notification-agent" Mar 20 08:46:28 crc kubenswrapper[4903]: E0320 08:46:28.997137 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b603178f-e90b-4e08-ad82-0d15ddc32844" containerName="proxy-httpd" Mar 20 08:46:28 crc kubenswrapper[4903]: I0320 08:46:28.997145 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="b603178f-e90b-4e08-ad82-0d15ddc32844" containerName="proxy-httpd" Mar 20 08:46:28 crc kubenswrapper[4903]: E0320 08:46:28.997187 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b603178f-e90b-4e08-ad82-0d15ddc32844" containerName="ceilometer-central-agent" Mar 20 08:46:28 crc kubenswrapper[4903]: I0320 08:46:28.997194 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="b603178f-e90b-4e08-ad82-0d15ddc32844" containerName="ceilometer-central-agent" Mar 20 08:46:28 crc kubenswrapper[4903]: I0320 08:46:28.997393 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="b603178f-e90b-4e08-ad82-0d15ddc32844" containerName="ceilometer-notification-agent" Mar 20 08:46:28 crc kubenswrapper[4903]: I0320 08:46:28.997415 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="b603178f-e90b-4e08-ad82-0d15ddc32844" containerName="proxy-httpd" Mar 20 08:46:28 crc kubenswrapper[4903]: I0320 08:46:28.997425 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="b603178f-e90b-4e08-ad82-0d15ddc32844" containerName="ceilometer-central-agent" Mar 20 08:46:28 crc kubenswrapper[4903]: I0320 08:46:28.997439 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="b603178f-e90b-4e08-ad82-0d15ddc32844" containerName="sg-core" Mar 20 08:46:28 crc kubenswrapper[4903]: I0320 08:46:28.998318 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-mdxf5" Mar 20 08:46:29 crc kubenswrapper[4903]: I0320 08:46:29.004507 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 08:46:29 crc kubenswrapper[4903]: I0320 08:46:29.020348 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 20 08:46:29 crc kubenswrapper[4903]: I0320 08:46:29.082981 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-mdxf5"] Mar 20 08:46:29 crc kubenswrapper[4903]: I0320 08:46:29.085666 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c6fea6b7-8fd2-42c4-8016-334f6f69c22e-operator-scripts\") pod \"nova-api-db-create-mdxf5\" (UID: \"c6fea6b7-8fd2-42c4-8016-334f6f69c22e\") " pod="openstack/nova-api-db-create-mdxf5" Mar 20 08:46:29 crc kubenswrapper[4903]: I0320 08:46:29.086674 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kbmjk\" (UniqueName: \"kubernetes.io/projected/c6fea6b7-8fd2-42c4-8016-334f6f69c22e-kube-api-access-kbmjk\") pod \"nova-api-db-create-mdxf5\" (UID: \"c6fea6b7-8fd2-42c4-8016-334f6f69c22e\") " pod="openstack/nova-api-db-create-mdxf5" Mar 20 08:46:29 crc kubenswrapper[4903]: I0320 08:46:29.121142 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 20 08:46:29 crc kubenswrapper[4903]: I0320 08:46:29.130082 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 08:46:29 crc kubenswrapper[4903]: I0320 08:46:29.133196 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 20 08:46:29 crc kubenswrapper[4903]: I0320 08:46:29.135398 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 20 08:46:29 crc kubenswrapper[4903]: I0320 08:46:29.158653 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 08:46:29 crc kubenswrapper[4903]: I0320 08:46:29.179492 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-5rbfs"] Mar 20 08:46:29 crc kubenswrapper[4903]: I0320 08:46:29.181451 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-5rbfs" Mar 20 08:46:29 crc kubenswrapper[4903]: I0320 08:46:29.182116 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-5rbfs"] Mar 20 08:46:29 crc kubenswrapper[4903]: I0320 08:46:29.190418 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4b11fccf-eea3-450d-b460-013086becb0c-scripts\") pod \"ceilometer-0\" (UID: \"4b11fccf-eea3-450d-b460-013086becb0c\") " pod="openstack/ceilometer-0" Mar 20 08:46:29 crc kubenswrapper[4903]: I0320 08:46:29.190484 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c6fea6b7-8fd2-42c4-8016-334f6f69c22e-operator-scripts\") pod \"nova-api-db-create-mdxf5\" (UID: \"c6fea6b7-8fd2-42c4-8016-334f6f69c22e\") " pod="openstack/nova-api-db-create-mdxf5" Mar 20 08:46:29 crc kubenswrapper[4903]: I0320 08:46:29.190506 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4b11fccf-eea3-450d-b460-013086becb0c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4b11fccf-eea3-450d-b460-013086becb0c\") " pod="openstack/ceilometer-0" Mar 20 08:46:29 crc kubenswrapper[4903]: I0320 08:46:29.190561 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kbmjk\" (UniqueName: \"kubernetes.io/projected/c6fea6b7-8fd2-42c4-8016-334f6f69c22e-kube-api-access-kbmjk\") pod \"nova-api-db-create-mdxf5\" (UID: \"c6fea6b7-8fd2-42c4-8016-334f6f69c22e\") " pod="openstack/nova-api-db-create-mdxf5" Mar 20 08:46:29 crc kubenswrapper[4903]: I0320 08:46:29.190587 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4b11fccf-eea3-450d-b460-013086becb0c-run-httpd\") pod \"ceilometer-0\" (UID: \"4b11fccf-eea3-450d-b460-013086becb0c\") " pod="openstack/ceilometer-0" Mar 20 08:46:29 crc kubenswrapper[4903]: I0320 08:46:29.190633 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b11fccf-eea3-450d-b460-013086becb0c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4b11fccf-eea3-450d-b460-013086becb0c\") " pod="openstack/ceilometer-0" Mar 20 08:46:29 crc kubenswrapper[4903]: I0320 08:46:29.190656 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b11fccf-eea3-450d-b460-013086becb0c-config-data\") pod \"ceilometer-0\" (UID: \"4b11fccf-eea3-450d-b460-013086becb0c\") " pod="openstack/ceilometer-0" Mar 20 08:46:29 crc kubenswrapper[4903]: I0320 08:46:29.190710 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rvn7g\" (UniqueName: \"kubernetes.io/projected/7e4d3d5e-2374-4583-8193-bcef8b16110e-kube-api-access-rvn7g\") pod \"nova-cell0-db-create-5rbfs\" (UID: \"7e4d3d5e-2374-4583-8193-bcef8b16110e\") " pod="openstack/nova-cell0-db-create-5rbfs" Mar 20 08:46:29 crc kubenswrapper[4903]: I0320 08:46:29.190737 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zlv8v\" (UniqueName: 
\"kubernetes.io/projected/4b11fccf-eea3-450d-b460-013086becb0c-kube-api-access-zlv8v\") pod \"ceilometer-0\" (UID: \"4b11fccf-eea3-450d-b460-013086becb0c\") " pod="openstack/ceilometer-0" Mar 20 08:46:29 crc kubenswrapper[4903]: I0320 08:46:29.190759 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7e4d3d5e-2374-4583-8193-bcef8b16110e-operator-scripts\") pod \"nova-cell0-db-create-5rbfs\" (UID: \"7e4d3d5e-2374-4583-8193-bcef8b16110e\") " pod="openstack/nova-cell0-db-create-5rbfs" Mar 20 08:46:29 crc kubenswrapper[4903]: I0320 08:46:29.190781 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4b11fccf-eea3-450d-b460-013086becb0c-log-httpd\") pod \"ceilometer-0\" (UID: \"4b11fccf-eea3-450d-b460-013086becb0c\") " pod="openstack/ceilometer-0" Mar 20 08:46:29 crc kubenswrapper[4903]: I0320 08:46:29.191658 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c6fea6b7-8fd2-42c4-8016-334f6f69c22e-operator-scripts\") pod \"nova-api-db-create-mdxf5\" (UID: \"c6fea6b7-8fd2-42c4-8016-334f6f69c22e\") " pod="openstack/nova-api-db-create-mdxf5" Mar 20 08:46:29 crc kubenswrapper[4903]: I0320 08:46:29.201665 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-25f9-account-create-update-6rrdk"] Mar 20 08:46:29 crc kubenswrapper[4903]: I0320 08:46:29.203412 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-25f9-account-create-update-6rrdk" Mar 20 08:46:29 crc kubenswrapper[4903]: I0320 08:46:29.209310 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Mar 20 08:46:29 crc kubenswrapper[4903]: I0320 08:46:29.211354 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-25f9-account-create-update-6rrdk"] Mar 20 08:46:29 crc kubenswrapper[4903]: I0320 08:46:29.213911 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kbmjk\" (UniqueName: \"kubernetes.io/projected/c6fea6b7-8fd2-42c4-8016-334f6f69c22e-kube-api-access-kbmjk\") pod \"nova-api-db-create-mdxf5\" (UID: \"c6fea6b7-8fd2-42c4-8016-334f6f69c22e\") " pod="openstack/nova-api-db-create-mdxf5" Mar 20 08:46:29 crc kubenswrapper[4903]: I0320 08:46:29.293663 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-mt4zt"] Mar 20 08:46:29 crc kubenswrapper[4903]: I0320 08:46:29.293995 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4b11fccf-eea3-450d-b460-013086becb0c-log-httpd\") pod \"ceilometer-0\" (UID: \"4b11fccf-eea3-450d-b460-013086becb0c\") " pod="openstack/ceilometer-0" Mar 20 08:46:29 crc kubenswrapper[4903]: I0320 08:46:29.294051 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4b11fccf-eea3-450d-b460-013086becb0c-scripts\") pod \"ceilometer-0\" (UID: \"4b11fccf-eea3-450d-b460-013086becb0c\") " pod="openstack/ceilometer-0" Mar 20 08:46:29 crc kubenswrapper[4903]: I0320 08:46:29.294086 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4b11fccf-eea3-450d-b460-013086becb0c-sg-core-conf-yaml\") pod 
\"ceilometer-0\" (UID: \"4b11fccf-eea3-450d-b460-013086becb0c\") " pod="openstack/ceilometer-0" Mar 20 08:46:29 crc kubenswrapper[4903]: I0320 08:46:29.294138 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4b11fccf-eea3-450d-b460-013086becb0c-run-httpd\") pod \"ceilometer-0\" (UID: \"4b11fccf-eea3-450d-b460-013086becb0c\") " pod="openstack/ceilometer-0" Mar 20 08:46:29 crc kubenswrapper[4903]: I0320 08:46:29.294184 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b11fccf-eea3-450d-b460-013086becb0c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4b11fccf-eea3-450d-b460-013086becb0c\") " pod="openstack/ceilometer-0" Mar 20 08:46:29 crc kubenswrapper[4903]: I0320 08:46:29.294203 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b11fccf-eea3-450d-b460-013086becb0c-config-data\") pod \"ceilometer-0\" (UID: \"4b11fccf-eea3-450d-b460-013086becb0c\") " pod="openstack/ceilometer-0" Mar 20 08:46:29 crc kubenswrapper[4903]: I0320 08:46:29.294237 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-75jnr\" (UniqueName: \"kubernetes.io/projected/16277119-789c-4d6e-8965-1ab0080f0871-kube-api-access-75jnr\") pod \"nova-api-25f9-account-create-update-6rrdk\" (UID: \"16277119-789c-4d6e-8965-1ab0080f0871\") " pod="openstack/nova-api-25f9-account-create-update-6rrdk" Mar 20 08:46:29 crc kubenswrapper[4903]: I0320 08:46:29.294278 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rvn7g\" (UniqueName: \"kubernetes.io/projected/7e4d3d5e-2374-4583-8193-bcef8b16110e-kube-api-access-rvn7g\") pod \"nova-cell0-db-create-5rbfs\" (UID: \"7e4d3d5e-2374-4583-8193-bcef8b16110e\") " pod="openstack/nova-cell0-db-create-5rbfs" Mar 20 08:46:29 crc kubenswrapper[4903]: I0320 08:46:29.294308 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/16277119-789c-4d6e-8965-1ab0080f0871-operator-scripts\") pod \"nova-api-25f9-account-create-update-6rrdk\" (UID: \"16277119-789c-4d6e-8965-1ab0080f0871\") " pod="openstack/nova-api-25f9-account-create-update-6rrdk" Mar 20 08:46:29 crc kubenswrapper[4903]: I0320 08:46:29.294332 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zlv8v\" (UniqueName: \"kubernetes.io/projected/4b11fccf-eea3-450d-b460-013086becb0c-kube-api-access-zlv8v\") pod \"ceilometer-0\" (UID: \"4b11fccf-eea3-450d-b460-013086becb0c\") " pod="openstack/ceilometer-0" Mar 20 08:46:29 crc kubenswrapper[4903]: I0320 08:46:29.294354 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7e4d3d5e-2374-4583-8193-bcef8b16110e-operator-scripts\") pod \"nova-cell0-db-create-5rbfs\" (UID: \"7e4d3d5e-2374-4583-8193-bcef8b16110e\") " pod="openstack/nova-cell0-db-create-5rbfs" Mar 20 08:46:29 crc kubenswrapper[4903]: I0320 08:46:29.294972 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7e4d3d5e-2374-4583-8193-bcef8b16110e-operator-scripts\") pod \"nova-cell0-db-create-5rbfs\" (UID: \"7e4d3d5e-2374-4583-8193-bcef8b16110e\") " 
pod="openstack/nova-cell0-db-create-5rbfs" Mar 20 08:46:29 crc kubenswrapper[4903]: I0320 08:46:29.295394 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4b11fccf-eea3-450d-b460-013086becb0c-run-httpd\") pod \"ceilometer-0\" (UID: \"4b11fccf-eea3-450d-b460-013086becb0c\") " pod="openstack/ceilometer-0" Mar 20 08:46:29 crc kubenswrapper[4903]: I0320 08:46:29.295654 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-mt4zt" Mar 20 08:46:29 crc kubenswrapper[4903]: I0320 08:46:29.296373 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4b11fccf-eea3-450d-b460-013086becb0c-log-httpd\") pod \"ceilometer-0\" (UID: \"4b11fccf-eea3-450d-b460-013086becb0c\") " pod="openstack/ceilometer-0" Mar 20 08:46:29 crc kubenswrapper[4903]: I0320 08:46:29.300560 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4b11fccf-eea3-450d-b460-013086becb0c-scripts\") pod \"ceilometer-0\" (UID: \"4b11fccf-eea3-450d-b460-013086becb0c\") " pod="openstack/ceilometer-0" Mar 20 08:46:29 crc kubenswrapper[4903]: I0320 08:46:29.303160 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4b11fccf-eea3-450d-b460-013086becb0c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4b11fccf-eea3-450d-b460-013086becb0c\") " pod="openstack/ceilometer-0" Mar 20 08:46:29 crc kubenswrapper[4903]: I0320 08:46:29.305551 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b11fccf-eea3-450d-b460-013086becb0c-config-data\") pod \"ceilometer-0\" (UID: \"4b11fccf-eea3-450d-b460-013086becb0c\") " pod="openstack/ceilometer-0" Mar 20 08:46:29 crc kubenswrapper[4903]: I0320 08:46:29.311697 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b11fccf-eea3-450d-b460-013086becb0c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4b11fccf-eea3-450d-b460-013086becb0c\") " pod="openstack/ceilometer-0" Mar 20 08:46:29 crc kubenswrapper[4903]: I0320 08:46:29.314544 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-mdxf5" Mar 20 08:46:29 crc kubenswrapper[4903]: I0320 08:46:29.317175 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zlv8v\" (UniqueName: \"kubernetes.io/projected/4b11fccf-eea3-450d-b460-013086becb0c-kube-api-access-zlv8v\") pod \"ceilometer-0\" (UID: \"4b11fccf-eea3-450d-b460-013086becb0c\") " pod="openstack/ceilometer-0" Mar 20 08:46:29 crc kubenswrapper[4903]: I0320 08:46:29.319412 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-e26e-account-create-update-clwzq"] Mar 20 08:46:29 crc kubenswrapper[4903]: I0320 08:46:29.321105 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-e26e-account-create-update-clwzq" Mar 20 08:46:29 crc kubenswrapper[4903]: I0320 08:46:29.326526 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rvn7g\" (UniqueName: \"kubernetes.io/projected/7e4d3d5e-2374-4583-8193-bcef8b16110e-kube-api-access-rvn7g\") pod \"nova-cell0-db-create-5rbfs\" (UID: \"7e4d3d5e-2374-4583-8193-bcef8b16110e\") " pod="openstack/nova-cell0-db-create-5rbfs" Mar 20 08:46:29 crc kubenswrapper[4903]: I0320 08:46:29.326687 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Mar 20 08:46:29 crc kubenswrapper[4903]: I0320 08:46:29.356192 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-mt4zt"] Mar 20 08:46:29 crc kubenswrapper[4903]: I0320 08:46:29.388737 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-e26e-account-create-update-clwzq"] Mar 20 08:46:29 crc kubenswrapper[4903]: I0320 08:46:29.397256 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-55htc\" (UniqueName: \"kubernetes.io/projected/cf56b91b-fa18-4fe3-bdb5-9c6c3fb530bd-kube-api-access-55htc\") pod \"nova-cell0-e26e-account-create-update-clwzq\" (UID: \"cf56b91b-fa18-4fe3-bdb5-9c6c3fb530bd\") " pod="openstack/nova-cell0-e26e-account-create-update-clwzq" Mar 20 08:46:29 crc kubenswrapper[4903]: I0320 08:46:29.397346 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-75jnr\" (UniqueName: \"kubernetes.io/projected/16277119-789c-4d6e-8965-1ab0080f0871-kube-api-access-75jnr\") pod \"nova-api-25f9-account-create-update-6rrdk\" (UID: \"16277119-789c-4d6e-8965-1ab0080f0871\") " pod="openstack/nova-api-25f9-account-create-update-6rrdk" Mar 20 08:46:29 crc kubenswrapper[4903]: I0320 08:46:29.397377 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2nn4s\" (UniqueName: \"kubernetes.io/projected/e54fecbe-7055-446e-989b-eddbbbfe55a6-kube-api-access-2nn4s\") pod \"nova-cell1-db-create-mt4zt\" (UID: \"e54fecbe-7055-446e-989b-eddbbbfe55a6\") " pod="openstack/nova-cell1-db-create-mt4zt" Mar 20 08:46:29 crc kubenswrapper[4903]: I0320 08:46:29.397425 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/16277119-789c-4d6e-8965-1ab0080f0871-operator-scripts\") pod \"nova-api-25f9-account-create-update-6rrdk\" (UID: \"16277119-789c-4d6e-8965-1ab0080f0871\") " pod="openstack/nova-api-25f9-account-create-update-6rrdk" Mar 20 08:46:29 crc kubenswrapper[4903]: I0320 08:46:29.397448 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e54fecbe-7055-446e-989b-eddbbbfe55a6-operator-scripts\") pod \"nova-cell1-db-create-mt4zt\" (UID: \"e54fecbe-7055-446e-989b-eddbbbfe55a6\") " pod="openstack/nova-cell1-db-create-mt4zt" Mar 20 08:46:29 crc kubenswrapper[4903]: I0320 08:46:29.397548 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cf56b91b-fa18-4fe3-bdb5-9c6c3fb530bd-operator-scripts\") pod \"nova-cell0-e26e-account-create-update-clwzq\" (UID: \"cf56b91b-fa18-4fe3-bdb5-9c6c3fb530bd\") " pod="openstack/nova-cell0-e26e-account-create-update-clwzq" Mar 
20 08:46:29 crc kubenswrapper[4903]: I0320 08:46:29.401723 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/16277119-789c-4d6e-8965-1ab0080f0871-operator-scripts\") pod \"nova-api-25f9-account-create-update-6rrdk\" (UID: \"16277119-789c-4d6e-8965-1ab0080f0871\") " pod="openstack/nova-api-25f9-account-create-update-6rrdk" Mar 20 08:46:29 crc kubenswrapper[4903]: I0320 08:46:29.417536 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-75jnr\" (UniqueName: \"kubernetes.io/projected/16277119-789c-4d6e-8965-1ab0080f0871-kube-api-access-75jnr\") pod \"nova-api-25f9-account-create-update-6rrdk\" (UID: \"16277119-789c-4d6e-8965-1ab0080f0871\") " pod="openstack/nova-api-25f9-account-create-update-6rrdk" Mar 20 08:46:29 crc kubenswrapper[4903]: I0320 08:46:29.433250 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-25f9-account-create-update-6rrdk" Mar 20 08:46:29 crc kubenswrapper[4903]: I0320 08:46:29.449560 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 08:46:29 crc kubenswrapper[4903]: I0320 08:46:29.499102 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2nn4s\" (UniqueName: \"kubernetes.io/projected/e54fecbe-7055-446e-989b-eddbbbfe55a6-kube-api-access-2nn4s\") pod \"nova-cell1-db-create-mt4zt\" (UID: \"e54fecbe-7055-446e-989b-eddbbbfe55a6\") " pod="openstack/nova-cell1-db-create-mt4zt" Mar 20 08:46:29 crc kubenswrapper[4903]: I0320 08:46:29.499173 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e54fecbe-7055-446e-989b-eddbbbfe55a6-operator-scripts\") pod \"nova-cell1-db-create-mt4zt\" (UID: \"e54fecbe-7055-446e-989b-eddbbbfe55a6\") " pod="openstack/nova-cell1-db-create-mt4zt" Mar 20 08:46:29 crc kubenswrapper[4903]: I0320 08:46:29.499257 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cf56b91b-fa18-4fe3-bdb5-9c6c3fb530bd-operator-scripts\") pod \"nova-cell0-e26e-account-create-update-clwzq\" (UID: \"cf56b91b-fa18-4fe3-bdb5-9c6c3fb530bd\") " pod="openstack/nova-cell0-e26e-account-create-update-clwzq" Mar 20 08:46:29 crc kubenswrapper[4903]: I0320 08:46:29.499297 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-55htc\" (UniqueName: \"kubernetes.io/projected/cf56b91b-fa18-4fe3-bdb5-9c6c3fb530bd-kube-api-access-55htc\") pod \"nova-cell0-e26e-account-create-update-clwzq\" (UID: \"cf56b91b-fa18-4fe3-bdb5-9c6c3fb530bd\") " pod="openstack/nova-cell0-e26e-account-create-update-clwzq" Mar 20 08:46:29 crc kubenswrapper[4903]: I0320 08:46:29.500222 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cf56b91b-fa18-4fe3-bdb5-9c6c3fb530bd-operator-scripts\") pod \"nova-cell0-e26e-account-create-update-clwzq\" (UID: \"cf56b91b-fa18-4fe3-bdb5-9c6c3fb530bd\") " pod="openstack/nova-cell0-e26e-account-create-update-clwzq" Mar 20 08:46:29 crc kubenswrapper[4903]: I0320 08:46:29.500225 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e54fecbe-7055-446e-989b-eddbbbfe55a6-operator-scripts\") pod \"nova-cell1-db-create-mt4zt\" (UID: 
\"e54fecbe-7055-446e-989b-eddbbbfe55a6\") " pod="openstack/nova-cell1-db-create-mt4zt" Mar 20 08:46:29 crc kubenswrapper[4903]: I0320 08:46:29.508300 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b603178f-e90b-4e08-ad82-0d15ddc32844" path="/var/lib/kubelet/pods/b603178f-e90b-4e08-ad82-0d15ddc32844/volumes" Mar 20 08:46:29 crc kubenswrapper[4903]: I0320 08:46:29.526101 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-fb5f-account-create-update-dztfr"] Mar 20 08:46:29 crc kubenswrapper[4903]: I0320 08:46:29.527140 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-fb5f-account-create-update-dztfr" Mar 20 08:46:29 crc kubenswrapper[4903]: I0320 08:46:29.537551 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Mar 20 08:46:29 crc kubenswrapper[4903]: I0320 08:46:29.542305 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2nn4s\" (UniqueName: \"kubernetes.io/projected/e54fecbe-7055-446e-989b-eddbbbfe55a6-kube-api-access-2nn4s\") pod \"nova-cell1-db-create-mt4zt\" (UID: \"e54fecbe-7055-446e-989b-eddbbbfe55a6\") " pod="openstack/nova-cell1-db-create-mt4zt" Mar 20 08:46:29 crc kubenswrapper[4903]: I0320 08:46:29.542722 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-55htc\" (UniqueName: \"kubernetes.io/projected/cf56b91b-fa18-4fe3-bdb5-9c6c3fb530bd-kube-api-access-55htc\") pod \"nova-cell0-e26e-account-create-update-clwzq\" (UID: \"cf56b91b-fa18-4fe3-bdb5-9c6c3fb530bd\") " pod="openstack/nova-cell0-e26e-account-create-update-clwzq" Mar 20 08:46:29 crc kubenswrapper[4903]: I0320 08:46:29.570120 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-fb5f-account-create-update-dztfr"] Mar 20 08:46:29 crc kubenswrapper[4903]: I0320 08:46:29.606567 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-5rbfs" Mar 20 08:46:29 crc kubenswrapper[4903]: I0320 08:46:29.717121 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/292fda17-d4a4-4bee-ba75-d3221d870f63-operator-scripts\") pod \"nova-cell1-fb5f-account-create-update-dztfr\" (UID: \"292fda17-d4a4-4bee-ba75-d3221d870f63\") " pod="openstack/nova-cell1-fb5f-account-create-update-dztfr" Mar 20 08:46:29 crc kubenswrapper[4903]: I0320 08:46:29.717489 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v4t8v\" (UniqueName: \"kubernetes.io/projected/292fda17-d4a4-4bee-ba75-d3221d870f63-kube-api-access-v4t8v\") pod \"nova-cell1-fb5f-account-create-update-dztfr\" (UID: \"292fda17-d4a4-4bee-ba75-d3221d870f63\") " pod="openstack/nova-cell1-fb5f-account-create-update-dztfr" Mar 20 08:46:29 crc kubenswrapper[4903]: I0320 08:46:29.746526 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-mt4zt" Mar 20 08:46:29 crc kubenswrapper[4903]: I0320 08:46:29.755255 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-e26e-account-create-update-clwzq" Mar 20 08:46:29 crc kubenswrapper[4903]: I0320 08:46:29.818998 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/292fda17-d4a4-4bee-ba75-d3221d870f63-operator-scripts\") pod \"nova-cell1-fb5f-account-create-update-dztfr\" (UID: \"292fda17-d4a4-4bee-ba75-d3221d870f63\") " pod="openstack/nova-cell1-fb5f-account-create-update-dztfr" Mar 20 08:46:29 crc kubenswrapper[4903]: I0320 08:46:29.819121 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v4t8v\" (UniqueName: \"kubernetes.io/projected/292fda17-d4a4-4bee-ba75-d3221d870f63-kube-api-access-v4t8v\") pod \"nova-cell1-fb5f-account-create-update-dztfr\" (UID: \"292fda17-d4a4-4bee-ba75-d3221d870f63\") " pod="openstack/nova-cell1-fb5f-account-create-update-dztfr" Mar 20 08:46:29 crc kubenswrapper[4903]: I0320 08:46:29.820798 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/292fda17-d4a4-4bee-ba75-d3221d870f63-operator-scripts\") pod \"nova-cell1-fb5f-account-create-update-dztfr\" (UID: \"292fda17-d4a4-4bee-ba75-d3221d870f63\") " pod="openstack/nova-cell1-fb5f-account-create-update-dztfr" Mar 20 08:46:29 crc kubenswrapper[4903]: I0320 08:46:29.843086 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v4t8v\" (UniqueName: \"kubernetes.io/projected/292fda17-d4a4-4bee-ba75-d3221d870f63-kube-api-access-v4t8v\") pod \"nova-cell1-fb5f-account-create-update-dztfr\" (UID: \"292fda17-d4a4-4bee-ba75-d3221d870f63\") " pod="openstack/nova-cell1-fb5f-account-create-update-dztfr" Mar 20 08:46:29 crc kubenswrapper[4903]: I0320 08:46:29.857512 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-fb5f-account-create-update-dztfr" Mar 20 08:46:30 crc kubenswrapper[4903]: I0320 08:46:30.272111 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 08:46:30 crc kubenswrapper[4903]: I0320 08:46:30.309291 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 20 08:46:30 crc kubenswrapper[4903]: I0320 08:46:30.309632 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="c68a4ea3-7086-430c-8a78-a4c06ed7280c" containerName="kube-state-metrics" containerID="cri-o://0a9e3dfbd5e32190c5719e3a6f48141efd8d572d7580de2fe86b186aa11f1165" gracePeriod=30 Mar 20 08:46:30 crc kubenswrapper[4903]: I0320 08:46:30.412494 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 08:46:30 crc kubenswrapper[4903]: I0320 08:46:30.422516 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-mdxf5"] Mar 20 08:46:30 crc kubenswrapper[4903]: I0320 08:46:30.442797 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-25f9-account-create-update-6rrdk"] Mar 20 08:46:30 crc kubenswrapper[4903]: W0320 08:46:30.527795 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod16277119_789c_4d6e_8965_1ab0080f0871.slice/crio-5d3c082e0ee4fa1e68e35e162e272acf2360ec30cd4bad3e11d6d820693a6488 WatchSource:0}: Error finding container 5d3c082e0ee4fa1e68e35e162e272acf2360ec30cd4bad3e11d6d820693a6488: Status 404 returned error can't find the container with id 5d3c082e0ee4fa1e68e35e162e272acf2360ec30cd4bad3e11d6d820693a6488 Mar 20 08:46:30 crc kubenswrapper[4903]: I0320 08:46:30.656357 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-69dc7db475-m968g" Mar 20 08:46:30 crc kubenswrapper[4903]: I0320 08:46:30.664002 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-69dc7db475-m968g" Mar 20 08:46:30 crc kubenswrapper[4903]: I0320 08:46:30.739819 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-5rbfs"] Mar 20 08:46:30 crc kubenswrapper[4903]: W0320 08:46:30.744961 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod292fda17_d4a4_4bee_ba75_d3221d870f63.slice/crio-403e3e4bcb5b6f140971b71dfa2704da76ad05d14bb26a115b4c19a1469bfb5a WatchSource:0}: Error finding container 403e3e4bcb5b6f140971b71dfa2704da76ad05d14bb26a115b4c19a1469bfb5a: Status 404 returned error can't find the container with id 403e3e4bcb5b6f140971b71dfa2704da76ad05d14bb26a115b4c19a1469bfb5a Mar 20 08:46:30 crc kubenswrapper[4903]: I0320 08:46:30.771751 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-fb5f-account-create-update-dztfr"] Mar 20 08:46:30 crc kubenswrapper[4903]: I0320 08:46:30.784797 4903 generic.go:334] "Generic (PLEG): container finished" podID="c68a4ea3-7086-430c-8a78-a4c06ed7280c" containerID="0a9e3dfbd5e32190c5719e3a6f48141efd8d572d7580de2fe86b186aa11f1165" exitCode=2 Mar 20 08:46:30 crc kubenswrapper[4903]: I0320 08:46:30.784896 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" 
event={"ID":"c68a4ea3-7086-430c-8a78-a4c06ed7280c","Type":"ContainerDied","Data":"0a9e3dfbd5e32190c5719e3a6f48141efd8d572d7580de2fe86b186aa11f1165"} Mar 20 08:46:30 crc kubenswrapper[4903]: W0320 08:46:30.791894 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode54fecbe_7055_446e_989b_eddbbbfe55a6.slice/crio-4e1201db9928b6f2fbd13d4cd2792b658ff4628523dce3b0c862883d79ce0a17 WatchSource:0}: Error finding container 4e1201db9928b6f2fbd13d4cd2792b658ff4628523dce3b0c862883d79ce0a17: Status 404 returned error can't find the container with id 4e1201db9928b6f2fbd13d4cd2792b658ff4628523dce3b0c862883d79ce0a17 Mar 20 08:46:30 crc kubenswrapper[4903]: I0320 08:46:30.796358 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-e26e-account-create-update-clwzq"] Mar 20 08:46:30 crc kubenswrapper[4903]: I0320 08:46:30.798656 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-25f9-account-create-update-6rrdk" event={"ID":"16277119-789c-4d6e-8965-1ab0080f0871","Type":"ContainerStarted","Data":"5d3c082e0ee4fa1e68e35e162e272acf2360ec30cd4bad3e11d6d820693a6488"} Mar 20 08:46:30 crc kubenswrapper[4903]: I0320 08:46:30.803371 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4b11fccf-eea3-450d-b460-013086becb0c","Type":"ContainerStarted","Data":"bfc7cf95e092b9e7eb9f528611dbcf47b438ab66e51a94cb8fec02bc4dccda14"} Mar 20 08:46:30 crc kubenswrapper[4903]: I0320 08:46:30.806295 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-mdxf5" event={"ID":"c6fea6b7-8fd2-42c4-8016-334f6f69c22e","Type":"ContainerStarted","Data":"a756e85ccca4487ea67d662c0fac98fc112588724a6887909dd239fb77047f90"} Mar 20 08:46:30 crc kubenswrapper[4903]: I0320 08:46:30.811099 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-mt4zt"] Mar 20 08:46:30 crc kubenswrapper[4903]: I0320 08:46:30.963416 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 20 08:46:30 crc kubenswrapper[4903]: I0320 08:46:30.974519 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9wv26\" (UniqueName: \"kubernetes.io/projected/c68a4ea3-7086-430c-8a78-a4c06ed7280c-kube-api-access-9wv26\") pod \"c68a4ea3-7086-430c-8a78-a4c06ed7280c\" (UID: \"c68a4ea3-7086-430c-8a78-a4c06ed7280c\") " Mar 20 08:46:30 crc kubenswrapper[4903]: I0320 08:46:30.994320 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c68a4ea3-7086-430c-8a78-a4c06ed7280c-kube-api-access-9wv26" (OuterVolumeSpecName: "kube-api-access-9wv26") pod "c68a4ea3-7086-430c-8a78-a4c06ed7280c" (UID: "c68a4ea3-7086-430c-8a78-a4c06ed7280c"). InnerVolumeSpecName "kube-api-access-9wv26". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:46:31 crc kubenswrapper[4903]: I0320 08:46:31.078662 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9wv26\" (UniqueName: \"kubernetes.io/projected/c68a4ea3-7086-430c-8a78-a4c06ed7280c-kube-api-access-9wv26\") on node \"crc\" DevicePath \"\"" Mar 20 08:46:31 crc kubenswrapper[4903]: I0320 08:46:31.225400 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 08:46:31 crc kubenswrapper[4903]: I0320 08:46:31.226299 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="e761f2dc-041e-4811-802b-2a8e8c376381" containerName="glance-log" containerID="cri-o://952661f87146616b7a9a4cefe62281de6533fdfc897d5538236d336178c5722c" gracePeriod=30 Mar 20 08:46:31 crc kubenswrapper[4903]: I0320 08:46:31.226787 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="e761f2dc-041e-4811-802b-2a8e8c376381" containerName="glance-httpd" containerID="cri-o://6ca326a90fd9f5fc37a08f1a4417347cc25993cffc8749772e41d2e19d02fca5" gracePeriod=30 Mar 20 08:46:31 crc kubenswrapper[4903]: I0320 08:46:31.545676 4903 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","pod482e4efe-5d05-4aa5-b77c-a366193c4b50"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort pod482e4efe-5d05-4aa5-b77c-a366193c4b50] : Timed out while waiting for systemd to remove kubepods-besteffort-pod482e4efe_5d05_4aa5_b77c_a366193c4b50.slice" Mar 20 08:46:31 crc kubenswrapper[4903]: E0320 08:46:31.545764 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to delete cgroup paths for [kubepods besteffort pod482e4efe-5d05-4aa5-b77c-a366193c4b50] : unable to destroy cgroup paths for cgroup [kubepods besteffort pod482e4efe-5d05-4aa5-b77c-a366193c4b50] : Timed out while waiting for systemd to remove kubepods-besteffort-pod482e4efe_5d05_4aa5_b77c_a366193c4b50.slice" pod="openstack/cinder-api-0" podUID="482e4efe-5d05-4aa5-b77c-a366193c4b50" Mar 20 08:46:31 crc kubenswrapper[4903]: I0320 08:46:31.817490 4903 generic.go:334] "Generic (PLEG): container finished" podID="c6fea6b7-8fd2-42c4-8016-334f6f69c22e" containerID="7c33a4aacf7c993156b0df84419ae09f7daa2d31a1f364e7cd90f2b85802e23c" exitCode=0 Mar 20 08:46:31 crc kubenswrapper[4903]: I0320 08:46:31.817664 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-mdxf5" event={"ID":"c6fea6b7-8fd2-42c4-8016-334f6f69c22e","Type":"ContainerDied","Data":"7c33a4aacf7c993156b0df84419ae09f7daa2d31a1f364e7cd90f2b85802e23c"} Mar 20 08:46:31 crc kubenswrapper[4903]: I0320 08:46:31.826149 4903 generic.go:334] "Generic (PLEG): container finished" podID="e54fecbe-7055-446e-989b-eddbbbfe55a6" containerID="18781c07609ad0d94ec5f46ebf8058362d29f12af1242feb7d305ab16b0765b6" exitCode=0 Mar 20 08:46:31 crc kubenswrapper[4903]: I0320 08:46:31.826247 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-mt4zt" event={"ID":"e54fecbe-7055-446e-989b-eddbbbfe55a6","Type":"ContainerDied","Data":"18781c07609ad0d94ec5f46ebf8058362d29f12af1242feb7d305ab16b0765b6"} Mar 20 08:46:31 crc kubenswrapper[4903]: I0320 08:46:31.826285 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-mt4zt" 
event={"ID":"e54fecbe-7055-446e-989b-eddbbbfe55a6","Type":"ContainerStarted","Data":"4e1201db9928b6f2fbd13d4cd2792b658ff4628523dce3b0c862883d79ce0a17"} Mar 20 08:46:31 crc kubenswrapper[4903]: I0320 08:46:31.847809 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4b11fccf-eea3-450d-b460-013086becb0c","Type":"ContainerStarted","Data":"09d0f8d07a129edf0703805433590c950927e505f25614bbc396130dc7d3efab"} Mar 20 08:46:31 crc kubenswrapper[4903]: I0320 08:46:31.851001 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 20 08:46:31 crc kubenswrapper[4903]: I0320 08:46:31.851098 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"c68a4ea3-7086-430c-8a78-a4c06ed7280c","Type":"ContainerDied","Data":"66b814afd37a94e3be1c157d99f158390e80fa86e45a769f589683a1498e8170"} Mar 20 08:46:31 crc kubenswrapper[4903]: I0320 08:46:31.851253 4903 scope.go:117] "RemoveContainer" containerID="0a9e3dfbd5e32190c5719e3a6f48141efd8d572d7580de2fe86b186aa11f1165" Mar 20 08:46:31 crc kubenswrapper[4903]: I0320 08:46:31.860166 4903 generic.go:334] "Generic (PLEG): container finished" podID="16277119-789c-4d6e-8965-1ab0080f0871" containerID="5ec65c1f6995adbacb8d006dd9be17bea8e020a50d125daca97eaf0e3f044f8b" exitCode=0 Mar 20 08:46:31 crc kubenswrapper[4903]: I0320 08:46:31.860369 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-25f9-account-create-update-6rrdk" event={"ID":"16277119-789c-4d6e-8965-1ab0080f0871","Type":"ContainerDied","Data":"5ec65c1f6995adbacb8d006dd9be17bea8e020a50d125daca97eaf0e3f044f8b"} Mar 20 08:46:31 crc kubenswrapper[4903]: I0320 08:46:31.889202 4903 generic.go:334] "Generic (PLEG): container finished" podID="292fda17-d4a4-4bee-ba75-d3221d870f63" containerID="3fd97498ef71fcafedc36c00ac0f24cd90f9cbaeb360b1a6cbd7d84155f621c6" exitCode=0 Mar 20 08:46:31 crc kubenswrapper[4903]: I0320 08:46:31.889282 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-fb5f-account-create-update-dztfr" event={"ID":"292fda17-d4a4-4bee-ba75-d3221d870f63","Type":"ContainerDied","Data":"3fd97498ef71fcafedc36c00ac0f24cd90f9cbaeb360b1a6cbd7d84155f621c6"} Mar 20 08:46:31 crc kubenswrapper[4903]: I0320 08:46:31.889532 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-fb5f-account-create-update-dztfr" event={"ID":"292fda17-d4a4-4bee-ba75-d3221d870f63","Type":"ContainerStarted","Data":"403e3e4bcb5b6f140971b71dfa2704da76ad05d14bb26a115b4c19a1469bfb5a"} Mar 20 08:46:31 crc kubenswrapper[4903]: I0320 08:46:31.894935 4903 generic.go:334] "Generic (PLEG): container finished" podID="7e4d3d5e-2374-4583-8193-bcef8b16110e" containerID="5a72ab8d51aaf45e6540e7fe5f23555acdf41834cf991b22ab1ef61e697dffa1" exitCode=0 Mar 20 08:46:31 crc kubenswrapper[4903]: I0320 08:46:31.894998 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-5rbfs" event={"ID":"7e4d3d5e-2374-4583-8193-bcef8b16110e","Type":"ContainerDied","Data":"5a72ab8d51aaf45e6540e7fe5f23555acdf41834cf991b22ab1ef61e697dffa1"} Mar 20 08:46:31 crc kubenswrapper[4903]: I0320 08:46:31.896122 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-5rbfs" event={"ID":"7e4d3d5e-2374-4583-8193-bcef8b16110e","Type":"ContainerStarted","Data":"0294c6a4274390b2b5b26147b74c6953dfcf23c374c62de229958965b0cf62d8"} Mar 20 08:46:31 crc kubenswrapper[4903]: I0320 08:46:31.899566 
4903 generic.go:334] "Generic (PLEG): container finished" podID="cf56b91b-fa18-4fe3-bdb5-9c6c3fb530bd" containerID="ad0a1b89ae2c9dff077b2a87669be8cab9c456620225b30cbba84adc397814f5" exitCode=0 Mar 20 08:46:31 crc kubenswrapper[4903]: I0320 08:46:31.899684 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-e26e-account-create-update-clwzq" event={"ID":"cf56b91b-fa18-4fe3-bdb5-9c6c3fb530bd","Type":"ContainerDied","Data":"ad0a1b89ae2c9dff077b2a87669be8cab9c456620225b30cbba84adc397814f5"} Mar 20 08:46:31 crc kubenswrapper[4903]: I0320 08:46:31.899985 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-e26e-account-create-update-clwzq" event={"ID":"cf56b91b-fa18-4fe3-bdb5-9c6c3fb530bd","Type":"ContainerStarted","Data":"a76e939c656952c3e50826fc57b881e091b189c7c7e5f71d0c51013749d28ac5"} Mar 20 08:46:31 crc kubenswrapper[4903]: I0320 08:46:31.905807 4903 generic.go:334] "Generic (PLEG): container finished" podID="e761f2dc-041e-4811-802b-2a8e8c376381" containerID="952661f87146616b7a9a4cefe62281de6533fdfc897d5538236d336178c5722c" exitCode=143 Mar 20 08:46:31 crc kubenswrapper[4903]: I0320 08:46:31.905885 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 20 08:46:31 crc kubenswrapper[4903]: I0320 08:46:31.905932 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e761f2dc-041e-4811-802b-2a8e8c376381","Type":"ContainerDied","Data":"952661f87146616b7a9a4cefe62281de6533fdfc897d5538236d336178c5722c"} Mar 20 08:46:31 crc kubenswrapper[4903]: I0320 08:46:31.994487 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 20 08:46:32 crc kubenswrapper[4903]: I0320 08:46:32.006679 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 20 08:46:32 crc kubenswrapper[4903]: I0320 08:46:32.029490 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 20 08:46:32 crc kubenswrapper[4903]: I0320 08:46:32.044849 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Mar 20 08:46:32 crc kubenswrapper[4903]: I0320 08:46:32.057079 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Mar 20 08:46:32 crc kubenswrapper[4903]: E0320 08:46:32.057799 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c68a4ea3-7086-430c-8a78-a4c06ed7280c" containerName="kube-state-metrics" Mar 20 08:46:32 crc kubenswrapper[4903]: I0320 08:46:32.057823 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="c68a4ea3-7086-430c-8a78-a4c06ed7280c" containerName="kube-state-metrics" Mar 20 08:46:32 crc kubenswrapper[4903]: I0320 08:46:32.058146 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="c68a4ea3-7086-430c-8a78-a4c06ed7280c" containerName="kube-state-metrics" Mar 20 08:46:32 crc kubenswrapper[4903]: I0320 08:46:32.058999 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 20 08:46:32 crc kubenswrapper[4903]: I0320 08:46:32.061864 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Mar 20 08:46:32 crc kubenswrapper[4903]: I0320 08:46:32.062052 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Mar 20 08:46:32 crc kubenswrapper[4903]: I0320 08:46:32.071986 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 20 08:46:32 crc kubenswrapper[4903]: I0320 08:46:32.089424 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Mar 20 08:46:32 crc kubenswrapper[4903]: I0320 08:46:32.104164 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 20 08:46:32 crc kubenswrapper[4903]: I0320 08:46:32.109605 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Mar 20 08:46:32 crc kubenswrapper[4903]: I0320 08:46:32.109952 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Mar 20 08:46:32 crc kubenswrapper[4903]: I0320 08:46:32.111795 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Mar 20 08:46:32 crc kubenswrapper[4903]: I0320 08:46:32.137900 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 20 08:46:32 crc kubenswrapper[4903]: I0320 08:46:32.208025 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8bb60e5-f963-44ed-9e5e-76ca6da5c723-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"f8bb60e5-f963-44ed-9e5e-76ca6da5c723\") " pod="openstack/kube-state-metrics-0" Mar 20 08:46:32 crc kubenswrapper[4903]: I0320 08:46:32.208128 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50b5adcb-aed8-4cff-b3ec-02721df3937d-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"50b5adcb-aed8-4cff-b3ec-02721df3937d\") " pod="openstack/cinder-api-0" Mar 20 08:46:32 crc kubenswrapper[4903]: I0320 08:46:32.208159 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/50b5adcb-aed8-4cff-b3ec-02721df3937d-config-data-custom\") pod \"cinder-api-0\" (UID: \"50b5adcb-aed8-4cff-b3ec-02721df3937d\") " pod="openstack/cinder-api-0" Mar 20 08:46:32 crc kubenswrapper[4903]: I0320 08:46:32.208217 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/50b5adcb-aed8-4cff-b3ec-02721df3937d-etc-machine-id\") pod \"cinder-api-0\" (UID: \"50b5adcb-aed8-4cff-b3ec-02721df3937d\") " pod="openstack/cinder-api-0" Mar 20 08:46:32 crc kubenswrapper[4903]: I0320 08:46:32.208240 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50b5adcb-aed8-4cff-b3ec-02721df3937d-config-data\") pod \"cinder-api-0\" (UID: \"50b5adcb-aed8-4cff-b3ec-02721df3937d\") " pod="openstack/cinder-api-0" Mar 20 08:46:32 crc kubenswrapper[4903]: I0320 08:46:32.208264 4903 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/50b5adcb-aed8-4cff-b3ec-02721df3937d-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"50b5adcb-aed8-4cff-b3ec-02721df3937d\") " pod="openstack/cinder-api-0" Mar 20 08:46:32 crc kubenswrapper[4903]: I0320 08:46:32.208303 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/f8bb60e5-f963-44ed-9e5e-76ca6da5c723-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"f8bb60e5-f963-44ed-9e5e-76ca6da5c723\") " pod="openstack/kube-state-metrics-0" Mar 20 08:46:32 crc kubenswrapper[4903]: I0320 08:46:32.208353 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-886g9\" (UniqueName: \"kubernetes.io/projected/50b5adcb-aed8-4cff-b3ec-02721df3937d-kube-api-access-886g9\") pod \"cinder-api-0\" (UID: \"50b5adcb-aed8-4cff-b3ec-02721df3937d\") " pod="openstack/cinder-api-0" Mar 20 08:46:32 crc kubenswrapper[4903]: I0320 08:46:32.208383 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/f8bb60e5-f963-44ed-9e5e-76ca6da5c723-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"f8bb60e5-f963-44ed-9e5e-76ca6da5c723\") " pod="openstack/kube-state-metrics-0" Mar 20 08:46:32 crc kubenswrapper[4903]: I0320 08:46:32.208401 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d88b6\" (UniqueName: \"kubernetes.io/projected/f8bb60e5-f963-44ed-9e5e-76ca6da5c723-kube-api-access-d88b6\") pod \"kube-state-metrics-0\" (UID: \"f8bb60e5-f963-44ed-9e5e-76ca6da5c723\") " pod="openstack/kube-state-metrics-0" Mar 20 08:46:32 crc kubenswrapper[4903]: I0320 08:46:32.208437 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/50b5adcb-aed8-4cff-b3ec-02721df3937d-logs\") pod \"cinder-api-0\" (UID: \"50b5adcb-aed8-4cff-b3ec-02721df3937d\") " pod="openstack/cinder-api-0" Mar 20 08:46:32 crc kubenswrapper[4903]: I0320 08:46:32.208489 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/50b5adcb-aed8-4cff-b3ec-02721df3937d-public-tls-certs\") pod \"cinder-api-0\" (UID: \"50b5adcb-aed8-4cff-b3ec-02721df3937d\") " pod="openstack/cinder-api-0" Mar 20 08:46:32 crc kubenswrapper[4903]: I0320 08:46:32.208525 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/50b5adcb-aed8-4cff-b3ec-02721df3937d-scripts\") pod \"cinder-api-0\" (UID: \"50b5adcb-aed8-4cff-b3ec-02721df3937d\") " pod="openstack/cinder-api-0" Mar 20 08:46:32 crc kubenswrapper[4903]: I0320 08:46:32.311387 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d88b6\" (UniqueName: \"kubernetes.io/projected/f8bb60e5-f963-44ed-9e5e-76ca6da5c723-kube-api-access-d88b6\") pod \"kube-state-metrics-0\" (UID: \"f8bb60e5-f963-44ed-9e5e-76ca6da5c723\") " pod="openstack/kube-state-metrics-0" Mar 20 08:46:32 crc kubenswrapper[4903]: I0320 08:46:32.311451 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/f8bb60e5-f963-44ed-9e5e-76ca6da5c723-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"f8bb60e5-f963-44ed-9e5e-76ca6da5c723\") " pod="openstack/kube-state-metrics-0" Mar 20 08:46:32 crc kubenswrapper[4903]: I0320 08:46:32.311517 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/50b5adcb-aed8-4cff-b3ec-02721df3937d-logs\") pod \"cinder-api-0\" (UID: \"50b5adcb-aed8-4cff-b3ec-02721df3937d\") " pod="openstack/cinder-api-0" Mar 20 08:46:32 crc kubenswrapper[4903]: I0320 08:46:32.311620 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/50b5adcb-aed8-4cff-b3ec-02721df3937d-public-tls-certs\") pod \"cinder-api-0\" (UID: \"50b5adcb-aed8-4cff-b3ec-02721df3937d\") " pod="openstack/cinder-api-0" Mar 20 08:46:32 crc kubenswrapper[4903]: I0320 08:46:32.311667 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/50b5adcb-aed8-4cff-b3ec-02721df3937d-scripts\") pod \"cinder-api-0\" (UID: \"50b5adcb-aed8-4cff-b3ec-02721df3937d\") " pod="openstack/cinder-api-0" Mar 20 08:46:32 crc kubenswrapper[4903]: I0320 08:46:32.311724 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8bb60e5-f963-44ed-9e5e-76ca6da5c723-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"f8bb60e5-f963-44ed-9e5e-76ca6da5c723\") " pod="openstack/kube-state-metrics-0" Mar 20 08:46:32 crc kubenswrapper[4903]: I0320 08:46:32.311754 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50b5adcb-aed8-4cff-b3ec-02721df3937d-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"50b5adcb-aed8-4cff-b3ec-02721df3937d\") " pod="openstack/cinder-api-0" Mar 20 08:46:32 crc kubenswrapper[4903]: I0320 08:46:32.311786 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/50b5adcb-aed8-4cff-b3ec-02721df3937d-config-data-custom\") pod \"cinder-api-0\" (UID: \"50b5adcb-aed8-4cff-b3ec-02721df3937d\") " pod="openstack/cinder-api-0" Mar 20 08:46:32 crc kubenswrapper[4903]: I0320 08:46:32.311814 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/50b5adcb-aed8-4cff-b3ec-02721df3937d-etc-machine-id\") pod \"cinder-api-0\" (UID: \"50b5adcb-aed8-4cff-b3ec-02721df3937d\") " pod="openstack/cinder-api-0" Mar 20 08:46:32 crc kubenswrapper[4903]: I0320 08:46:32.311843 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50b5adcb-aed8-4cff-b3ec-02721df3937d-config-data\") pod \"cinder-api-0\" (UID: \"50b5adcb-aed8-4cff-b3ec-02721df3937d\") " pod="openstack/cinder-api-0" Mar 20 08:46:32 crc kubenswrapper[4903]: I0320 08:46:32.311889 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/50b5adcb-aed8-4cff-b3ec-02721df3937d-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"50b5adcb-aed8-4cff-b3ec-02721df3937d\") " pod="openstack/cinder-api-0" Mar 20 08:46:32 crc kubenswrapper[4903]: I0320 08:46:32.311936 4903 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/f8bb60e5-f963-44ed-9e5e-76ca6da5c723-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"f8bb60e5-f963-44ed-9e5e-76ca6da5c723\") " pod="openstack/kube-state-metrics-0" Mar 20 08:46:32 crc kubenswrapper[4903]: I0320 08:46:32.311991 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-886g9\" (UniqueName: \"kubernetes.io/projected/50b5adcb-aed8-4cff-b3ec-02721df3937d-kube-api-access-886g9\") pod \"cinder-api-0\" (UID: \"50b5adcb-aed8-4cff-b3ec-02721df3937d\") " pod="openstack/cinder-api-0" Mar 20 08:46:32 crc kubenswrapper[4903]: I0320 08:46:32.312160 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/50b5adcb-aed8-4cff-b3ec-02721df3937d-logs\") pod \"cinder-api-0\" (UID: \"50b5adcb-aed8-4cff-b3ec-02721df3937d\") " pod="openstack/cinder-api-0" Mar 20 08:46:32 crc kubenswrapper[4903]: I0320 08:46:32.312903 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/50b5adcb-aed8-4cff-b3ec-02721df3937d-etc-machine-id\") pod \"cinder-api-0\" (UID: \"50b5adcb-aed8-4cff-b3ec-02721df3937d\") " pod="openstack/cinder-api-0" Mar 20 08:46:32 crc kubenswrapper[4903]: I0320 08:46:32.320938 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50b5adcb-aed8-4cff-b3ec-02721df3937d-config-data\") pod \"cinder-api-0\" (UID: \"50b5adcb-aed8-4cff-b3ec-02721df3937d\") " pod="openstack/cinder-api-0" Mar 20 08:46:32 crc kubenswrapper[4903]: I0320 08:46:32.321978 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/50b5adcb-aed8-4cff-b3ec-02721df3937d-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"50b5adcb-aed8-4cff-b3ec-02721df3937d\") " pod="openstack/cinder-api-0" Mar 20 08:46:32 crc kubenswrapper[4903]: I0320 08:46:32.321989 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/50b5adcb-aed8-4cff-b3ec-02721df3937d-config-data-custom\") pod \"cinder-api-0\" (UID: \"50b5adcb-aed8-4cff-b3ec-02721df3937d\") " pod="openstack/cinder-api-0" Mar 20 08:46:32 crc kubenswrapper[4903]: I0320 08:46:32.322136 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/f8bb60e5-f963-44ed-9e5e-76ca6da5c723-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"f8bb60e5-f963-44ed-9e5e-76ca6da5c723\") " pod="openstack/kube-state-metrics-0" Mar 20 08:46:32 crc kubenswrapper[4903]: I0320 08:46:32.322154 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8bb60e5-f963-44ed-9e5e-76ca6da5c723-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"f8bb60e5-f963-44ed-9e5e-76ca6da5c723\") " pod="openstack/kube-state-metrics-0" Mar 20 08:46:32 crc kubenswrapper[4903]: I0320 08:46:32.322535 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/50b5adcb-aed8-4cff-b3ec-02721df3937d-scripts\") pod \"cinder-api-0\" (UID: \"50b5adcb-aed8-4cff-b3ec-02721df3937d\") " pod="openstack/cinder-api-0" Mar 20 08:46:32 crc kubenswrapper[4903]: I0320 
08:46:32.323180 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/f8bb60e5-f963-44ed-9e5e-76ca6da5c723-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"f8bb60e5-f963-44ed-9e5e-76ca6da5c723\") " pod="openstack/kube-state-metrics-0" Mar 20 08:46:32 crc kubenswrapper[4903]: I0320 08:46:32.323816 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50b5adcb-aed8-4cff-b3ec-02721df3937d-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"50b5adcb-aed8-4cff-b3ec-02721df3937d\") " pod="openstack/cinder-api-0" Mar 20 08:46:32 crc kubenswrapper[4903]: I0320 08:46:32.331676 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/50b5adcb-aed8-4cff-b3ec-02721df3937d-public-tls-certs\") pod \"cinder-api-0\" (UID: \"50b5adcb-aed8-4cff-b3ec-02721df3937d\") " pod="openstack/cinder-api-0" Mar 20 08:46:32 crc kubenswrapper[4903]: I0320 08:46:32.334333 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-886g9\" (UniqueName: \"kubernetes.io/projected/50b5adcb-aed8-4cff-b3ec-02721df3937d-kube-api-access-886g9\") pod \"cinder-api-0\" (UID: \"50b5adcb-aed8-4cff-b3ec-02721df3937d\") " pod="openstack/cinder-api-0" Mar 20 08:46:32 crc kubenswrapper[4903]: I0320 08:46:32.349201 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d88b6\" (UniqueName: \"kubernetes.io/projected/f8bb60e5-f963-44ed-9e5e-76ca6da5c723-kube-api-access-d88b6\") pod \"kube-state-metrics-0\" (UID: \"f8bb60e5-f963-44ed-9e5e-76ca6da5c723\") " pod="openstack/kube-state-metrics-0" Mar 20 08:46:32 crc kubenswrapper[4903]: I0320 08:46:32.387431 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 20 08:46:32 crc kubenswrapper[4903]: I0320 08:46:32.444690 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 20 08:46:32 crc kubenswrapper[4903]: I0320 08:46:32.487320 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 08:46:32 crc kubenswrapper[4903]: I0320 08:46:32.488179 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="83dc9a0f-80b3-4df8-9b1b-f233484cb285" containerName="glance-log" containerID="cri-o://3b5d10559c5d44263b5a6e122434a5c7503ceaee8c0fcc56d0ab6b2eb14c36dd" gracePeriod=30 Mar 20 08:46:32 crc kubenswrapper[4903]: I0320 08:46:32.488945 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="83dc9a0f-80b3-4df8-9b1b-f233484cb285" containerName="glance-httpd" containerID="cri-o://40b4b5aa3e9d277b5d7629a27ef27b096fa2b070096c494efac9d95b45d321d0" gracePeriod=30 Mar 20 08:46:32 crc kubenswrapper[4903]: I0320 08:46:32.917620 4903 generic.go:334] "Generic (PLEG): container finished" podID="83dc9a0f-80b3-4df8-9b1b-f233484cb285" containerID="3b5d10559c5d44263b5a6e122434a5c7503ceaee8c0fcc56d0ab6b2eb14c36dd" exitCode=143 Mar 20 08:46:32 crc kubenswrapper[4903]: I0320 08:46:32.917716 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"83dc9a0f-80b3-4df8-9b1b-f233484cb285","Type":"ContainerDied","Data":"3b5d10559c5d44263b5a6e122434a5c7503ceaee8c0fcc56d0ab6b2eb14c36dd"} Mar 20 08:46:32 crc kubenswrapper[4903]: I0320 08:46:32.921248 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4b11fccf-eea3-450d-b460-013086becb0c","Type":"ContainerStarted","Data":"f2bf3db3a69c408a2742f89249166339c6ac9359d46b3b581afd7322538aed8a"} Mar 20 08:46:32 crc kubenswrapper[4903]: I0320 08:46:32.963172 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 20 08:46:32 crc kubenswrapper[4903]: W0320 08:46:32.981472 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf8bb60e5_f963_44ed_9e5e_76ca6da5c723.slice/crio-67121147e76913a52a7a4bc86daef660e583d5da1143d57f716b842a0e33a1d9 WatchSource:0}: Error finding container 67121147e76913a52a7a4bc86daef660e583d5da1143d57f716b842a0e33a1d9: Status 404 returned error can't find the container with id 67121147e76913a52a7a4bc86daef660e583d5da1143d57f716b842a0e33a1d9 Mar 20 08:46:33 crc kubenswrapper[4903]: I0320 08:46:33.068270 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 20 08:46:33 crc kubenswrapper[4903]: I0320 08:46:33.312674 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-mdxf5" Mar 20 08:46:33 crc kubenswrapper[4903]: I0320 08:46:33.436404 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kbmjk\" (UniqueName: \"kubernetes.io/projected/c6fea6b7-8fd2-42c4-8016-334f6f69c22e-kube-api-access-kbmjk\") pod \"c6fea6b7-8fd2-42c4-8016-334f6f69c22e\" (UID: \"c6fea6b7-8fd2-42c4-8016-334f6f69c22e\") " Mar 20 08:46:33 crc kubenswrapper[4903]: I0320 08:46:33.436495 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c6fea6b7-8fd2-42c4-8016-334f6f69c22e-operator-scripts\") pod \"c6fea6b7-8fd2-42c4-8016-334f6f69c22e\" (UID: \"c6fea6b7-8fd2-42c4-8016-334f6f69c22e\") " Mar 20 08:46:33 crc kubenswrapper[4903]: I0320 08:46:33.437704 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c6fea6b7-8fd2-42c4-8016-334f6f69c22e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c6fea6b7-8fd2-42c4-8016-334f6f69c22e" (UID: "c6fea6b7-8fd2-42c4-8016-334f6f69c22e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:46:33 crc kubenswrapper[4903]: I0320 08:46:33.445223 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c6fea6b7-8fd2-42c4-8016-334f6f69c22e-kube-api-access-kbmjk" (OuterVolumeSpecName: "kube-api-access-kbmjk") pod "c6fea6b7-8fd2-42c4-8016-334f6f69c22e" (UID: "c6fea6b7-8fd2-42c4-8016-334f6f69c22e"). InnerVolumeSpecName "kube-api-access-kbmjk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:46:33 crc kubenswrapper[4903]: I0320 08:46:33.510511 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="482e4efe-5d05-4aa5-b77c-a366193c4b50" path="/var/lib/kubelet/pods/482e4efe-5d05-4aa5-b77c-a366193c4b50/volumes" Mar 20 08:46:33 crc kubenswrapper[4903]: I0320 08:46:33.511386 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c68a4ea3-7086-430c-8a78-a4c06ed7280c" path="/var/lib/kubelet/pods/c68a4ea3-7086-430c-8a78-a4c06ed7280c/volumes" Mar 20 08:46:33 crc kubenswrapper[4903]: I0320 08:46:33.550147 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kbmjk\" (UniqueName: \"kubernetes.io/projected/c6fea6b7-8fd2-42c4-8016-334f6f69c22e-kube-api-access-kbmjk\") on node \"crc\" DevicePath \"\"" Mar 20 08:46:33 crc kubenswrapper[4903]: I0320 08:46:33.550190 4903 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c6fea6b7-8fd2-42c4-8016-334f6f69c22e-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 08:46:33 crc kubenswrapper[4903]: I0320 08:46:33.740798 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-25f9-account-create-update-6rrdk" Mar 20 08:46:33 crc kubenswrapper[4903]: I0320 08:46:33.747749 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-e26e-account-create-update-clwzq" Mar 20 08:46:33 crc kubenswrapper[4903]: I0320 08:46:33.747871 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-mt4zt" Mar 20 08:46:33 crc kubenswrapper[4903]: I0320 08:46:33.763006 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-fb5f-account-create-update-dztfr" Mar 20 08:46:33 crc kubenswrapper[4903]: I0320 08:46:33.836602 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-5rbfs" Mar 20 08:46:33 crc kubenswrapper[4903]: I0320 08:46:33.862024 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/292fda17-d4a4-4bee-ba75-d3221d870f63-operator-scripts\") pod \"292fda17-d4a4-4bee-ba75-d3221d870f63\" (UID: \"292fda17-d4a4-4bee-ba75-d3221d870f63\") " Mar 20 08:46:33 crc kubenswrapper[4903]: I0320 08:46:33.862247 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-55htc\" (UniqueName: \"kubernetes.io/projected/cf56b91b-fa18-4fe3-bdb5-9c6c3fb530bd-kube-api-access-55htc\") pod \"cf56b91b-fa18-4fe3-bdb5-9c6c3fb530bd\" (UID: \"cf56b91b-fa18-4fe3-bdb5-9c6c3fb530bd\") " Mar 20 08:46:33 crc kubenswrapper[4903]: I0320 08:46:33.862481 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/16277119-789c-4d6e-8965-1ab0080f0871-operator-scripts\") pod \"16277119-789c-4d6e-8965-1ab0080f0871\" (UID: \"16277119-789c-4d6e-8965-1ab0080f0871\") " Mar 20 08:46:33 crc kubenswrapper[4903]: I0320 08:46:33.862534 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e54fecbe-7055-446e-989b-eddbbbfe55a6-operator-scripts\") pod \"e54fecbe-7055-446e-989b-eddbbbfe55a6\" (UID: \"e54fecbe-7055-446e-989b-eddbbbfe55a6\") " Mar 20 08:46:33 crc kubenswrapper[4903]: I0320 08:46:33.862602 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cf56b91b-fa18-4fe3-bdb5-9c6c3fb530bd-operator-scripts\") pod \"cf56b91b-fa18-4fe3-bdb5-9c6c3fb530bd\" (UID: \"cf56b91b-fa18-4fe3-bdb5-9c6c3fb530bd\") " Mar 20 08:46:33 crc kubenswrapper[4903]: I0320 08:46:33.862666 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2nn4s\" (UniqueName: \"kubernetes.io/projected/e54fecbe-7055-446e-989b-eddbbbfe55a6-kube-api-access-2nn4s\") pod \"e54fecbe-7055-446e-989b-eddbbbfe55a6\" (UID: \"e54fecbe-7055-446e-989b-eddbbbfe55a6\") " Mar 20 08:46:33 crc kubenswrapper[4903]: I0320 08:46:33.862698 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-75jnr\" (UniqueName: \"kubernetes.io/projected/16277119-789c-4d6e-8965-1ab0080f0871-kube-api-access-75jnr\") pod \"16277119-789c-4d6e-8965-1ab0080f0871\" (UID: \"16277119-789c-4d6e-8965-1ab0080f0871\") " Mar 20 08:46:33 crc kubenswrapper[4903]: I0320 08:46:33.862756 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v4t8v\" (UniqueName: \"kubernetes.io/projected/292fda17-d4a4-4bee-ba75-d3221d870f63-kube-api-access-v4t8v\") pod \"292fda17-d4a4-4bee-ba75-d3221d870f63\" (UID: \"292fda17-d4a4-4bee-ba75-d3221d870f63\") " Mar 20 08:46:33 crc kubenswrapper[4903]: I0320 08:46:33.863144 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/292fda17-d4a4-4bee-ba75-d3221d870f63-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "292fda17-d4a4-4bee-ba75-d3221d870f63" (UID: "292fda17-d4a4-4bee-ba75-d3221d870f63"). 
InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:46:33 crc kubenswrapper[4903]: I0320 08:46:33.863916 4903 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/292fda17-d4a4-4bee-ba75-d3221d870f63-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 08:46:33 crc kubenswrapper[4903]: I0320 08:46:33.864456 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e54fecbe-7055-446e-989b-eddbbbfe55a6-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e54fecbe-7055-446e-989b-eddbbbfe55a6" (UID: "e54fecbe-7055-446e-989b-eddbbbfe55a6"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:46:33 crc kubenswrapper[4903]: I0320 08:46:33.865117 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/16277119-789c-4d6e-8965-1ab0080f0871-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "16277119-789c-4d6e-8965-1ab0080f0871" (UID: "16277119-789c-4d6e-8965-1ab0080f0871"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:46:33 crc kubenswrapper[4903]: I0320 08:46:33.865395 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf56b91b-fa18-4fe3-bdb5-9c6c3fb530bd-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "cf56b91b-fa18-4fe3-bdb5-9c6c3fb530bd" (UID: "cf56b91b-fa18-4fe3-bdb5-9c6c3fb530bd"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:46:33 crc kubenswrapper[4903]: I0320 08:46:33.873670 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16277119-789c-4d6e-8965-1ab0080f0871-kube-api-access-75jnr" (OuterVolumeSpecName: "kube-api-access-75jnr") pod "16277119-789c-4d6e-8965-1ab0080f0871" (UID: "16277119-789c-4d6e-8965-1ab0080f0871"). InnerVolumeSpecName "kube-api-access-75jnr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:46:33 crc kubenswrapper[4903]: I0320 08:46:33.873722 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/292fda17-d4a4-4bee-ba75-d3221d870f63-kube-api-access-v4t8v" (OuterVolumeSpecName: "kube-api-access-v4t8v") pod "292fda17-d4a4-4bee-ba75-d3221d870f63" (UID: "292fda17-d4a4-4bee-ba75-d3221d870f63"). InnerVolumeSpecName "kube-api-access-v4t8v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:46:33 crc kubenswrapper[4903]: I0320 08:46:33.875913 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf56b91b-fa18-4fe3-bdb5-9c6c3fb530bd-kube-api-access-55htc" (OuterVolumeSpecName: "kube-api-access-55htc") pod "cf56b91b-fa18-4fe3-bdb5-9c6c3fb530bd" (UID: "cf56b91b-fa18-4fe3-bdb5-9c6c3fb530bd"). InnerVolumeSpecName "kube-api-access-55htc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:46:33 crc kubenswrapper[4903]: I0320 08:46:33.876023 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e54fecbe-7055-446e-989b-eddbbbfe55a6-kube-api-access-2nn4s" (OuterVolumeSpecName: "kube-api-access-2nn4s") pod "e54fecbe-7055-446e-989b-eddbbbfe55a6" (UID: "e54fecbe-7055-446e-989b-eddbbbfe55a6"). InnerVolumeSpecName "kube-api-access-2nn4s". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:46:33 crc kubenswrapper[4903]: I0320 08:46:33.944563 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-25f9-account-create-update-6rrdk" event={"ID":"16277119-789c-4d6e-8965-1ab0080f0871","Type":"ContainerDied","Data":"5d3c082e0ee4fa1e68e35e162e272acf2360ec30cd4bad3e11d6d820693a6488"} Mar 20 08:46:33 crc kubenswrapper[4903]: I0320 08:46:33.944613 4903 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5d3c082e0ee4fa1e68e35e162e272acf2360ec30cd4bad3e11d6d820693a6488" Mar 20 08:46:33 crc kubenswrapper[4903]: I0320 08:46:33.944683 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-25f9-account-create-update-6rrdk" Mar 20 08:46:33 crc kubenswrapper[4903]: I0320 08:46:33.955731 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-fb5f-account-create-update-dztfr" Mar 20 08:46:33 crc kubenswrapper[4903]: I0320 08:46:33.955891 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-fb5f-account-create-update-dztfr" event={"ID":"292fda17-d4a4-4bee-ba75-d3221d870f63","Type":"ContainerDied","Data":"403e3e4bcb5b6f140971b71dfa2704da76ad05d14bb26a115b4c19a1469bfb5a"} Mar 20 08:46:33 crc kubenswrapper[4903]: I0320 08:46:33.955939 4903 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="403e3e4bcb5b6f140971b71dfa2704da76ad05d14bb26a115b4c19a1469bfb5a" Mar 20 08:46:33 crc kubenswrapper[4903]: I0320 08:46:33.957754 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"50b5adcb-aed8-4cff-b3ec-02721df3937d","Type":"ContainerStarted","Data":"8b2af6fd3e4c816e215b1b167a8260cf0e8e49daa50f275b197c42f365dee587"} Mar 20 08:46:33 crc kubenswrapper[4903]: I0320 08:46:33.961522 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-mt4zt" event={"ID":"e54fecbe-7055-446e-989b-eddbbbfe55a6","Type":"ContainerDied","Data":"4e1201db9928b6f2fbd13d4cd2792b658ff4628523dce3b0c862883d79ce0a17"} Mar 20 08:46:33 crc kubenswrapper[4903]: I0320 08:46:33.961561 4903 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4e1201db9928b6f2fbd13d4cd2792b658ff4628523dce3b0c862883d79ce0a17" Mar 20 08:46:33 crc kubenswrapper[4903]: I0320 08:46:33.961648 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-mt4zt" Mar 20 08:46:33 crc kubenswrapper[4903]: I0320 08:46:33.964995 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7e4d3d5e-2374-4583-8193-bcef8b16110e-operator-scripts\") pod \"7e4d3d5e-2374-4583-8193-bcef8b16110e\" (UID: \"7e4d3d5e-2374-4583-8193-bcef8b16110e\") " Mar 20 08:46:33 crc kubenswrapper[4903]: I0320 08:46:33.965070 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rvn7g\" (UniqueName: \"kubernetes.io/projected/7e4d3d5e-2374-4583-8193-bcef8b16110e-kube-api-access-rvn7g\") pod \"7e4d3d5e-2374-4583-8193-bcef8b16110e\" (UID: \"7e4d3d5e-2374-4583-8193-bcef8b16110e\") " Mar 20 08:46:33 crc kubenswrapper[4903]: I0320 08:46:33.965475 4903 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/16277119-789c-4d6e-8965-1ab0080f0871-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 08:46:33 crc kubenswrapper[4903]: I0320 08:46:33.965495 4903 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e54fecbe-7055-446e-989b-eddbbbfe55a6-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 08:46:33 crc kubenswrapper[4903]: I0320 08:46:33.965507 4903 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cf56b91b-fa18-4fe3-bdb5-9c6c3fb530bd-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 08:46:33 crc kubenswrapper[4903]: I0320 08:46:33.965523 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2nn4s\" (UniqueName: \"kubernetes.io/projected/e54fecbe-7055-446e-989b-eddbbbfe55a6-kube-api-access-2nn4s\") on node \"crc\" DevicePath \"\"" Mar 20 08:46:33 crc kubenswrapper[4903]: I0320 08:46:33.965534 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-75jnr\" (UniqueName: \"kubernetes.io/projected/16277119-789c-4d6e-8965-1ab0080f0871-kube-api-access-75jnr\") on node \"crc\" DevicePath \"\"" Mar 20 08:46:33 crc kubenswrapper[4903]: I0320 08:46:33.965544 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v4t8v\" (UniqueName: \"kubernetes.io/projected/292fda17-d4a4-4bee-ba75-d3221d870f63-kube-api-access-v4t8v\") on node \"crc\" DevicePath \"\"" Mar 20 08:46:33 crc kubenswrapper[4903]: I0320 08:46:33.965553 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-55htc\" (UniqueName: \"kubernetes.io/projected/cf56b91b-fa18-4fe3-bdb5-9c6c3fb530bd-kube-api-access-55htc\") on node \"crc\" DevicePath \"\"" Mar 20 08:46:33 crc kubenswrapper[4903]: I0320 08:46:33.966263 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e4d3d5e-2374-4583-8193-bcef8b16110e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7e4d3d5e-2374-4583-8193-bcef8b16110e" (UID: "7e4d3d5e-2374-4583-8193-bcef8b16110e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:46:33 crc kubenswrapper[4903]: I0320 08:46:33.968446 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e4d3d5e-2374-4583-8193-bcef8b16110e-kube-api-access-rvn7g" (OuterVolumeSpecName: "kube-api-access-rvn7g") pod "7e4d3d5e-2374-4583-8193-bcef8b16110e" (UID: "7e4d3d5e-2374-4583-8193-bcef8b16110e"). 
InnerVolumeSpecName "kube-api-access-rvn7g". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:46:33 crc kubenswrapper[4903]: I0320 08:46:33.976756 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-mdxf5" Mar 20 08:46:33 crc kubenswrapper[4903]: I0320 08:46:33.976912 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-mdxf5" event={"ID":"c6fea6b7-8fd2-42c4-8016-334f6f69c22e","Type":"ContainerDied","Data":"a756e85ccca4487ea67d662c0fac98fc112588724a6887909dd239fb77047f90"} Mar 20 08:46:33 crc kubenswrapper[4903]: I0320 08:46:33.976942 4903 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a756e85ccca4487ea67d662c0fac98fc112588724a6887909dd239fb77047f90" Mar 20 08:46:33 crc kubenswrapper[4903]: I0320 08:46:33.982784 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-5rbfs" Mar 20 08:46:33 crc kubenswrapper[4903]: I0320 08:46:33.982818 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-5rbfs" event={"ID":"7e4d3d5e-2374-4583-8193-bcef8b16110e","Type":"ContainerDied","Data":"0294c6a4274390b2b5b26147b74c6953dfcf23c374c62de229958965b0cf62d8"} Mar 20 08:46:33 crc kubenswrapper[4903]: I0320 08:46:33.982851 4903 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0294c6a4274390b2b5b26147b74c6953dfcf23c374c62de229958965b0cf62d8" Mar 20 08:46:33 crc kubenswrapper[4903]: I0320 08:46:33.986756 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-e26e-account-create-update-clwzq" event={"ID":"cf56b91b-fa18-4fe3-bdb5-9c6c3fb530bd","Type":"ContainerDied","Data":"a76e939c656952c3e50826fc57b881e091b189c7c7e5f71d0c51013749d28ac5"} Mar 20 08:46:33 crc kubenswrapper[4903]: I0320 08:46:33.986799 4903 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a76e939c656952c3e50826fc57b881e091b189c7c7e5f71d0c51013749d28ac5" Mar 20 08:46:33 crc kubenswrapper[4903]: I0320 08:46:33.986863 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-e26e-account-create-update-clwzq" Mar 20 08:46:33 crc kubenswrapper[4903]: I0320 08:46:33.993118 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4b11fccf-eea3-450d-b460-013086becb0c","Type":"ContainerStarted","Data":"8232a6bcbff9ffc96024d73c35e8fd73728b55a64cc9fd21c14192f3307f2623"} Mar 20 08:46:33 crc kubenswrapper[4903]: I0320 08:46:33.995025 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"f8bb60e5-f963-44ed-9e5e-76ca6da5c723","Type":"ContainerStarted","Data":"67121147e76913a52a7a4bc86daef660e583d5da1143d57f716b842a0e33a1d9"} Mar 20 08:46:34 crc kubenswrapper[4903]: I0320 08:46:34.067541 4903 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7e4d3d5e-2374-4583-8193-bcef8b16110e-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 08:46:34 crc kubenswrapper[4903]: I0320 08:46:34.067585 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rvn7g\" (UniqueName: \"kubernetes.io/projected/7e4d3d5e-2374-4583-8193-bcef8b16110e-kube-api-access-rvn7g\") on node \"crc\" DevicePath \"\"" Mar 20 08:46:35 crc kubenswrapper[4903]: I0320 08:46:35.037701 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"50b5adcb-aed8-4cff-b3ec-02721df3937d","Type":"ContainerStarted","Data":"bbc28588129fed5e832d9cf2c208bd4c746332410777ee79ad509494e640c235"} Mar 20 08:46:35 crc kubenswrapper[4903]: I0320 08:46:35.039223 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 20 08:46:35 crc kubenswrapper[4903]: I0320 08:46:35.041602 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"f8bb60e5-f963-44ed-9e5e-76ca6da5c723","Type":"ContainerStarted","Data":"99220c34b143816ada3efc7f99d463db4e8819e68b51ceb37233e0bf8c04a8fa"} Mar 20 08:46:35 crc kubenswrapper[4903]: I0320 08:46:35.042452 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Mar 20 08:46:35 crc kubenswrapper[4903]: I0320 08:46:35.048874 4903 generic.go:334] "Generic (PLEG): container finished" podID="e761f2dc-041e-4811-802b-2a8e8c376381" containerID="6ca326a90fd9f5fc37a08f1a4417347cc25993cffc8749772e41d2e19d02fca5" exitCode=0 Mar 20 08:46:35 crc kubenswrapper[4903]: I0320 08:46:35.048935 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e761f2dc-041e-4811-802b-2a8e8c376381","Type":"ContainerDied","Data":"6ca326a90fd9f5fc37a08f1a4417347cc25993cffc8749772e41d2e19d02fca5"} Mar 20 08:46:35 crc kubenswrapper[4903]: I0320 08:46:35.048973 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e761f2dc-041e-4811-802b-2a8e8c376381","Type":"ContainerDied","Data":"a533a7a79244057873761921a865688d4492459313f2c7f8338ba32d85542bff"} Mar 20 08:46:35 crc kubenswrapper[4903]: I0320 08:46:35.048990 4903 scope.go:117] "RemoveContainer" containerID="6ca326a90fd9f5fc37a08f1a4417347cc25993cffc8749772e41d2e19d02fca5" Mar 20 08:46:35 crc kubenswrapper[4903]: I0320 08:46:35.049158 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 20 08:46:35 crc kubenswrapper[4903]: I0320 08:46:35.115831 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=3.540912866 podStartE2EDuration="4.115808115s" podCreationTimestamp="2026-03-20 08:46:31 +0000 UTC" firstStartedPulling="2026-03-20 08:46:32.983272976 +0000 UTC m=+1418.200173291" lastFinishedPulling="2026-03-20 08:46:33.558168235 +0000 UTC m=+1418.775068540" observedRunningTime="2026-03-20 08:46:35.115790404 +0000 UTC m=+1420.332690729" watchObservedRunningTime="2026-03-20 08:46:35.115808115 +0000 UTC m=+1420.332708430" Mar 20 08:46:35 crc kubenswrapper[4903]: I0320 08:46:35.121202 4903 scope.go:117] "RemoveContainer" containerID="952661f87146616b7a9a4cefe62281de6533fdfc897d5538236d336178c5722c" Mar 20 08:46:35 crc kubenswrapper[4903]: I0320 08:46:35.203482 4903 scope.go:117] "RemoveContainer" containerID="6ca326a90fd9f5fc37a08f1a4417347cc25993cffc8749772e41d2e19d02fca5" Mar 20 08:46:35 crc kubenswrapper[4903]: E0320 08:46:35.206314 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6ca326a90fd9f5fc37a08f1a4417347cc25993cffc8749772e41d2e19d02fca5\": container with ID starting with 6ca326a90fd9f5fc37a08f1a4417347cc25993cffc8749772e41d2e19d02fca5 not found: ID does not exist" containerID="6ca326a90fd9f5fc37a08f1a4417347cc25993cffc8749772e41d2e19d02fca5" Mar 20 08:46:35 crc kubenswrapper[4903]: I0320 08:46:35.206400 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ca326a90fd9f5fc37a08f1a4417347cc25993cffc8749772e41d2e19d02fca5"} err="failed to get container status \"6ca326a90fd9f5fc37a08f1a4417347cc25993cffc8749772e41d2e19d02fca5\": rpc error: code = NotFound desc = could not find container \"6ca326a90fd9f5fc37a08f1a4417347cc25993cffc8749772e41d2e19d02fca5\": container with ID starting with 6ca326a90fd9f5fc37a08f1a4417347cc25993cffc8749772e41d2e19d02fca5 not found: ID does not exist" Mar 20 08:46:35 crc kubenswrapper[4903]: I0320 08:46:35.206446 4903 scope.go:117] "RemoveContainer" containerID="952661f87146616b7a9a4cefe62281de6533fdfc897d5538236d336178c5722c" Mar 20 08:46:35 crc kubenswrapper[4903]: E0320 08:46:35.210465 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"952661f87146616b7a9a4cefe62281de6533fdfc897d5538236d336178c5722c\": container with ID starting with 952661f87146616b7a9a4cefe62281de6533fdfc897d5538236d336178c5722c not found: ID does not exist" containerID="952661f87146616b7a9a4cefe62281de6533fdfc897d5538236d336178c5722c" Mar 20 08:46:35 crc kubenswrapper[4903]: I0320 08:46:35.210505 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"952661f87146616b7a9a4cefe62281de6533fdfc897d5538236d336178c5722c"} err="failed to get container status \"952661f87146616b7a9a4cefe62281de6533fdfc897d5538236d336178c5722c\": rpc error: code = NotFound desc = could not find container \"952661f87146616b7a9a4cefe62281de6533fdfc897d5538236d336178c5722c\": container with ID starting with 952661f87146616b7a9a4cefe62281de6533fdfc897d5538236d336178c5722c not found: ID does not exist" Mar 20 08:46:35 crc kubenswrapper[4903]: I0320 08:46:35.211444 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/e761f2dc-041e-4811-802b-2a8e8c376381-scripts\") pod \"e761f2dc-041e-4811-802b-2a8e8c376381\" (UID: \"e761f2dc-041e-4811-802b-2a8e8c376381\") " Mar 20 08:46:35 crc kubenswrapper[4903]: I0320 08:46:35.211582 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e761f2dc-041e-4811-802b-2a8e8c376381-public-tls-certs\") pod \"e761f2dc-041e-4811-802b-2a8e8c376381\" (UID: \"e761f2dc-041e-4811-802b-2a8e8c376381\") " Mar 20 08:46:35 crc kubenswrapper[4903]: I0320 08:46:35.211655 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4fktf\" (UniqueName: \"kubernetes.io/projected/e761f2dc-041e-4811-802b-2a8e8c376381-kube-api-access-4fktf\") pod \"e761f2dc-041e-4811-802b-2a8e8c376381\" (UID: \"e761f2dc-041e-4811-802b-2a8e8c376381\") " Mar 20 08:46:35 crc kubenswrapper[4903]: I0320 08:46:35.211736 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"e761f2dc-041e-4811-802b-2a8e8c376381\" (UID: \"e761f2dc-041e-4811-802b-2a8e8c376381\") " Mar 20 08:46:35 crc kubenswrapper[4903]: I0320 08:46:35.211800 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e761f2dc-041e-4811-802b-2a8e8c376381-combined-ca-bundle\") pod \"e761f2dc-041e-4811-802b-2a8e8c376381\" (UID: \"e761f2dc-041e-4811-802b-2a8e8c376381\") " Mar 20 08:46:35 crc kubenswrapper[4903]: I0320 08:46:35.211833 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e761f2dc-041e-4811-802b-2a8e8c376381-config-data\") pod \"e761f2dc-041e-4811-802b-2a8e8c376381\" (UID: \"e761f2dc-041e-4811-802b-2a8e8c376381\") " Mar 20 08:46:35 crc kubenswrapper[4903]: I0320 08:46:35.211893 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e761f2dc-041e-4811-802b-2a8e8c376381-httpd-run\") pod \"e761f2dc-041e-4811-802b-2a8e8c376381\" (UID: \"e761f2dc-041e-4811-802b-2a8e8c376381\") " Mar 20 08:46:35 crc kubenswrapper[4903]: I0320 08:46:35.211928 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e761f2dc-041e-4811-802b-2a8e8c376381-logs\") pod \"e761f2dc-041e-4811-802b-2a8e8c376381\" (UID: \"e761f2dc-041e-4811-802b-2a8e8c376381\") " Mar 20 08:46:35 crc kubenswrapper[4903]: I0320 08:46:35.213852 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e761f2dc-041e-4811-802b-2a8e8c376381-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "e761f2dc-041e-4811-802b-2a8e8c376381" (UID: "e761f2dc-041e-4811-802b-2a8e8c376381"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 08:46:35 crc kubenswrapper[4903]: I0320 08:46:35.221794 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e761f2dc-041e-4811-802b-2a8e8c376381-logs" (OuterVolumeSpecName: "logs") pod "e761f2dc-041e-4811-802b-2a8e8c376381" (UID: "e761f2dc-041e-4811-802b-2a8e8c376381"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 08:46:35 crc kubenswrapper[4903]: I0320 08:46:35.238689 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e761f2dc-041e-4811-802b-2a8e8c376381-scripts" (OuterVolumeSpecName: "scripts") pod "e761f2dc-041e-4811-802b-2a8e8c376381" (UID: "e761f2dc-041e-4811-802b-2a8e8c376381"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:46:35 crc kubenswrapper[4903]: I0320 08:46:35.261474 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "glance") pod "e761f2dc-041e-4811-802b-2a8e8c376381" (UID: "e761f2dc-041e-4811-802b-2a8e8c376381"). InnerVolumeSpecName "local-storage08-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 20 08:46:35 crc kubenswrapper[4903]: I0320 08:46:35.288308 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e761f2dc-041e-4811-802b-2a8e8c376381-kube-api-access-4fktf" (OuterVolumeSpecName: "kube-api-access-4fktf") pod "e761f2dc-041e-4811-802b-2a8e8c376381" (UID: "e761f2dc-041e-4811-802b-2a8e8c376381"). InnerVolumeSpecName "kube-api-access-4fktf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:46:35 crc kubenswrapper[4903]: I0320 08:46:35.314344 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4fktf\" (UniqueName: \"kubernetes.io/projected/e761f2dc-041e-4811-802b-2a8e8c376381-kube-api-access-4fktf\") on node \"crc\" DevicePath \"\"" Mar 20 08:46:35 crc kubenswrapper[4903]: I0320 08:46:35.314393 4903 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Mar 20 08:46:35 crc kubenswrapper[4903]: I0320 08:46:35.314405 4903 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e761f2dc-041e-4811-802b-2a8e8c376381-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 20 08:46:35 crc kubenswrapper[4903]: I0320 08:46:35.314415 4903 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e761f2dc-041e-4811-802b-2a8e8c376381-logs\") on node \"crc\" DevicePath \"\"" Mar 20 08:46:35 crc kubenswrapper[4903]: I0320 08:46:35.314426 4903 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e761f2dc-041e-4811-802b-2a8e8c376381-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 08:46:35 crc kubenswrapper[4903]: I0320 08:46:35.345326 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e761f2dc-041e-4811-802b-2a8e8c376381-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e761f2dc-041e-4811-802b-2a8e8c376381" (UID: "e761f2dc-041e-4811-802b-2a8e8c376381"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:46:35 crc kubenswrapper[4903]: I0320 08:46:35.385280 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e761f2dc-041e-4811-802b-2a8e8c376381-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "e761f2dc-041e-4811-802b-2a8e8c376381" (UID: "e761f2dc-041e-4811-802b-2a8e8c376381"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:46:35 crc kubenswrapper[4903]: I0320 08:46:35.408299 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e761f2dc-041e-4811-802b-2a8e8c376381-config-data" (OuterVolumeSpecName: "config-data") pod "e761f2dc-041e-4811-802b-2a8e8c376381" (UID: "e761f2dc-041e-4811-802b-2a8e8c376381"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:46:35 crc kubenswrapper[4903]: I0320 08:46:35.410338 4903 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Mar 20 08:46:35 crc kubenswrapper[4903]: I0320 08:46:35.418321 4903 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e761f2dc-041e-4811-802b-2a8e8c376381-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 08:46:35 crc kubenswrapper[4903]: I0320 08:46:35.418356 4903 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e761f2dc-041e-4811-802b-2a8e8c376381-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 08:46:35 crc kubenswrapper[4903]: I0320 08:46:35.418368 4903 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e761f2dc-041e-4811-802b-2a8e8c376381-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 08:46:35 crc kubenswrapper[4903]: I0320 08:46:35.418379 4903 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Mar 20 08:46:35 crc kubenswrapper[4903]: I0320 08:46:35.605899 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-fbdb5-dn8w8" Mar 20 08:46:35 crc kubenswrapper[4903]: I0320 08:46:35.677191 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 08:46:35 crc kubenswrapper[4903]: I0320 08:46:35.685403 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 08:46:35 crc kubenswrapper[4903]: I0320 08:46:35.721880 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 08:46:35 crc kubenswrapper[4903]: E0320 08:46:35.722862 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e761f2dc-041e-4811-802b-2a8e8c376381" containerName="glance-log" Mar 20 08:46:35 crc kubenswrapper[4903]: I0320 08:46:35.722886 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="e761f2dc-041e-4811-802b-2a8e8c376381" containerName="glance-log" Mar 20 08:46:35 crc kubenswrapper[4903]: E0320 08:46:35.722909 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e4d3d5e-2374-4583-8193-bcef8b16110e" containerName="mariadb-database-create" Mar 20 08:46:35 crc kubenswrapper[4903]: I0320 08:46:35.722916 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e4d3d5e-2374-4583-8193-bcef8b16110e" containerName="mariadb-database-create" Mar 20 08:46:35 crc kubenswrapper[4903]: E0320 08:46:35.722935 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="292fda17-d4a4-4bee-ba75-d3221d870f63" containerName="mariadb-account-create-update" Mar 20 08:46:35 crc kubenswrapper[4903]: I0320 08:46:35.722941 4903 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="292fda17-d4a4-4bee-ba75-d3221d870f63" containerName="mariadb-account-create-update" Mar 20 08:46:35 crc kubenswrapper[4903]: E0320 08:46:35.722952 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e54fecbe-7055-446e-989b-eddbbbfe55a6" containerName="mariadb-database-create" Mar 20 08:46:35 crc kubenswrapper[4903]: I0320 08:46:35.722958 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="e54fecbe-7055-446e-989b-eddbbbfe55a6" containerName="mariadb-database-create" Mar 20 08:46:35 crc kubenswrapper[4903]: E0320 08:46:35.722972 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e761f2dc-041e-4811-802b-2a8e8c376381" containerName="glance-httpd" Mar 20 08:46:35 crc kubenswrapper[4903]: I0320 08:46:35.722978 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="e761f2dc-041e-4811-802b-2a8e8c376381" containerName="glance-httpd" Mar 20 08:46:35 crc kubenswrapper[4903]: E0320 08:46:35.722990 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf56b91b-fa18-4fe3-bdb5-9c6c3fb530bd" containerName="mariadb-account-create-update" Mar 20 08:46:35 crc kubenswrapper[4903]: I0320 08:46:35.722998 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf56b91b-fa18-4fe3-bdb5-9c6c3fb530bd" containerName="mariadb-account-create-update" Mar 20 08:46:35 crc kubenswrapper[4903]: E0320 08:46:35.723018 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6fea6b7-8fd2-42c4-8016-334f6f69c22e" containerName="mariadb-database-create" Mar 20 08:46:35 crc kubenswrapper[4903]: I0320 08:46:35.723025 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6fea6b7-8fd2-42c4-8016-334f6f69c22e" containerName="mariadb-database-create" Mar 20 08:46:35 crc kubenswrapper[4903]: E0320 08:46:35.723051 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16277119-789c-4d6e-8965-1ab0080f0871" containerName="mariadb-account-create-update" Mar 20 08:46:35 crc kubenswrapper[4903]: I0320 08:46:35.723058 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="16277119-789c-4d6e-8965-1ab0080f0871" containerName="mariadb-account-create-update" Mar 20 08:46:35 crc kubenswrapper[4903]: I0320 08:46:35.723230 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf56b91b-fa18-4fe3-bdb5-9c6c3fb530bd" containerName="mariadb-account-create-update" Mar 20 08:46:35 crc kubenswrapper[4903]: I0320 08:46:35.723242 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="e54fecbe-7055-446e-989b-eddbbbfe55a6" containerName="mariadb-database-create" Mar 20 08:46:35 crc kubenswrapper[4903]: I0320 08:46:35.723259 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="c6fea6b7-8fd2-42c4-8016-334f6f69c22e" containerName="mariadb-database-create" Mar 20 08:46:35 crc kubenswrapper[4903]: I0320 08:46:35.723269 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="292fda17-d4a4-4bee-ba75-d3221d870f63" containerName="mariadb-account-create-update" Mar 20 08:46:35 crc kubenswrapper[4903]: I0320 08:46:35.723283 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e4d3d5e-2374-4583-8193-bcef8b16110e" containerName="mariadb-database-create" Mar 20 08:46:35 crc kubenswrapper[4903]: I0320 08:46:35.723305 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="16277119-789c-4d6e-8965-1ab0080f0871" containerName="mariadb-account-create-update" Mar 20 08:46:35 crc kubenswrapper[4903]: I0320 08:46:35.723314 4903 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="e761f2dc-041e-4811-802b-2a8e8c376381" containerName="glance-httpd" Mar 20 08:46:35 crc kubenswrapper[4903]: I0320 08:46:35.723325 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="e761f2dc-041e-4811-802b-2a8e8c376381" containerName="glance-log" Mar 20 08:46:35 crc kubenswrapper[4903]: I0320 08:46:35.724372 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 20 08:46:35 crc kubenswrapper[4903]: I0320 08:46:35.739407 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 20 08:46:35 crc kubenswrapper[4903]: I0320 08:46:35.742693 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 20 08:46:35 crc kubenswrapper[4903]: I0320 08:46:35.748275 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-7bdb6dfbd4-xpx45"] Mar 20 08:46:35 crc kubenswrapper[4903]: I0320 08:46:35.748556 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-7bdb6dfbd4-xpx45" podUID="6453cb9a-76a4-412f-9cb8-964c20a217ca" containerName="neutron-api" containerID="cri-o://c8a35f3396f6369e4c3eda1b193db1137374b7218f722935ca859014ff6167e1" gracePeriod=30 Mar 20 08:46:35 crc kubenswrapper[4903]: I0320 08:46:35.748606 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-7bdb6dfbd4-xpx45" podUID="6453cb9a-76a4-412f-9cb8-964c20a217ca" containerName="neutron-httpd" containerID="cri-o://48910b95b55fa683cebc5b748d90b941ea8034a5f459c44d84750b8def8b501e" gracePeriod=30 Mar 20 08:46:35 crc kubenswrapper[4903]: I0320 08:46:35.773359 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 08:46:35 crc kubenswrapper[4903]: I0320 08:46:35.828785 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c33b2cd-e705-41cd-9e59-3dcbb0a55829-config-data\") pod \"glance-default-external-api-0\" (UID: \"2c33b2cd-e705-41cd-9e59-3dcbb0a55829\") " pod="openstack/glance-default-external-api-0" Mar 20 08:46:35 crc kubenswrapper[4903]: I0320 08:46:35.828862 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vzzqn\" (UniqueName: \"kubernetes.io/projected/2c33b2cd-e705-41cd-9e59-3dcbb0a55829-kube-api-access-vzzqn\") pod \"glance-default-external-api-0\" (UID: \"2c33b2cd-e705-41cd-9e59-3dcbb0a55829\") " pod="openstack/glance-default-external-api-0" Mar 20 08:46:35 crc kubenswrapper[4903]: I0320 08:46:35.828892 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"2c33b2cd-e705-41cd-9e59-3dcbb0a55829\") " pod="openstack/glance-default-external-api-0" Mar 20 08:46:35 crc kubenswrapper[4903]: I0320 08:46:35.828953 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2c33b2cd-e705-41cd-9e59-3dcbb0a55829-scripts\") pod \"glance-default-external-api-0\" (UID: \"2c33b2cd-e705-41cd-9e59-3dcbb0a55829\") " pod="openstack/glance-default-external-api-0" Mar 20 08:46:35 crc kubenswrapper[4903]: I0320 08:46:35.829002 4903 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c33b2cd-e705-41cd-9e59-3dcbb0a55829-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"2c33b2cd-e705-41cd-9e59-3dcbb0a55829\") " pod="openstack/glance-default-external-api-0" Mar 20 08:46:35 crc kubenswrapper[4903]: I0320 08:46:35.829046 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2c33b2cd-e705-41cd-9e59-3dcbb0a55829-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"2c33b2cd-e705-41cd-9e59-3dcbb0a55829\") " pod="openstack/glance-default-external-api-0" Mar 20 08:46:35 crc kubenswrapper[4903]: I0320 08:46:35.829082 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2c33b2cd-e705-41cd-9e59-3dcbb0a55829-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"2c33b2cd-e705-41cd-9e59-3dcbb0a55829\") " pod="openstack/glance-default-external-api-0" Mar 20 08:46:35 crc kubenswrapper[4903]: I0320 08:46:35.829119 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2c33b2cd-e705-41cd-9e59-3dcbb0a55829-logs\") pod \"glance-default-external-api-0\" (UID: \"2c33b2cd-e705-41cd-9e59-3dcbb0a55829\") " pod="openstack/glance-default-external-api-0" Mar 20 08:46:35 crc kubenswrapper[4903]: I0320 08:46:35.931807 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2c33b2cd-e705-41cd-9e59-3dcbb0a55829-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"2c33b2cd-e705-41cd-9e59-3dcbb0a55829\") " pod="openstack/glance-default-external-api-0" Mar 20 08:46:35 crc kubenswrapper[4903]: I0320 08:46:35.931877 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2c33b2cd-e705-41cd-9e59-3dcbb0a55829-logs\") pod \"glance-default-external-api-0\" (UID: \"2c33b2cd-e705-41cd-9e59-3dcbb0a55829\") " pod="openstack/glance-default-external-api-0" Mar 20 08:46:35 crc kubenswrapper[4903]: I0320 08:46:35.931924 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c33b2cd-e705-41cd-9e59-3dcbb0a55829-config-data\") pod \"glance-default-external-api-0\" (UID: \"2c33b2cd-e705-41cd-9e59-3dcbb0a55829\") " pod="openstack/glance-default-external-api-0" Mar 20 08:46:35 crc kubenswrapper[4903]: I0320 08:46:35.931952 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vzzqn\" (UniqueName: \"kubernetes.io/projected/2c33b2cd-e705-41cd-9e59-3dcbb0a55829-kube-api-access-vzzqn\") pod \"glance-default-external-api-0\" (UID: \"2c33b2cd-e705-41cd-9e59-3dcbb0a55829\") " pod="openstack/glance-default-external-api-0" Mar 20 08:46:35 crc kubenswrapper[4903]: I0320 08:46:35.931978 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"2c33b2cd-e705-41cd-9e59-3dcbb0a55829\") " pod="openstack/glance-default-external-api-0" Mar 20 08:46:35 crc kubenswrapper[4903]: I0320 08:46:35.932022 4903 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2c33b2cd-e705-41cd-9e59-3dcbb0a55829-scripts\") pod \"glance-default-external-api-0\" (UID: \"2c33b2cd-e705-41cd-9e59-3dcbb0a55829\") " pod="openstack/glance-default-external-api-0" Mar 20 08:46:35 crc kubenswrapper[4903]: I0320 08:46:35.932080 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c33b2cd-e705-41cd-9e59-3dcbb0a55829-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"2c33b2cd-e705-41cd-9e59-3dcbb0a55829\") " pod="openstack/glance-default-external-api-0" Mar 20 08:46:35 crc kubenswrapper[4903]: I0320 08:46:35.932145 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2c33b2cd-e705-41cd-9e59-3dcbb0a55829-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"2c33b2cd-e705-41cd-9e59-3dcbb0a55829\") " pod="openstack/glance-default-external-api-0" Mar 20 08:46:35 crc kubenswrapper[4903]: I0320 08:46:35.934295 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2c33b2cd-e705-41cd-9e59-3dcbb0a55829-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"2c33b2cd-e705-41cd-9e59-3dcbb0a55829\") " pod="openstack/glance-default-external-api-0" Mar 20 08:46:35 crc kubenswrapper[4903]: I0320 08:46:35.934545 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2c33b2cd-e705-41cd-9e59-3dcbb0a55829-logs\") pod \"glance-default-external-api-0\" (UID: \"2c33b2cd-e705-41cd-9e59-3dcbb0a55829\") " pod="openstack/glance-default-external-api-0" Mar 20 08:46:35 crc kubenswrapper[4903]: I0320 08:46:35.938429 4903 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"2c33b2cd-e705-41cd-9e59-3dcbb0a55829\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/glance-default-external-api-0" Mar 20 08:46:35 crc kubenswrapper[4903]: I0320 08:46:35.940665 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c33b2cd-e705-41cd-9e59-3dcbb0a55829-config-data\") pod \"glance-default-external-api-0\" (UID: \"2c33b2cd-e705-41cd-9e59-3dcbb0a55829\") " pod="openstack/glance-default-external-api-0" Mar 20 08:46:35 crc kubenswrapper[4903]: I0320 08:46:35.941050 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c33b2cd-e705-41cd-9e59-3dcbb0a55829-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"2c33b2cd-e705-41cd-9e59-3dcbb0a55829\") " pod="openstack/glance-default-external-api-0" Mar 20 08:46:35 crc kubenswrapper[4903]: I0320 08:46:35.943283 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2c33b2cd-e705-41cd-9e59-3dcbb0a55829-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"2c33b2cd-e705-41cd-9e59-3dcbb0a55829\") " pod="openstack/glance-default-external-api-0" Mar 20 08:46:35 crc kubenswrapper[4903]: I0320 08:46:35.969392 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vzzqn\" (UniqueName: 
\"kubernetes.io/projected/2c33b2cd-e705-41cd-9e59-3dcbb0a55829-kube-api-access-vzzqn\") pod \"glance-default-external-api-0\" (UID: \"2c33b2cd-e705-41cd-9e59-3dcbb0a55829\") " pod="openstack/glance-default-external-api-0" Mar 20 08:46:35 crc kubenswrapper[4903]: I0320 08:46:35.969818 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2c33b2cd-e705-41cd-9e59-3dcbb0a55829-scripts\") pod \"glance-default-external-api-0\" (UID: \"2c33b2cd-e705-41cd-9e59-3dcbb0a55829\") " pod="openstack/glance-default-external-api-0" Mar 20 08:46:36 crc kubenswrapper[4903]: I0320 08:46:36.008758 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"2c33b2cd-e705-41cd-9e59-3dcbb0a55829\") " pod="openstack/glance-default-external-api-0" Mar 20 08:46:36 crc kubenswrapper[4903]: I0320 08:46:36.044476 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 20 08:46:36 crc kubenswrapper[4903]: I0320 08:46:36.081210 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"50b5adcb-aed8-4cff-b3ec-02721df3937d","Type":"ContainerStarted","Data":"dc39193e0b3efc7d58828ef8c691abe141cb7d978ac576bd15dc6776a511b5fe"} Mar 20 08:46:36 crc kubenswrapper[4903]: I0320 08:46:36.081582 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Mar 20 08:46:36 crc kubenswrapper[4903]: I0320 08:46:36.093436 4903 generic.go:334] "Generic (PLEG): container finished" podID="83dc9a0f-80b3-4df8-9b1b-f233484cb285" containerID="40b4b5aa3e9d277b5d7629a27ef27b096fa2b070096c494efac9d95b45d321d0" exitCode=0 Mar 20 08:46:36 crc kubenswrapper[4903]: I0320 08:46:36.093517 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"83dc9a0f-80b3-4df8-9b1b-f233484cb285","Type":"ContainerDied","Data":"40b4b5aa3e9d277b5d7629a27ef27b096fa2b070096c494efac9d95b45d321d0"} Mar 20 08:46:36 crc kubenswrapper[4903]: I0320 08:46:36.102800 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4b11fccf-eea3-450d-b460-013086becb0c","Type":"ContainerStarted","Data":"f5d6e986dfa73b3a3f49723209dafdda83c4c64eeb89b74b60cca6611cceb00b"} Mar 20 08:46:36 crc kubenswrapper[4903]: I0320 08:46:36.102900 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 20 08:46:36 crc kubenswrapper[4903]: I0320 08:46:36.102890 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4b11fccf-eea3-450d-b460-013086becb0c" containerName="ceilometer-central-agent" containerID="cri-o://09d0f8d07a129edf0703805433590c950927e505f25614bbc396130dc7d3efab" gracePeriod=30 Mar 20 08:46:36 crc kubenswrapper[4903]: I0320 08:46:36.102952 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4b11fccf-eea3-450d-b460-013086becb0c" containerName="sg-core" containerID="cri-o://8232a6bcbff9ffc96024d73c35e8fd73728b55a64cc9fd21c14192f3307f2623" gracePeriod=30 Mar 20 08:46:36 crc kubenswrapper[4903]: I0320 08:46:36.103139 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4b11fccf-eea3-450d-b460-013086becb0c" 
containerName="proxy-httpd" containerID="cri-o://f5d6e986dfa73b3a3f49723209dafdda83c4c64eeb89b74b60cca6611cceb00b" gracePeriod=30 Mar 20 08:46:36 crc kubenswrapper[4903]: I0320 08:46:36.103144 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4b11fccf-eea3-450d-b460-013086becb0c" containerName="ceilometer-notification-agent" containerID="cri-o://f2bf3db3a69c408a2742f89249166339c6ac9359d46b3b581afd7322538aed8a" gracePeriod=30 Mar 20 08:46:36 crc kubenswrapper[4903]: I0320 08:46:36.152259 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=5.152242954 podStartE2EDuration="5.152242954s" podCreationTimestamp="2026-03-20 08:46:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:46:36.113885962 +0000 UTC m=+1421.330786277" watchObservedRunningTime="2026-03-20 08:46:36.152242954 +0000 UTC m=+1421.369143269" Mar 20 08:46:36 crc kubenswrapper[4903]: I0320 08:46:36.158146 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.692268302 podStartE2EDuration="7.158131131s" podCreationTimestamp="2026-03-20 08:46:29 +0000 UTC" firstStartedPulling="2026-03-20 08:46:30.447517804 +0000 UTC m=+1415.664418119" lastFinishedPulling="2026-03-20 08:46:34.913380633 +0000 UTC m=+1420.130280948" observedRunningTime="2026-03-20 08:46:36.150284469 +0000 UTC m=+1421.367184784" watchObservedRunningTime="2026-03-20 08:46:36.158131131 +0000 UTC m=+1421.375031446" Mar 20 08:46:36 crc kubenswrapper[4903]: I0320 08:46:36.516862 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 20 08:46:36 crc kubenswrapper[4903]: I0320 08:46:36.547176 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/83dc9a0f-80b3-4df8-9b1b-f233484cb285-logs\") pod \"83dc9a0f-80b3-4df8-9b1b-f233484cb285\" (UID: \"83dc9a0f-80b3-4df8-9b1b-f233484cb285\") " Mar 20 08:46:36 crc kubenswrapper[4903]: I0320 08:46:36.547235 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/83dc9a0f-80b3-4df8-9b1b-f233484cb285-httpd-run\") pod \"83dc9a0f-80b3-4df8-9b1b-f233484cb285\" (UID: \"83dc9a0f-80b3-4df8-9b1b-f233484cb285\") " Mar 20 08:46:36 crc kubenswrapper[4903]: I0320 08:46:36.547447 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/83dc9a0f-80b3-4df8-9b1b-f233484cb285-config-data\") pod \"83dc9a0f-80b3-4df8-9b1b-f233484cb285\" (UID: \"83dc9a0f-80b3-4df8-9b1b-f233484cb285\") " Mar 20 08:46:36 crc kubenswrapper[4903]: I0320 08:46:36.547485 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5bpm8\" (UniqueName: \"kubernetes.io/projected/83dc9a0f-80b3-4df8-9b1b-f233484cb285-kube-api-access-5bpm8\") pod \"83dc9a0f-80b3-4df8-9b1b-f233484cb285\" (UID: \"83dc9a0f-80b3-4df8-9b1b-f233484cb285\") " Mar 20 08:46:36 crc kubenswrapper[4903]: I0320 08:46:36.547561 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/83dc9a0f-80b3-4df8-9b1b-f233484cb285-internal-tls-certs\") pod \"83dc9a0f-80b3-4df8-9b1b-f233484cb285\" (UID: 
\"83dc9a0f-80b3-4df8-9b1b-f233484cb285\") " Mar 20 08:46:36 crc kubenswrapper[4903]: I0320 08:46:36.547607 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"83dc9a0f-80b3-4df8-9b1b-f233484cb285\" (UID: \"83dc9a0f-80b3-4df8-9b1b-f233484cb285\") " Mar 20 08:46:36 crc kubenswrapper[4903]: I0320 08:46:36.547634 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83dc9a0f-80b3-4df8-9b1b-f233484cb285-combined-ca-bundle\") pod \"83dc9a0f-80b3-4df8-9b1b-f233484cb285\" (UID: \"83dc9a0f-80b3-4df8-9b1b-f233484cb285\") " Mar 20 08:46:36 crc kubenswrapper[4903]: I0320 08:46:36.547663 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/83dc9a0f-80b3-4df8-9b1b-f233484cb285-scripts\") pod \"83dc9a0f-80b3-4df8-9b1b-f233484cb285\" (UID: \"83dc9a0f-80b3-4df8-9b1b-f233484cb285\") " Mar 20 08:46:36 crc kubenswrapper[4903]: I0320 08:46:36.547882 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/83dc9a0f-80b3-4df8-9b1b-f233484cb285-logs" (OuterVolumeSpecName: "logs") pod "83dc9a0f-80b3-4df8-9b1b-f233484cb285" (UID: "83dc9a0f-80b3-4df8-9b1b-f233484cb285"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 08:46:36 crc kubenswrapper[4903]: I0320 08:46:36.548566 4903 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/83dc9a0f-80b3-4df8-9b1b-f233484cb285-logs\") on node \"crc\" DevicePath \"\"" Mar 20 08:46:36 crc kubenswrapper[4903]: I0320 08:46:36.548726 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/83dc9a0f-80b3-4df8-9b1b-f233484cb285-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "83dc9a0f-80b3-4df8-9b1b-f233484cb285" (UID: "83dc9a0f-80b3-4df8-9b1b-f233484cb285"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 08:46:36 crc kubenswrapper[4903]: I0320 08:46:36.558135 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "glance") pod "83dc9a0f-80b3-4df8-9b1b-f233484cb285" (UID: "83dc9a0f-80b3-4df8-9b1b-f233484cb285"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 20 08:46:36 crc kubenswrapper[4903]: I0320 08:46:36.561219 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/83dc9a0f-80b3-4df8-9b1b-f233484cb285-kube-api-access-5bpm8" (OuterVolumeSpecName: "kube-api-access-5bpm8") pod "83dc9a0f-80b3-4df8-9b1b-f233484cb285" (UID: "83dc9a0f-80b3-4df8-9b1b-f233484cb285"). InnerVolumeSpecName "kube-api-access-5bpm8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:46:36 crc kubenswrapper[4903]: I0320 08:46:36.561969 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/83dc9a0f-80b3-4df8-9b1b-f233484cb285-scripts" (OuterVolumeSpecName: "scripts") pod "83dc9a0f-80b3-4df8-9b1b-f233484cb285" (UID: "83dc9a0f-80b3-4df8-9b1b-f233484cb285"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:46:36 crc kubenswrapper[4903]: I0320 08:46:36.608384 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/83dc9a0f-80b3-4df8-9b1b-f233484cb285-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "83dc9a0f-80b3-4df8-9b1b-f233484cb285" (UID: "83dc9a0f-80b3-4df8-9b1b-f233484cb285"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:46:36 crc kubenswrapper[4903]: I0320 08:46:36.616193 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/83dc9a0f-80b3-4df8-9b1b-f233484cb285-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "83dc9a0f-80b3-4df8-9b1b-f233484cb285" (UID: "83dc9a0f-80b3-4df8-9b1b-f233484cb285"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:46:36 crc kubenswrapper[4903]: I0320 08:46:36.621274 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/83dc9a0f-80b3-4df8-9b1b-f233484cb285-config-data" (OuterVolumeSpecName: "config-data") pod "83dc9a0f-80b3-4df8-9b1b-f233484cb285" (UID: "83dc9a0f-80b3-4df8-9b1b-f233484cb285"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:46:36 crc kubenswrapper[4903]: I0320 08:46:36.650529 4903 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/83dc9a0f-80b3-4df8-9b1b-f233484cb285-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 08:46:36 crc kubenswrapper[4903]: I0320 08:46:36.650560 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5bpm8\" (UniqueName: \"kubernetes.io/projected/83dc9a0f-80b3-4df8-9b1b-f233484cb285-kube-api-access-5bpm8\") on node \"crc\" DevicePath \"\"" Mar 20 08:46:36 crc kubenswrapper[4903]: I0320 08:46:36.650572 4903 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/83dc9a0f-80b3-4df8-9b1b-f233484cb285-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 08:46:36 crc kubenswrapper[4903]: I0320 08:46:36.650606 4903 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Mar 20 08:46:36 crc kubenswrapper[4903]: I0320 08:46:36.650616 4903 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83dc9a0f-80b3-4df8-9b1b-f233484cb285-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 08:46:36 crc kubenswrapper[4903]: I0320 08:46:36.650624 4903 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/83dc9a0f-80b3-4df8-9b1b-f233484cb285-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 08:46:36 crc kubenswrapper[4903]: I0320 08:46:36.650633 4903 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/83dc9a0f-80b3-4df8-9b1b-f233484cb285-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 20 08:46:36 crc kubenswrapper[4903]: I0320 08:46:36.676743 4903 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Mar 20 08:46:36 crc kubenswrapper[4903]: I0320 08:46:36.752176 4903 reconciler_common.go:293] 
"Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Mar 20 08:46:36 crc kubenswrapper[4903]: I0320 08:46:36.858414 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 08:46:36 crc kubenswrapper[4903]: W0320 08:46:36.864123 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2c33b2cd_e705_41cd_9e59_3dcbb0a55829.slice/crio-915ba21af04025013209005f6a27dbcc46d5120084dbd26714f30f808a165887 WatchSource:0}: Error finding container 915ba21af04025013209005f6a27dbcc46d5120084dbd26714f30f808a165887: Status 404 returned error can't find the container with id 915ba21af04025013209005f6a27dbcc46d5120084dbd26714f30f808a165887 Mar 20 08:46:37 crc kubenswrapper[4903]: I0320 08:46:37.125851 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2c33b2cd-e705-41cd-9e59-3dcbb0a55829","Type":"ContainerStarted","Data":"915ba21af04025013209005f6a27dbcc46d5120084dbd26714f30f808a165887"} Mar 20 08:46:37 crc kubenswrapper[4903]: I0320 08:46:37.130001 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"83dc9a0f-80b3-4df8-9b1b-f233484cb285","Type":"ContainerDied","Data":"a06414ce03a090d2aa987403746bd9ea9df4b4e399d708dd419cf86b6257a50e"} Mar 20 08:46:37 crc kubenswrapper[4903]: I0320 08:46:37.130182 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 20 08:46:37 crc kubenswrapper[4903]: I0320 08:46:37.132691 4903 generic.go:334] "Generic (PLEG): container finished" podID="6453cb9a-76a4-412f-9cb8-964c20a217ca" containerID="48910b95b55fa683cebc5b748d90b941ea8034a5f459c44d84750b8def8b501e" exitCode=0 Mar 20 08:46:37 crc kubenswrapper[4903]: I0320 08:46:37.135256 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7bdb6dfbd4-xpx45" event={"ID":"6453cb9a-76a4-412f-9cb8-964c20a217ca","Type":"ContainerDied","Data":"48910b95b55fa683cebc5b748d90b941ea8034a5f459c44d84750b8def8b501e"} Mar 20 08:46:37 crc kubenswrapper[4903]: I0320 08:46:37.135297 4903 scope.go:117] "RemoveContainer" containerID="40b4b5aa3e9d277b5d7629a27ef27b096fa2b070096c494efac9d95b45d321d0" Mar 20 08:46:37 crc kubenswrapper[4903]: I0320 08:46:37.144480 4903 generic.go:334] "Generic (PLEG): container finished" podID="4b11fccf-eea3-450d-b460-013086becb0c" containerID="f5d6e986dfa73b3a3f49723209dafdda83c4c64eeb89b74b60cca6611cceb00b" exitCode=0 Mar 20 08:46:37 crc kubenswrapper[4903]: I0320 08:46:37.144528 4903 generic.go:334] "Generic (PLEG): container finished" podID="4b11fccf-eea3-450d-b460-013086becb0c" containerID="8232a6bcbff9ffc96024d73c35e8fd73728b55a64cc9fd21c14192f3307f2623" exitCode=2 Mar 20 08:46:37 crc kubenswrapper[4903]: I0320 08:46:37.144542 4903 generic.go:334] "Generic (PLEG): container finished" podID="4b11fccf-eea3-450d-b460-013086becb0c" containerID="f2bf3db3a69c408a2742f89249166339c6ac9359d46b3b581afd7322538aed8a" exitCode=0 Mar 20 08:46:37 crc kubenswrapper[4903]: I0320 08:46:37.144556 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4b11fccf-eea3-450d-b460-013086becb0c","Type":"ContainerDied","Data":"f5d6e986dfa73b3a3f49723209dafdda83c4c64eeb89b74b60cca6611cceb00b"} Mar 20 08:46:37 crc kubenswrapper[4903]: I0320 08:46:37.144630 4903 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4b11fccf-eea3-450d-b460-013086becb0c","Type":"ContainerDied","Data":"8232a6bcbff9ffc96024d73c35e8fd73728b55a64cc9fd21c14192f3307f2623"} Mar 20 08:46:37 crc kubenswrapper[4903]: I0320 08:46:37.144644 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4b11fccf-eea3-450d-b460-013086becb0c","Type":"ContainerDied","Data":"f2bf3db3a69c408a2742f89249166339c6ac9359d46b3b581afd7322538aed8a"} Mar 20 08:46:37 crc kubenswrapper[4903]: I0320 08:46:37.194699 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 08:46:37 crc kubenswrapper[4903]: I0320 08:46:37.204901 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 08:46:37 crc kubenswrapper[4903]: I0320 08:46:37.206198 4903 scope.go:117] "RemoveContainer" containerID="3b5d10559c5d44263b5a6e122434a5c7503ceaee8c0fcc56d0ab6b2eb14c36dd" Mar 20 08:46:37 crc kubenswrapper[4903]: I0320 08:46:37.230099 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 08:46:37 crc kubenswrapper[4903]: E0320 08:46:37.230489 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83dc9a0f-80b3-4df8-9b1b-f233484cb285" containerName="glance-httpd" Mar 20 08:46:37 crc kubenswrapper[4903]: I0320 08:46:37.230502 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="83dc9a0f-80b3-4df8-9b1b-f233484cb285" containerName="glance-httpd" Mar 20 08:46:37 crc kubenswrapper[4903]: E0320 08:46:37.230547 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83dc9a0f-80b3-4df8-9b1b-f233484cb285" containerName="glance-log" Mar 20 08:46:37 crc kubenswrapper[4903]: I0320 08:46:37.230552 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="83dc9a0f-80b3-4df8-9b1b-f233484cb285" containerName="glance-log" Mar 20 08:46:37 crc kubenswrapper[4903]: I0320 08:46:37.230716 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="83dc9a0f-80b3-4df8-9b1b-f233484cb285" containerName="glance-httpd" Mar 20 08:46:37 crc kubenswrapper[4903]: I0320 08:46:37.230733 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="83dc9a0f-80b3-4df8-9b1b-f233484cb285" containerName="glance-log" Mar 20 08:46:37 crc kubenswrapper[4903]: I0320 08:46:37.231672 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 20 08:46:37 crc kubenswrapper[4903]: I0320 08:46:37.234252 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 20 08:46:37 crc kubenswrapper[4903]: I0320 08:46:37.234707 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 20 08:46:37 crc kubenswrapper[4903]: I0320 08:46:37.248444 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 08:46:37 crc kubenswrapper[4903]: I0320 08:46:37.280798 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f4e1dbd8-6ecd-4cd7-910b-910cf6a45679-logs\") pod \"glance-default-internal-api-0\" (UID: \"f4e1dbd8-6ecd-4cd7-910b-910cf6a45679\") " pod="openstack/glance-default-internal-api-0" Mar 20 08:46:37 crc kubenswrapper[4903]: I0320 08:46:37.280843 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f4e1dbd8-6ecd-4cd7-910b-910cf6a45679-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"f4e1dbd8-6ecd-4cd7-910b-910cf6a45679\") " pod="openstack/glance-default-internal-api-0" Mar 20 08:46:37 crc kubenswrapper[4903]: I0320 08:46:37.280910 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4e1dbd8-6ecd-4cd7-910b-910cf6a45679-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"f4e1dbd8-6ecd-4cd7-910b-910cf6a45679\") " pod="openstack/glance-default-internal-api-0" Mar 20 08:46:37 crc kubenswrapper[4903]: I0320 08:46:37.280932 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"f4e1dbd8-6ecd-4cd7-910b-910cf6a45679\") " pod="openstack/glance-default-internal-api-0" Mar 20 08:46:37 crc kubenswrapper[4903]: I0320 08:46:37.280970 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f4e1dbd8-6ecd-4cd7-910b-910cf6a45679-scripts\") pod \"glance-default-internal-api-0\" (UID: \"f4e1dbd8-6ecd-4cd7-910b-910cf6a45679\") " pod="openstack/glance-default-internal-api-0" Mar 20 08:46:37 crc kubenswrapper[4903]: I0320 08:46:37.280987 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f4e1dbd8-6ecd-4cd7-910b-910cf6a45679-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"f4e1dbd8-6ecd-4cd7-910b-910cf6a45679\") " pod="openstack/glance-default-internal-api-0" Mar 20 08:46:37 crc kubenswrapper[4903]: I0320 08:46:37.281054 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4e1dbd8-6ecd-4cd7-910b-910cf6a45679-config-data\") pod \"glance-default-internal-api-0\" (UID: \"f4e1dbd8-6ecd-4cd7-910b-910cf6a45679\") " pod="openstack/glance-default-internal-api-0" Mar 20 08:46:37 crc kubenswrapper[4903]: I0320 08:46:37.281167 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-nq4mh\" (UniqueName: \"kubernetes.io/projected/f4e1dbd8-6ecd-4cd7-910b-910cf6a45679-kube-api-access-nq4mh\") pod \"glance-default-internal-api-0\" (UID: \"f4e1dbd8-6ecd-4cd7-910b-910cf6a45679\") " pod="openstack/glance-default-internal-api-0" Mar 20 08:46:37 crc kubenswrapper[4903]: I0320 08:46:37.382321 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nq4mh\" (UniqueName: \"kubernetes.io/projected/f4e1dbd8-6ecd-4cd7-910b-910cf6a45679-kube-api-access-nq4mh\") pod \"glance-default-internal-api-0\" (UID: \"f4e1dbd8-6ecd-4cd7-910b-910cf6a45679\") " pod="openstack/glance-default-internal-api-0" Mar 20 08:46:37 crc kubenswrapper[4903]: I0320 08:46:37.382404 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f4e1dbd8-6ecd-4cd7-910b-910cf6a45679-logs\") pod \"glance-default-internal-api-0\" (UID: \"f4e1dbd8-6ecd-4cd7-910b-910cf6a45679\") " pod="openstack/glance-default-internal-api-0" Mar 20 08:46:37 crc kubenswrapper[4903]: I0320 08:46:37.382430 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f4e1dbd8-6ecd-4cd7-910b-910cf6a45679-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"f4e1dbd8-6ecd-4cd7-910b-910cf6a45679\") " pod="openstack/glance-default-internal-api-0" Mar 20 08:46:37 crc kubenswrapper[4903]: I0320 08:46:37.382465 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4e1dbd8-6ecd-4cd7-910b-910cf6a45679-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"f4e1dbd8-6ecd-4cd7-910b-910cf6a45679\") " pod="openstack/glance-default-internal-api-0" Mar 20 08:46:37 crc kubenswrapper[4903]: I0320 08:46:37.382490 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"f4e1dbd8-6ecd-4cd7-910b-910cf6a45679\") " pod="openstack/glance-default-internal-api-0" Mar 20 08:46:37 crc kubenswrapper[4903]: I0320 08:46:37.382530 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f4e1dbd8-6ecd-4cd7-910b-910cf6a45679-scripts\") pod \"glance-default-internal-api-0\" (UID: \"f4e1dbd8-6ecd-4cd7-910b-910cf6a45679\") " pod="openstack/glance-default-internal-api-0" Mar 20 08:46:37 crc kubenswrapper[4903]: I0320 08:46:37.382548 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f4e1dbd8-6ecd-4cd7-910b-910cf6a45679-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"f4e1dbd8-6ecd-4cd7-910b-910cf6a45679\") " pod="openstack/glance-default-internal-api-0" Mar 20 08:46:37 crc kubenswrapper[4903]: I0320 08:46:37.382600 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4e1dbd8-6ecd-4cd7-910b-910cf6a45679-config-data\") pod \"glance-default-internal-api-0\" (UID: \"f4e1dbd8-6ecd-4cd7-910b-910cf6a45679\") " pod="openstack/glance-default-internal-api-0" Mar 20 08:46:37 crc kubenswrapper[4903]: I0320 08:46:37.383337 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/f4e1dbd8-6ecd-4cd7-910b-910cf6a45679-logs\") pod \"glance-default-internal-api-0\" (UID: \"f4e1dbd8-6ecd-4cd7-910b-910cf6a45679\") " pod="openstack/glance-default-internal-api-0" Mar 20 08:46:37 crc kubenswrapper[4903]: I0320 08:46:37.383335 4903 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"f4e1dbd8-6ecd-4cd7-910b-910cf6a45679\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/glance-default-internal-api-0" Mar 20 08:46:37 crc kubenswrapper[4903]: I0320 08:46:37.383426 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f4e1dbd8-6ecd-4cd7-910b-910cf6a45679-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"f4e1dbd8-6ecd-4cd7-910b-910cf6a45679\") " pod="openstack/glance-default-internal-api-0" Mar 20 08:46:37 crc kubenswrapper[4903]: I0320 08:46:37.390079 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f4e1dbd8-6ecd-4cd7-910b-910cf6a45679-scripts\") pod \"glance-default-internal-api-0\" (UID: \"f4e1dbd8-6ecd-4cd7-910b-910cf6a45679\") " pod="openstack/glance-default-internal-api-0" Mar 20 08:46:37 crc kubenswrapper[4903]: I0320 08:46:37.391060 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4e1dbd8-6ecd-4cd7-910b-910cf6a45679-config-data\") pod \"glance-default-internal-api-0\" (UID: \"f4e1dbd8-6ecd-4cd7-910b-910cf6a45679\") " pod="openstack/glance-default-internal-api-0" Mar 20 08:46:37 crc kubenswrapper[4903]: I0320 08:46:37.399240 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4e1dbd8-6ecd-4cd7-910b-910cf6a45679-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"f4e1dbd8-6ecd-4cd7-910b-910cf6a45679\") " pod="openstack/glance-default-internal-api-0" Mar 20 08:46:37 crc kubenswrapper[4903]: I0320 08:46:37.399952 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nq4mh\" (UniqueName: \"kubernetes.io/projected/f4e1dbd8-6ecd-4cd7-910b-910cf6a45679-kube-api-access-nq4mh\") pod \"glance-default-internal-api-0\" (UID: \"f4e1dbd8-6ecd-4cd7-910b-910cf6a45679\") " pod="openstack/glance-default-internal-api-0" Mar 20 08:46:37 crc kubenswrapper[4903]: I0320 08:46:37.404073 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f4e1dbd8-6ecd-4cd7-910b-910cf6a45679-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"f4e1dbd8-6ecd-4cd7-910b-910cf6a45679\") " pod="openstack/glance-default-internal-api-0" Mar 20 08:46:37 crc kubenswrapper[4903]: I0320 08:46:37.419714 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"f4e1dbd8-6ecd-4cd7-910b-910cf6a45679\") " pod="openstack/glance-default-internal-api-0" Mar 20 08:46:37 crc kubenswrapper[4903]: I0320 08:46:37.514992 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="83dc9a0f-80b3-4df8-9b1b-f233484cb285" path="/var/lib/kubelet/pods/83dc9a0f-80b3-4df8-9b1b-f233484cb285/volumes" Mar 20 08:46:37 crc kubenswrapper[4903]: I0320 
08:46:37.516484 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e761f2dc-041e-4811-802b-2a8e8c376381" path="/var/lib/kubelet/pods/e761f2dc-041e-4811-802b-2a8e8c376381/volumes" Mar 20 08:46:37 crc kubenswrapper[4903]: I0320 08:46:37.607278 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 20 08:46:37 crc kubenswrapper[4903]: I0320 08:46:37.805707 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 08:46:37 crc kubenswrapper[4903]: I0320 08:46:37.901576 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4b11fccf-eea3-450d-b460-013086becb0c-log-httpd\") pod \"4b11fccf-eea3-450d-b460-013086becb0c\" (UID: \"4b11fccf-eea3-450d-b460-013086becb0c\") " Mar 20 08:46:37 crc kubenswrapper[4903]: I0320 08:46:37.902417 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4b11fccf-eea3-450d-b460-013086becb0c-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "4b11fccf-eea3-450d-b460-013086becb0c" (UID: "4b11fccf-eea3-450d-b460-013086becb0c"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 08:46:37 crc kubenswrapper[4903]: I0320 08:46:37.902499 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b11fccf-eea3-450d-b460-013086becb0c-config-data\") pod \"4b11fccf-eea3-450d-b460-013086becb0c\" (UID: \"4b11fccf-eea3-450d-b460-013086becb0c\") " Mar 20 08:46:37 crc kubenswrapper[4903]: I0320 08:46:37.903244 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zlv8v\" (UniqueName: \"kubernetes.io/projected/4b11fccf-eea3-450d-b460-013086becb0c-kube-api-access-zlv8v\") pod \"4b11fccf-eea3-450d-b460-013086becb0c\" (UID: \"4b11fccf-eea3-450d-b460-013086becb0c\") " Mar 20 08:46:37 crc kubenswrapper[4903]: I0320 08:46:37.903425 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4b11fccf-eea3-450d-b460-013086becb0c-run-httpd\") pod \"4b11fccf-eea3-450d-b460-013086becb0c\" (UID: \"4b11fccf-eea3-450d-b460-013086becb0c\") " Mar 20 08:46:37 crc kubenswrapper[4903]: I0320 08:46:37.903512 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b11fccf-eea3-450d-b460-013086becb0c-combined-ca-bundle\") pod \"4b11fccf-eea3-450d-b460-013086becb0c\" (UID: \"4b11fccf-eea3-450d-b460-013086becb0c\") " Mar 20 08:46:37 crc kubenswrapper[4903]: I0320 08:46:37.903566 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4b11fccf-eea3-450d-b460-013086becb0c-sg-core-conf-yaml\") pod \"4b11fccf-eea3-450d-b460-013086becb0c\" (UID: \"4b11fccf-eea3-450d-b460-013086becb0c\") " Mar 20 08:46:37 crc kubenswrapper[4903]: I0320 08:46:37.903628 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4b11fccf-eea3-450d-b460-013086becb0c-scripts\") pod \"4b11fccf-eea3-450d-b460-013086becb0c\" (UID: \"4b11fccf-eea3-450d-b460-013086becb0c\") " Mar 20 08:46:37 crc kubenswrapper[4903]: I0320 08:46:37.904359 4903 reconciler_common.go:293] "Volume detached 
for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4b11fccf-eea3-450d-b460-013086becb0c-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 08:46:37 crc kubenswrapper[4903]: I0320 08:46:37.906298 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4b11fccf-eea3-450d-b460-013086becb0c-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "4b11fccf-eea3-450d-b460-013086becb0c" (UID: "4b11fccf-eea3-450d-b460-013086becb0c"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 08:46:37 crc kubenswrapper[4903]: I0320 08:46:37.913531 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b11fccf-eea3-450d-b460-013086becb0c-kube-api-access-zlv8v" (OuterVolumeSpecName: "kube-api-access-zlv8v") pod "4b11fccf-eea3-450d-b460-013086becb0c" (UID: "4b11fccf-eea3-450d-b460-013086becb0c"). InnerVolumeSpecName "kube-api-access-zlv8v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:46:37 crc kubenswrapper[4903]: I0320 08:46:37.919280 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b11fccf-eea3-450d-b460-013086becb0c-scripts" (OuterVolumeSpecName: "scripts") pod "4b11fccf-eea3-450d-b460-013086becb0c" (UID: "4b11fccf-eea3-450d-b460-013086becb0c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:46:37 crc kubenswrapper[4903]: I0320 08:46:37.949247 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b11fccf-eea3-450d-b460-013086becb0c-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "4b11fccf-eea3-450d-b460-013086becb0c" (UID: "4b11fccf-eea3-450d-b460-013086becb0c"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:46:38 crc kubenswrapper[4903]: I0320 08:46:38.006285 4903 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4b11fccf-eea3-450d-b460-013086becb0c-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 08:46:38 crc kubenswrapper[4903]: I0320 08:46:38.006530 4903 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4b11fccf-eea3-450d-b460-013086becb0c-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 20 08:46:38 crc kubenswrapper[4903]: I0320 08:46:38.006609 4903 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4b11fccf-eea3-450d-b460-013086becb0c-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 08:46:38 crc kubenswrapper[4903]: I0320 08:46:38.006665 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zlv8v\" (UniqueName: \"kubernetes.io/projected/4b11fccf-eea3-450d-b460-013086becb0c-kube-api-access-zlv8v\") on node \"crc\" DevicePath \"\"" Mar 20 08:46:38 crc kubenswrapper[4903]: I0320 08:46:38.035256 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b11fccf-eea3-450d-b460-013086becb0c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4b11fccf-eea3-450d-b460-013086becb0c" (UID: "4b11fccf-eea3-450d-b460-013086becb0c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:46:38 crc kubenswrapper[4903]: I0320 08:46:38.097739 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b11fccf-eea3-450d-b460-013086becb0c-config-data" (OuterVolumeSpecName: "config-data") pod "4b11fccf-eea3-450d-b460-013086becb0c" (UID: "4b11fccf-eea3-450d-b460-013086becb0c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:46:38 crc kubenswrapper[4903]: I0320 08:46:38.108324 4903 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b11fccf-eea3-450d-b460-013086becb0c-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 08:46:38 crc kubenswrapper[4903]: I0320 08:46:38.108361 4903 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b11fccf-eea3-450d-b460-013086becb0c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 08:46:38 crc kubenswrapper[4903]: I0320 08:46:38.159582 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2c33b2cd-e705-41cd-9e59-3dcbb0a55829","Type":"ContainerStarted","Data":"e7046c98c3546d6698b9ba6c5237b460ed8efe52b7e38c2e607164140f59d3d5"} Mar 20 08:46:38 crc kubenswrapper[4903]: I0320 08:46:38.171389 4903 generic.go:334] "Generic (PLEG): container finished" podID="4b11fccf-eea3-450d-b460-013086becb0c" containerID="09d0f8d07a129edf0703805433590c950927e505f25614bbc396130dc7d3efab" exitCode=0 Mar 20 08:46:38 crc kubenswrapper[4903]: I0320 08:46:38.171440 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 08:46:38 crc kubenswrapper[4903]: I0320 08:46:38.171461 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4b11fccf-eea3-450d-b460-013086becb0c","Type":"ContainerDied","Data":"09d0f8d07a129edf0703805433590c950927e505f25614bbc396130dc7d3efab"} Mar 20 08:46:38 crc kubenswrapper[4903]: I0320 08:46:38.171506 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4b11fccf-eea3-450d-b460-013086becb0c","Type":"ContainerDied","Data":"bfc7cf95e092b9e7eb9f528611dbcf47b438ab66e51a94cb8fec02bc4dccda14"} Mar 20 08:46:38 crc kubenswrapper[4903]: I0320 08:46:38.171530 4903 scope.go:117] "RemoveContainer" containerID="f5d6e986dfa73b3a3f49723209dafdda83c4c64eeb89b74b60cca6611cceb00b" Mar 20 08:46:38 crc kubenswrapper[4903]: I0320 08:46:38.211752 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 08:46:38 crc kubenswrapper[4903]: I0320 08:46:38.235179 4903 scope.go:117] "RemoveContainer" containerID="8232a6bcbff9ffc96024d73c35e8fd73728b55a64cc9fd21c14192f3307f2623" Mar 20 08:46:38 crc kubenswrapper[4903]: I0320 08:46:38.254740 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 20 08:46:38 crc kubenswrapper[4903]: I0320 08:46:38.318095 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 20 08:46:38 crc kubenswrapper[4903]: E0320 08:46:38.318624 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b11fccf-eea3-450d-b460-013086becb0c" containerName="ceilometer-notification-agent" Mar 20 08:46:38 crc kubenswrapper[4903]: I0320 08:46:38.318641 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b11fccf-eea3-450d-b460-013086becb0c" 
containerName="ceilometer-notification-agent" Mar 20 08:46:38 crc kubenswrapper[4903]: E0320 08:46:38.318663 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b11fccf-eea3-450d-b460-013086becb0c" containerName="proxy-httpd" Mar 20 08:46:38 crc kubenswrapper[4903]: I0320 08:46:38.318671 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b11fccf-eea3-450d-b460-013086becb0c" containerName="proxy-httpd" Mar 20 08:46:38 crc kubenswrapper[4903]: E0320 08:46:38.318677 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b11fccf-eea3-450d-b460-013086becb0c" containerName="ceilometer-central-agent" Mar 20 08:46:38 crc kubenswrapper[4903]: I0320 08:46:38.318684 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b11fccf-eea3-450d-b460-013086becb0c" containerName="ceilometer-central-agent" Mar 20 08:46:38 crc kubenswrapper[4903]: E0320 08:46:38.318703 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b11fccf-eea3-450d-b460-013086becb0c" containerName="sg-core" Mar 20 08:46:38 crc kubenswrapper[4903]: I0320 08:46:38.318712 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b11fccf-eea3-450d-b460-013086becb0c" containerName="sg-core" Mar 20 08:46:38 crc kubenswrapper[4903]: I0320 08:46:38.318890 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b11fccf-eea3-450d-b460-013086becb0c" containerName="proxy-httpd" Mar 20 08:46:38 crc kubenswrapper[4903]: I0320 08:46:38.318905 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b11fccf-eea3-450d-b460-013086becb0c" containerName="sg-core" Mar 20 08:46:38 crc kubenswrapper[4903]: I0320 08:46:38.318915 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b11fccf-eea3-450d-b460-013086becb0c" containerName="ceilometer-notification-agent" Mar 20 08:46:38 crc kubenswrapper[4903]: I0320 08:46:38.318925 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b11fccf-eea3-450d-b460-013086becb0c" containerName="ceilometer-central-agent" Mar 20 08:46:38 crc kubenswrapper[4903]: I0320 08:46:38.320708 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 08:46:38 crc kubenswrapper[4903]: I0320 08:46:38.324315 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 20 08:46:38 crc kubenswrapper[4903]: I0320 08:46:38.324508 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 20 08:46:38 crc kubenswrapper[4903]: I0320 08:46:38.328000 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 20 08:46:38 crc kubenswrapper[4903]: I0320 08:46:38.358363 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 08:46:38 crc kubenswrapper[4903]: I0320 08:46:38.385232 4903 scope.go:117] "RemoveContainer" containerID="f2bf3db3a69c408a2742f89249166339c6ac9359d46b3b581afd7322538aed8a" Mar 20 08:46:38 crc kubenswrapper[4903]: I0320 08:46:38.418330 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff89c989-6e0a-42bb-98b6-f1b7e0289604-config-data\") pod \"ceilometer-0\" (UID: \"ff89c989-6e0a-42bb-98b6-f1b7e0289604\") " pod="openstack/ceilometer-0" Mar 20 08:46:38 crc kubenswrapper[4903]: I0320 08:46:38.418427 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ff89c989-6e0a-42bb-98b6-f1b7e0289604-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ff89c989-6e0a-42bb-98b6-f1b7e0289604\") " pod="openstack/ceilometer-0" Mar 20 08:46:38 crc kubenswrapper[4903]: I0320 08:46:38.418454 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ff89c989-6e0a-42bb-98b6-f1b7e0289604-log-httpd\") pod \"ceilometer-0\" (UID: \"ff89c989-6e0a-42bb-98b6-f1b7e0289604\") " pod="openstack/ceilometer-0" Mar 20 08:46:38 crc kubenswrapper[4903]: I0320 08:46:38.418500 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff89c989-6e0a-42bb-98b6-f1b7e0289604-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"ff89c989-6e0a-42bb-98b6-f1b7e0289604\") " pod="openstack/ceilometer-0" Mar 20 08:46:38 crc kubenswrapper[4903]: I0320 08:46:38.418537 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ff89c989-6e0a-42bb-98b6-f1b7e0289604-run-httpd\") pod \"ceilometer-0\" (UID: \"ff89c989-6e0a-42bb-98b6-f1b7e0289604\") " pod="openstack/ceilometer-0" Mar 20 08:46:38 crc kubenswrapper[4903]: I0320 08:46:38.418556 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ff89c989-6e0a-42bb-98b6-f1b7e0289604-scripts\") pod \"ceilometer-0\" (UID: \"ff89c989-6e0a-42bb-98b6-f1b7e0289604\") " pod="openstack/ceilometer-0" Mar 20 08:46:38 crc kubenswrapper[4903]: I0320 08:46:38.418599 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-txdxs\" (UniqueName: \"kubernetes.io/projected/ff89c989-6e0a-42bb-98b6-f1b7e0289604-kube-api-access-txdxs\") pod \"ceilometer-0\" (UID: \"ff89c989-6e0a-42bb-98b6-f1b7e0289604\") " pod="openstack/ceilometer-0" Mar 20 08:46:38 crc kubenswrapper[4903]: I0320 
08:46:38.418614 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff89c989-6e0a-42bb-98b6-f1b7e0289604-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ff89c989-6e0a-42bb-98b6-f1b7e0289604\") " pod="openstack/ceilometer-0" Mar 20 08:46:38 crc kubenswrapper[4903]: I0320 08:46:38.455184 4903 scope.go:117] "RemoveContainer" containerID="09d0f8d07a129edf0703805433590c950927e505f25614bbc396130dc7d3efab" Mar 20 08:46:38 crc kubenswrapper[4903]: I0320 08:46:38.480632 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 08:46:38 crc kubenswrapper[4903]: I0320 08:46:38.520663 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff89c989-6e0a-42bb-98b6-f1b7e0289604-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"ff89c989-6e0a-42bb-98b6-f1b7e0289604\") " pod="openstack/ceilometer-0" Mar 20 08:46:38 crc kubenswrapper[4903]: I0320 08:46:38.520732 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ff89c989-6e0a-42bb-98b6-f1b7e0289604-run-httpd\") pod \"ceilometer-0\" (UID: \"ff89c989-6e0a-42bb-98b6-f1b7e0289604\") " pod="openstack/ceilometer-0" Mar 20 08:46:38 crc kubenswrapper[4903]: I0320 08:46:38.520758 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ff89c989-6e0a-42bb-98b6-f1b7e0289604-scripts\") pod \"ceilometer-0\" (UID: \"ff89c989-6e0a-42bb-98b6-f1b7e0289604\") " pod="openstack/ceilometer-0" Mar 20 08:46:38 crc kubenswrapper[4903]: I0320 08:46:38.520804 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-txdxs\" (UniqueName: \"kubernetes.io/projected/ff89c989-6e0a-42bb-98b6-f1b7e0289604-kube-api-access-txdxs\") pod \"ceilometer-0\" (UID: \"ff89c989-6e0a-42bb-98b6-f1b7e0289604\") " pod="openstack/ceilometer-0" Mar 20 08:46:38 crc kubenswrapper[4903]: I0320 08:46:38.520822 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff89c989-6e0a-42bb-98b6-f1b7e0289604-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ff89c989-6e0a-42bb-98b6-f1b7e0289604\") " pod="openstack/ceilometer-0" Mar 20 08:46:38 crc kubenswrapper[4903]: I0320 08:46:38.520846 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff89c989-6e0a-42bb-98b6-f1b7e0289604-config-data\") pod \"ceilometer-0\" (UID: \"ff89c989-6e0a-42bb-98b6-f1b7e0289604\") " pod="openstack/ceilometer-0" Mar 20 08:46:38 crc kubenswrapper[4903]: I0320 08:46:38.520907 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ff89c989-6e0a-42bb-98b6-f1b7e0289604-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ff89c989-6e0a-42bb-98b6-f1b7e0289604\") " pod="openstack/ceilometer-0" Mar 20 08:46:38 crc kubenswrapper[4903]: I0320 08:46:38.520932 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ff89c989-6e0a-42bb-98b6-f1b7e0289604-log-httpd\") pod \"ceilometer-0\" (UID: \"ff89c989-6e0a-42bb-98b6-f1b7e0289604\") " pod="openstack/ceilometer-0" Mar 20 08:46:38 crc 
kubenswrapper[4903]: I0320 08:46:38.521392 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ff89c989-6e0a-42bb-98b6-f1b7e0289604-log-httpd\") pod \"ceilometer-0\" (UID: \"ff89c989-6e0a-42bb-98b6-f1b7e0289604\") " pod="openstack/ceilometer-0" Mar 20 08:46:38 crc kubenswrapper[4903]: I0320 08:46:38.523843 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ff89c989-6e0a-42bb-98b6-f1b7e0289604-run-httpd\") pod \"ceilometer-0\" (UID: \"ff89c989-6e0a-42bb-98b6-f1b7e0289604\") " pod="openstack/ceilometer-0" Mar 20 08:46:38 crc kubenswrapper[4903]: I0320 08:46:38.550839 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ff89c989-6e0a-42bb-98b6-f1b7e0289604-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ff89c989-6e0a-42bb-98b6-f1b7e0289604\") " pod="openstack/ceilometer-0" Mar 20 08:46:38 crc kubenswrapper[4903]: I0320 08:46:38.565801 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ff89c989-6e0a-42bb-98b6-f1b7e0289604-scripts\") pod \"ceilometer-0\" (UID: \"ff89c989-6e0a-42bb-98b6-f1b7e0289604\") " pod="openstack/ceilometer-0" Mar 20 08:46:38 crc kubenswrapper[4903]: I0320 08:46:38.567466 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff89c989-6e0a-42bb-98b6-f1b7e0289604-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ff89c989-6e0a-42bb-98b6-f1b7e0289604\") " pod="openstack/ceilometer-0" Mar 20 08:46:38 crc kubenswrapper[4903]: I0320 08:46:38.576827 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff89c989-6e0a-42bb-98b6-f1b7e0289604-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"ff89c989-6e0a-42bb-98b6-f1b7e0289604\") " pod="openstack/ceilometer-0" Mar 20 08:46:38 crc kubenswrapper[4903]: I0320 08:46:38.577834 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff89c989-6e0a-42bb-98b6-f1b7e0289604-config-data\") pod \"ceilometer-0\" (UID: \"ff89c989-6e0a-42bb-98b6-f1b7e0289604\") " pod="openstack/ceilometer-0" Mar 20 08:46:38 crc kubenswrapper[4903]: I0320 08:46:38.580874 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-txdxs\" (UniqueName: \"kubernetes.io/projected/ff89c989-6e0a-42bb-98b6-f1b7e0289604-kube-api-access-txdxs\") pod \"ceilometer-0\" (UID: \"ff89c989-6e0a-42bb-98b6-f1b7e0289604\") " pod="openstack/ceilometer-0" Mar 20 08:46:38 crc kubenswrapper[4903]: I0320 08:46:38.583259 4903 scope.go:117] "RemoveContainer" containerID="f5d6e986dfa73b3a3f49723209dafdda83c4c64eeb89b74b60cca6611cceb00b" Mar 20 08:46:38 crc kubenswrapper[4903]: E0320 08:46:38.583905 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f5d6e986dfa73b3a3f49723209dafdda83c4c64eeb89b74b60cca6611cceb00b\": container with ID starting with f5d6e986dfa73b3a3f49723209dafdda83c4c64eeb89b74b60cca6611cceb00b not found: ID does not exist" containerID="f5d6e986dfa73b3a3f49723209dafdda83c4c64eeb89b74b60cca6611cceb00b" Mar 20 08:46:38 crc kubenswrapper[4903]: I0320 08:46:38.583941 4903 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"f5d6e986dfa73b3a3f49723209dafdda83c4c64eeb89b74b60cca6611cceb00b"} err="failed to get container status \"f5d6e986dfa73b3a3f49723209dafdda83c4c64eeb89b74b60cca6611cceb00b\": rpc error: code = NotFound desc = could not find container \"f5d6e986dfa73b3a3f49723209dafdda83c4c64eeb89b74b60cca6611cceb00b\": container with ID starting with f5d6e986dfa73b3a3f49723209dafdda83c4c64eeb89b74b60cca6611cceb00b not found: ID does not exist" Mar 20 08:46:38 crc kubenswrapper[4903]: I0320 08:46:38.583964 4903 scope.go:117] "RemoveContainer" containerID="8232a6bcbff9ffc96024d73c35e8fd73728b55a64cc9fd21c14192f3307f2623" Mar 20 08:46:38 crc kubenswrapper[4903]: E0320 08:46:38.585706 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8232a6bcbff9ffc96024d73c35e8fd73728b55a64cc9fd21c14192f3307f2623\": container with ID starting with 8232a6bcbff9ffc96024d73c35e8fd73728b55a64cc9fd21c14192f3307f2623 not found: ID does not exist" containerID="8232a6bcbff9ffc96024d73c35e8fd73728b55a64cc9fd21c14192f3307f2623" Mar 20 08:46:38 crc kubenswrapper[4903]: I0320 08:46:38.585738 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8232a6bcbff9ffc96024d73c35e8fd73728b55a64cc9fd21c14192f3307f2623"} err="failed to get container status \"8232a6bcbff9ffc96024d73c35e8fd73728b55a64cc9fd21c14192f3307f2623\": rpc error: code = NotFound desc = could not find container \"8232a6bcbff9ffc96024d73c35e8fd73728b55a64cc9fd21c14192f3307f2623\": container with ID starting with 8232a6bcbff9ffc96024d73c35e8fd73728b55a64cc9fd21c14192f3307f2623 not found: ID does not exist" Mar 20 08:46:38 crc kubenswrapper[4903]: I0320 08:46:38.585754 4903 scope.go:117] "RemoveContainer" containerID="f2bf3db3a69c408a2742f89249166339c6ac9359d46b3b581afd7322538aed8a" Mar 20 08:46:38 crc kubenswrapper[4903]: E0320 08:46:38.586123 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f2bf3db3a69c408a2742f89249166339c6ac9359d46b3b581afd7322538aed8a\": container with ID starting with f2bf3db3a69c408a2742f89249166339c6ac9359d46b3b581afd7322538aed8a not found: ID does not exist" containerID="f2bf3db3a69c408a2742f89249166339c6ac9359d46b3b581afd7322538aed8a" Mar 20 08:46:38 crc kubenswrapper[4903]: I0320 08:46:38.586170 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f2bf3db3a69c408a2742f89249166339c6ac9359d46b3b581afd7322538aed8a"} err="failed to get container status \"f2bf3db3a69c408a2742f89249166339c6ac9359d46b3b581afd7322538aed8a\": rpc error: code = NotFound desc = could not find container \"f2bf3db3a69c408a2742f89249166339c6ac9359d46b3b581afd7322538aed8a\": container with ID starting with f2bf3db3a69c408a2742f89249166339c6ac9359d46b3b581afd7322538aed8a not found: ID does not exist" Mar 20 08:46:38 crc kubenswrapper[4903]: I0320 08:46:38.586207 4903 scope.go:117] "RemoveContainer" containerID="09d0f8d07a129edf0703805433590c950927e505f25614bbc396130dc7d3efab" Mar 20 08:46:38 crc kubenswrapper[4903]: E0320 08:46:38.586463 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"09d0f8d07a129edf0703805433590c950927e505f25614bbc396130dc7d3efab\": container with ID starting with 09d0f8d07a129edf0703805433590c950927e505f25614bbc396130dc7d3efab not found: ID does not exist" 
containerID="09d0f8d07a129edf0703805433590c950927e505f25614bbc396130dc7d3efab" Mar 20 08:46:38 crc kubenswrapper[4903]: I0320 08:46:38.586497 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"09d0f8d07a129edf0703805433590c950927e505f25614bbc396130dc7d3efab"} err="failed to get container status \"09d0f8d07a129edf0703805433590c950927e505f25614bbc396130dc7d3efab\": rpc error: code = NotFound desc = could not find container \"09d0f8d07a129edf0703805433590c950927e505f25614bbc396130dc7d3efab\": container with ID starting with 09d0f8d07a129edf0703805433590c950927e505f25614bbc396130dc7d3efab not found: ID does not exist" Mar 20 08:46:38 crc kubenswrapper[4903]: I0320 08:46:38.648764 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 08:46:39 crc kubenswrapper[4903]: I0320 08:46:39.151623 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 08:46:39 crc kubenswrapper[4903]: W0320 08:46:39.173259 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podff89c989_6e0a_42bb_98b6_f1b7e0289604.slice/crio-79bf7cb015ad041eec4b51c7ebb6ee03c794dd00024b9cb65a86ecbbbb23bf4d WatchSource:0}: Error finding container 79bf7cb015ad041eec4b51c7ebb6ee03c794dd00024b9cb65a86ecbbbb23bf4d: Status 404 returned error can't find the container with id 79bf7cb015ad041eec4b51c7ebb6ee03c794dd00024b9cb65a86ecbbbb23bf4d Mar 20 08:46:39 crc kubenswrapper[4903]: I0320 08:46:39.186014 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"f4e1dbd8-6ecd-4cd7-910b-910cf6a45679","Type":"ContainerStarted","Data":"e8f4896be311871f3ac0108063c58ec2c400c0620e5af12b5e7ff951a80e4061"} Mar 20 08:46:39 crc kubenswrapper[4903]: I0320 08:46:39.187936 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2c33b2cd-e705-41cd-9e59-3dcbb0a55829","Type":"ContainerStarted","Data":"036996cd6559515dd00dacbdb33836ee1417dee1983310445b9597aed681a5db"} Mar 20 08:46:39 crc kubenswrapper[4903]: I0320 08:46:39.213907 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=4.213887745 podStartE2EDuration="4.213887745s" podCreationTimestamp="2026-03-20 08:46:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:46:39.204335523 +0000 UTC m=+1424.421235838" watchObservedRunningTime="2026-03-20 08:46:39.213887745 +0000 UTC m=+1424.430788060" Mar 20 08:46:39 crc kubenswrapper[4903]: I0320 08:46:39.515315 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4b11fccf-eea3-450d-b460-013086becb0c" path="/var/lib/kubelet/pods/4b11fccf-eea3-450d-b460-013086becb0c/volumes" Mar 20 08:46:39 crc kubenswrapper[4903]: I0320 08:46:39.659388 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-sxx4b"] Mar 20 08:46:39 crc kubenswrapper[4903]: I0320 08:46:39.661281 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-sxx4b" Mar 20 08:46:39 crc kubenswrapper[4903]: I0320 08:46:39.664231 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Mar 20 08:46:39 crc kubenswrapper[4903]: I0320 08:46:39.664234 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-7wjch" Mar 20 08:46:39 crc kubenswrapper[4903]: I0320 08:46:39.664474 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Mar 20 08:46:39 crc kubenswrapper[4903]: I0320 08:46:39.673547 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-sxx4b"] Mar 20 08:46:39 crc kubenswrapper[4903]: I0320 08:46:39.755651 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4bb8fa20-aceb-4b12-b104-e1594993b20c-config-data\") pod \"nova-cell0-conductor-db-sync-sxx4b\" (UID: \"4bb8fa20-aceb-4b12-b104-e1594993b20c\") " pod="openstack/nova-cell0-conductor-db-sync-sxx4b" Mar 20 08:46:39 crc kubenswrapper[4903]: I0320 08:46:39.756277 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rmmk5\" (UniqueName: \"kubernetes.io/projected/4bb8fa20-aceb-4b12-b104-e1594993b20c-kube-api-access-rmmk5\") pod \"nova-cell0-conductor-db-sync-sxx4b\" (UID: \"4bb8fa20-aceb-4b12-b104-e1594993b20c\") " pod="openstack/nova-cell0-conductor-db-sync-sxx4b" Mar 20 08:46:39 crc kubenswrapper[4903]: I0320 08:46:39.756684 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4bb8fa20-aceb-4b12-b104-e1594993b20c-scripts\") pod \"nova-cell0-conductor-db-sync-sxx4b\" (UID: \"4bb8fa20-aceb-4b12-b104-e1594993b20c\") " pod="openstack/nova-cell0-conductor-db-sync-sxx4b" Mar 20 08:46:39 crc kubenswrapper[4903]: I0320 08:46:39.756762 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4bb8fa20-aceb-4b12-b104-e1594993b20c-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-sxx4b\" (UID: \"4bb8fa20-aceb-4b12-b104-e1594993b20c\") " pod="openstack/nova-cell0-conductor-db-sync-sxx4b" Mar 20 08:46:39 crc kubenswrapper[4903]: I0320 08:46:39.858662 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rmmk5\" (UniqueName: \"kubernetes.io/projected/4bb8fa20-aceb-4b12-b104-e1594993b20c-kube-api-access-rmmk5\") pod \"nova-cell0-conductor-db-sync-sxx4b\" (UID: \"4bb8fa20-aceb-4b12-b104-e1594993b20c\") " pod="openstack/nova-cell0-conductor-db-sync-sxx4b" Mar 20 08:46:39 crc kubenswrapper[4903]: I0320 08:46:39.858769 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4bb8fa20-aceb-4b12-b104-e1594993b20c-scripts\") pod \"nova-cell0-conductor-db-sync-sxx4b\" (UID: \"4bb8fa20-aceb-4b12-b104-e1594993b20c\") " pod="openstack/nova-cell0-conductor-db-sync-sxx4b" Mar 20 08:46:39 crc kubenswrapper[4903]: I0320 08:46:39.858800 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4bb8fa20-aceb-4b12-b104-e1594993b20c-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-sxx4b\" (UID: 
\"4bb8fa20-aceb-4b12-b104-e1594993b20c\") " pod="openstack/nova-cell0-conductor-db-sync-sxx4b" Mar 20 08:46:39 crc kubenswrapper[4903]: I0320 08:46:39.858839 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4bb8fa20-aceb-4b12-b104-e1594993b20c-config-data\") pod \"nova-cell0-conductor-db-sync-sxx4b\" (UID: \"4bb8fa20-aceb-4b12-b104-e1594993b20c\") " pod="openstack/nova-cell0-conductor-db-sync-sxx4b" Mar 20 08:46:39 crc kubenswrapper[4903]: I0320 08:46:39.864350 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4bb8fa20-aceb-4b12-b104-e1594993b20c-config-data\") pod \"nova-cell0-conductor-db-sync-sxx4b\" (UID: \"4bb8fa20-aceb-4b12-b104-e1594993b20c\") " pod="openstack/nova-cell0-conductor-db-sync-sxx4b" Mar 20 08:46:39 crc kubenswrapper[4903]: I0320 08:46:39.864483 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4bb8fa20-aceb-4b12-b104-e1594993b20c-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-sxx4b\" (UID: \"4bb8fa20-aceb-4b12-b104-e1594993b20c\") " pod="openstack/nova-cell0-conductor-db-sync-sxx4b" Mar 20 08:46:39 crc kubenswrapper[4903]: I0320 08:46:39.868608 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4bb8fa20-aceb-4b12-b104-e1594993b20c-scripts\") pod \"nova-cell0-conductor-db-sync-sxx4b\" (UID: \"4bb8fa20-aceb-4b12-b104-e1594993b20c\") " pod="openstack/nova-cell0-conductor-db-sync-sxx4b" Mar 20 08:46:39 crc kubenswrapper[4903]: I0320 08:46:39.877097 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rmmk5\" (UniqueName: \"kubernetes.io/projected/4bb8fa20-aceb-4b12-b104-e1594993b20c-kube-api-access-rmmk5\") pod \"nova-cell0-conductor-db-sync-sxx4b\" (UID: \"4bb8fa20-aceb-4b12-b104-e1594993b20c\") " pod="openstack/nova-cell0-conductor-db-sync-sxx4b" Mar 20 08:46:39 crc kubenswrapper[4903]: I0320 08:46:39.998439 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-sxx4b" Mar 20 08:46:40 crc kubenswrapper[4903]: I0320 08:46:40.269387 4903 generic.go:334] "Generic (PLEG): container finished" podID="6453cb9a-76a4-412f-9cb8-964c20a217ca" containerID="c8a35f3396f6369e4c3eda1b193db1137374b7218f722935ca859014ff6167e1" exitCode=0 Mar 20 08:46:40 crc kubenswrapper[4903]: I0320 08:46:40.269578 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7bdb6dfbd4-xpx45" event={"ID":"6453cb9a-76a4-412f-9cb8-964c20a217ca","Type":"ContainerDied","Data":"c8a35f3396f6369e4c3eda1b193db1137374b7218f722935ca859014ff6167e1"} Mar 20 08:46:40 crc kubenswrapper[4903]: I0320 08:46:40.274544 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ff89c989-6e0a-42bb-98b6-f1b7e0289604","Type":"ContainerStarted","Data":"5f1d0a1d934370382a9b94db03418abe63c879ba6699c7044da96e96f1689818"} Mar 20 08:46:40 crc kubenswrapper[4903]: I0320 08:46:40.274591 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ff89c989-6e0a-42bb-98b6-f1b7e0289604","Type":"ContainerStarted","Data":"79bf7cb015ad041eec4b51c7ebb6ee03c794dd00024b9cb65a86ecbbbb23bf4d"} Mar 20 08:46:40 crc kubenswrapper[4903]: I0320 08:46:40.279765 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"f4e1dbd8-6ecd-4cd7-910b-910cf6a45679","Type":"ContainerStarted","Data":"8c4db124274a0f1bc8c0540b96509dad9dbb16ca409f3c709b225f29f322a39d"} Mar 20 08:46:40 crc kubenswrapper[4903]: I0320 08:46:40.279839 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"f4e1dbd8-6ecd-4cd7-910b-910cf6a45679","Type":"ContainerStarted","Data":"e9d3a5c1b4f80a807d17559887982f63b2f87da65ce30784f6d98d67b9f43363"} Mar 20 08:46:40 crc kubenswrapper[4903]: I0320 08:46:40.307699 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.307669249 podStartE2EDuration="3.307669249s" podCreationTimestamp="2026-03-20 08:46:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:46:40.30211251 +0000 UTC m=+1425.519012825" watchObservedRunningTime="2026-03-20 08:46:40.307669249 +0000 UTC m=+1425.524569564" Mar 20 08:46:40 crc kubenswrapper[4903]: I0320 08:46:40.573076 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-sxx4b"] Mar 20 08:46:40 crc kubenswrapper[4903]: I0320 08:46:40.664774 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-7bdb6dfbd4-xpx45" Mar 20 08:46:40 crc kubenswrapper[4903]: I0320 08:46:40.776543 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/6453cb9a-76a4-412f-9cb8-964c20a217ca-httpd-config\") pod \"6453cb9a-76a4-412f-9cb8-964c20a217ca\" (UID: \"6453cb9a-76a4-412f-9cb8-964c20a217ca\") " Mar 20 08:46:40 crc kubenswrapper[4903]: I0320 08:46:40.777222 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/6453cb9a-76a4-412f-9cb8-964c20a217ca-config\") pod \"6453cb9a-76a4-412f-9cb8-964c20a217ca\" (UID: \"6453cb9a-76a4-412f-9cb8-964c20a217ca\") " Mar 20 08:46:40 crc kubenswrapper[4903]: I0320 08:46:40.777294 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6453cb9a-76a4-412f-9cb8-964c20a217ca-ovndb-tls-certs\") pod \"6453cb9a-76a4-412f-9cb8-964c20a217ca\" (UID: \"6453cb9a-76a4-412f-9cb8-964c20a217ca\") " Mar 20 08:46:40 crc kubenswrapper[4903]: I0320 08:46:40.777510 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6453cb9a-76a4-412f-9cb8-964c20a217ca-combined-ca-bundle\") pod \"6453cb9a-76a4-412f-9cb8-964c20a217ca\" (UID: \"6453cb9a-76a4-412f-9cb8-964c20a217ca\") " Mar 20 08:46:40 crc kubenswrapper[4903]: I0320 08:46:40.777709 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vqlb6\" (UniqueName: \"kubernetes.io/projected/6453cb9a-76a4-412f-9cb8-964c20a217ca-kube-api-access-vqlb6\") pod \"6453cb9a-76a4-412f-9cb8-964c20a217ca\" (UID: \"6453cb9a-76a4-412f-9cb8-964c20a217ca\") " Mar 20 08:46:40 crc kubenswrapper[4903]: I0320 08:46:40.786681 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6453cb9a-76a4-412f-9cb8-964c20a217ca-kube-api-access-vqlb6" (OuterVolumeSpecName: "kube-api-access-vqlb6") pod "6453cb9a-76a4-412f-9cb8-964c20a217ca" (UID: "6453cb9a-76a4-412f-9cb8-964c20a217ca"). InnerVolumeSpecName "kube-api-access-vqlb6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:46:40 crc kubenswrapper[4903]: I0320 08:46:40.791142 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6453cb9a-76a4-412f-9cb8-964c20a217ca-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "6453cb9a-76a4-412f-9cb8-964c20a217ca" (UID: "6453cb9a-76a4-412f-9cb8-964c20a217ca"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:46:40 crc kubenswrapper[4903]: I0320 08:46:40.838534 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6453cb9a-76a4-412f-9cb8-964c20a217ca-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6453cb9a-76a4-412f-9cb8-964c20a217ca" (UID: "6453cb9a-76a4-412f-9cb8-964c20a217ca"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:46:40 crc kubenswrapper[4903]: I0320 08:46:40.848448 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6453cb9a-76a4-412f-9cb8-964c20a217ca-config" (OuterVolumeSpecName: "config") pod "6453cb9a-76a4-412f-9cb8-964c20a217ca" (UID: "6453cb9a-76a4-412f-9cb8-964c20a217ca"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:46:40 crc kubenswrapper[4903]: I0320 08:46:40.880433 4903 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/6453cb9a-76a4-412f-9cb8-964c20a217ca-config\") on node \"crc\" DevicePath \"\"" Mar 20 08:46:40 crc kubenswrapper[4903]: I0320 08:46:40.880473 4903 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6453cb9a-76a4-412f-9cb8-964c20a217ca-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 08:46:40 crc kubenswrapper[4903]: I0320 08:46:40.880483 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vqlb6\" (UniqueName: \"kubernetes.io/projected/6453cb9a-76a4-412f-9cb8-964c20a217ca-kube-api-access-vqlb6\") on node \"crc\" DevicePath \"\"" Mar 20 08:46:40 crc kubenswrapper[4903]: I0320 08:46:40.880491 4903 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/6453cb9a-76a4-412f-9cb8-964c20a217ca-httpd-config\") on node \"crc\" DevicePath \"\"" Mar 20 08:46:40 crc kubenswrapper[4903]: I0320 08:46:40.880721 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6453cb9a-76a4-412f-9cb8-964c20a217ca-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "6453cb9a-76a4-412f-9cb8-964c20a217ca" (UID: "6453cb9a-76a4-412f-9cb8-964c20a217ca"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:46:40 crc kubenswrapper[4903]: I0320 08:46:40.983377 4903 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6453cb9a-76a4-412f-9cb8-964c20a217ca-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 08:46:41 crc kubenswrapper[4903]: I0320 08:46:41.295380 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-7bdb6dfbd4-xpx45" Mar 20 08:46:41 crc kubenswrapper[4903]: I0320 08:46:41.296528 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7bdb6dfbd4-xpx45" event={"ID":"6453cb9a-76a4-412f-9cb8-964c20a217ca","Type":"ContainerDied","Data":"f2fb1e2ccded77e9a488b0947185a52d6b9748c3171c7dff351f3f511ca16cf7"} Mar 20 08:46:41 crc kubenswrapper[4903]: I0320 08:46:41.296596 4903 scope.go:117] "RemoveContainer" containerID="48910b95b55fa683cebc5b748d90b941ea8034a5f459c44d84750b8def8b501e" Mar 20 08:46:41 crc kubenswrapper[4903]: I0320 08:46:41.299303 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-sxx4b" event={"ID":"4bb8fa20-aceb-4b12-b104-e1594993b20c","Type":"ContainerStarted","Data":"c8c293b6abe5051bd8f0560c106394eacc6eb88a990972b00e89e128924e2787"} Mar 20 08:46:41 crc kubenswrapper[4903]: I0320 08:46:41.304464 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ff89c989-6e0a-42bb-98b6-f1b7e0289604","Type":"ContainerStarted","Data":"b801b65f5a48e40638a6375d4708c7a627f9d68aa9aaf2766e8b926fd2c07440"} Mar 20 08:46:41 crc kubenswrapper[4903]: I0320 08:46:41.343631 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-7bdb6dfbd4-xpx45"] Mar 20 08:46:41 crc kubenswrapper[4903]: I0320 08:46:41.353889 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-7bdb6dfbd4-xpx45"] Mar 20 08:46:41 crc kubenswrapper[4903]: I0320 08:46:41.357654 4903 scope.go:117] "RemoveContainer" containerID="c8a35f3396f6369e4c3eda1b193db1137374b7218f722935ca859014ff6167e1" Mar 20 08:46:41 crc kubenswrapper[4903]: I0320 08:46:41.513316 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6453cb9a-76a4-412f-9cb8-964c20a217ca" path="/var/lib/kubelet/pods/6453cb9a-76a4-412f-9cb8-964c20a217ca/volumes" Mar 20 08:46:42 crc kubenswrapper[4903]: I0320 08:46:42.352061 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ff89c989-6e0a-42bb-98b6-f1b7e0289604","Type":"ContainerStarted","Data":"0099c5fd8858bed38df7e47718d27798aabf6dca737fb1eba2406be69c453b90"} Mar 20 08:46:42 crc kubenswrapper[4903]: I0320 08:46:42.417640 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Mar 20 08:46:42 crc kubenswrapper[4903]: I0320 08:46:42.597162 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-669bcbb856-w87fq" Mar 20 08:46:42 crc kubenswrapper[4903]: I0320 08:46:42.672221 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-669bcbb856-w87fq" Mar 20 08:46:42 crc kubenswrapper[4903]: I0320 08:46:42.747816 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-78d46df76d-rh79h"] Mar 20 08:46:42 crc kubenswrapper[4903]: I0320 08:46:42.748110 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-78d46df76d-rh79h" podUID="edfd7894-cb6d-43bb-87ab-289c00d2a8f7" containerName="placement-log" containerID="cri-o://cb4f4de67e2cde7b1eced841f79e2032becde10069a1a66d2e124e4fe96dfea5" gracePeriod=30 Mar 20 08:46:42 crc kubenswrapper[4903]: I0320 08:46:42.748295 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-78d46df76d-rh79h" podUID="edfd7894-cb6d-43bb-87ab-289c00d2a8f7" containerName="placement-api" 
containerID="cri-o://f03767fef05a731d00ae668183b5270fada9ddab13c73cdefd1db6d77b290014" gracePeriod=30 Mar 20 08:46:43 crc kubenswrapper[4903]: I0320 08:46:43.382323 4903 generic.go:334] "Generic (PLEG): container finished" podID="edfd7894-cb6d-43bb-87ab-289c00d2a8f7" containerID="cb4f4de67e2cde7b1eced841f79e2032becde10069a1a66d2e124e4fe96dfea5" exitCode=143 Mar 20 08:46:43 crc kubenswrapper[4903]: I0320 08:46:43.382406 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-78d46df76d-rh79h" event={"ID":"edfd7894-cb6d-43bb-87ab-289c00d2a8f7","Type":"ContainerDied","Data":"cb4f4de67e2cde7b1eced841f79e2032becde10069a1a66d2e124e4fe96dfea5"} Mar 20 08:46:44 crc kubenswrapper[4903]: I0320 08:46:44.400806 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ff89c989-6e0a-42bb-98b6-f1b7e0289604","Type":"ContainerStarted","Data":"aa24af5b8127749cccf8645f0fc9671e1a3b4731752ecc47e0f448864f20f650"} Mar 20 08:46:44 crc kubenswrapper[4903]: I0320 08:46:44.401069 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 20 08:46:44 crc kubenswrapper[4903]: I0320 08:46:44.425401 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.883623531 podStartE2EDuration="6.425382547s" podCreationTimestamp="2026-03-20 08:46:38 +0000 UTC" firstStartedPulling="2026-03-20 08:46:39.178247605 +0000 UTC m=+1424.395147920" lastFinishedPulling="2026-03-20 08:46:43.720006631 +0000 UTC m=+1428.936906936" observedRunningTime="2026-03-20 08:46:44.421878115 +0000 UTC m=+1429.638778440" watchObservedRunningTime="2026-03-20 08:46:44.425382547 +0000 UTC m=+1429.642282862" Mar 20 08:46:44 crc kubenswrapper[4903]: I0320 08:46:44.847429 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 08:46:45 crc kubenswrapper[4903]: I0320 08:46:45.034018 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Mar 20 08:46:46 crc kubenswrapper[4903]: I0320 08:46:46.048380 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 20 08:46:46 crc kubenswrapper[4903]: I0320 08:46:46.049804 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 20 08:46:46 crc kubenswrapper[4903]: I0320 08:46:46.094337 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 20 08:46:46 crc kubenswrapper[4903]: I0320 08:46:46.105243 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 20 08:46:46 crc kubenswrapper[4903]: I0320 08:46:46.432476 4903 generic.go:334] "Generic (PLEG): container finished" podID="edfd7894-cb6d-43bb-87ab-289c00d2a8f7" containerID="f03767fef05a731d00ae668183b5270fada9ddab13c73cdefd1db6d77b290014" exitCode=0 Mar 20 08:46:46 crc kubenswrapper[4903]: I0320 08:46:46.432737 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ff89c989-6e0a-42bb-98b6-f1b7e0289604" containerName="ceilometer-central-agent" containerID="cri-o://5f1d0a1d934370382a9b94db03418abe63c879ba6699c7044da96e96f1689818" gracePeriod=30 Mar 20 08:46:46 crc kubenswrapper[4903]: I0320 08:46:46.432815 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/placement-78d46df76d-rh79h" event={"ID":"edfd7894-cb6d-43bb-87ab-289c00d2a8f7","Type":"ContainerDied","Data":"f03767fef05a731d00ae668183b5270fada9ddab13c73cdefd1db6d77b290014"} Mar 20 08:46:46 crc kubenswrapper[4903]: I0320 08:46:46.433260 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ff89c989-6e0a-42bb-98b6-f1b7e0289604" containerName="ceilometer-notification-agent" containerID="cri-o://b801b65f5a48e40638a6375d4708c7a627f9d68aa9aaf2766e8b926fd2c07440" gracePeriod=30 Mar 20 08:46:46 crc kubenswrapper[4903]: I0320 08:46:46.433317 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ff89c989-6e0a-42bb-98b6-f1b7e0289604" containerName="proxy-httpd" containerID="cri-o://aa24af5b8127749cccf8645f0fc9671e1a3b4731752ecc47e0f448864f20f650" gracePeriod=30 Mar 20 08:46:46 crc kubenswrapper[4903]: I0320 08:46:46.433332 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ff89c989-6e0a-42bb-98b6-f1b7e0289604" containerName="sg-core" containerID="cri-o://0099c5fd8858bed38df7e47718d27798aabf6dca737fb1eba2406be69c453b90" gracePeriod=30 Mar 20 08:46:46 crc kubenswrapper[4903]: I0320 08:46:46.433653 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 20 08:46:46 crc kubenswrapper[4903]: I0320 08:46:46.433684 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 20 08:46:47 crc kubenswrapper[4903]: I0320 08:46:47.464661 4903 generic.go:334] "Generic (PLEG): container finished" podID="ff89c989-6e0a-42bb-98b6-f1b7e0289604" containerID="aa24af5b8127749cccf8645f0fc9671e1a3b4731752ecc47e0f448864f20f650" exitCode=0 Mar 20 08:46:47 crc kubenswrapper[4903]: I0320 08:46:47.465642 4903 generic.go:334] "Generic (PLEG): container finished" podID="ff89c989-6e0a-42bb-98b6-f1b7e0289604" containerID="0099c5fd8858bed38df7e47718d27798aabf6dca737fb1eba2406be69c453b90" exitCode=2 Mar 20 08:46:47 crc kubenswrapper[4903]: I0320 08:46:47.465658 4903 generic.go:334] "Generic (PLEG): container finished" podID="ff89c989-6e0a-42bb-98b6-f1b7e0289604" containerID="b801b65f5a48e40638a6375d4708c7a627f9d68aa9aaf2766e8b926fd2c07440" exitCode=0 Mar 20 08:46:47 crc kubenswrapper[4903]: I0320 08:46:47.464739 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ff89c989-6e0a-42bb-98b6-f1b7e0289604","Type":"ContainerDied","Data":"aa24af5b8127749cccf8645f0fc9671e1a3b4731752ecc47e0f448864f20f650"} Mar 20 08:46:47 crc kubenswrapper[4903]: I0320 08:46:47.465992 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ff89c989-6e0a-42bb-98b6-f1b7e0289604","Type":"ContainerDied","Data":"0099c5fd8858bed38df7e47718d27798aabf6dca737fb1eba2406be69c453b90"} Mar 20 08:46:47 crc kubenswrapper[4903]: I0320 08:46:47.466023 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ff89c989-6e0a-42bb-98b6-f1b7e0289604","Type":"ContainerDied","Data":"b801b65f5a48e40638a6375d4708c7a627f9d68aa9aaf2766e8b926fd2c07440"} Mar 20 08:46:47 crc kubenswrapper[4903]: I0320 08:46:47.609286 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 20 08:46:47 crc kubenswrapper[4903]: I0320 08:46:47.609350 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 20 08:46:47 crc kubenswrapper[4903]: I0320 08:46:47.656843 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 20 08:46:47 crc kubenswrapper[4903]: I0320 08:46:47.685281 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 20 08:46:48 crc kubenswrapper[4903]: I0320 08:46:48.449131 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 20 08:46:48 crc kubenswrapper[4903]: I0320 08:46:48.449828 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 20 08:46:48 crc kubenswrapper[4903]: I0320 08:46:48.491671 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 20 08:46:48 crc kubenswrapper[4903]: I0320 08:46:48.491724 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 20 08:46:50 crc kubenswrapper[4903]: I0320 08:46:50.696475 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 20 08:46:50 crc kubenswrapper[4903]: I0320 08:46:50.696910 4903 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 20 08:46:50 crc kubenswrapper[4903]: I0320 08:46:50.826604 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 20 08:46:52 crc kubenswrapper[4903]: I0320 08:46:52.472886 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-78d46df76d-rh79h" Mar 20 08:46:52 crc kubenswrapper[4903]: I0320 08:46:52.543254 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-sxx4b" event={"ID":"4bb8fa20-aceb-4b12-b104-e1594993b20c","Type":"ContainerStarted","Data":"dc0246b5033cba2412fe3761cbb94c3af09040b7c4ae5b202375dede12eaaf53"} Mar 20 08:46:52 crc kubenswrapper[4903]: I0320 08:46:52.549984 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-78d46df76d-rh79h" event={"ID":"edfd7894-cb6d-43bb-87ab-289c00d2a8f7","Type":"ContainerDied","Data":"0353a6ac7bf0dbb9ec86bef39422ab09771546f9a25a1b0e7d76b4e27cd00f31"} Mar 20 08:46:52 crc kubenswrapper[4903]: I0320 08:46:52.550060 4903 scope.go:117] "RemoveContainer" containerID="f03767fef05a731d00ae668183b5270fada9ddab13c73cdefd1db6d77b290014" Mar 20 08:46:52 crc kubenswrapper[4903]: I0320 08:46:52.550294 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-78d46df76d-rh79h" Mar 20 08:46:52 crc kubenswrapper[4903]: I0320 08:46:52.568747 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-sxx4b" podStartSLOduration=1.945969914 podStartE2EDuration="13.568728289s" podCreationTimestamp="2026-03-20 08:46:39 +0000 UTC" firstStartedPulling="2026-03-20 08:46:40.588910664 +0000 UTC m=+1425.805810979" lastFinishedPulling="2026-03-20 08:46:52.211669029 +0000 UTC m=+1437.428569354" observedRunningTime="2026-03-20 08:46:52.56706244 +0000 UTC m=+1437.783962745" watchObservedRunningTime="2026-03-20 08:46:52.568728289 +0000 UTC m=+1437.785628604" Mar 20 08:46:52 crc kubenswrapper[4903]: I0320 08:46:52.576299 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/edfd7894-cb6d-43bb-87ab-289c00d2a8f7-public-tls-certs\") pod \"edfd7894-cb6d-43bb-87ab-289c00d2a8f7\" (UID: \"edfd7894-cb6d-43bb-87ab-289c00d2a8f7\") " Mar 20 08:46:52 crc kubenswrapper[4903]: I0320 08:46:52.576428 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/edfd7894-cb6d-43bb-87ab-289c00d2a8f7-combined-ca-bundle\") pod \"edfd7894-cb6d-43bb-87ab-289c00d2a8f7\" (UID: \"edfd7894-cb6d-43bb-87ab-289c00d2a8f7\") " Mar 20 08:46:52 crc kubenswrapper[4903]: I0320 08:46:52.576544 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/edfd7894-cb6d-43bb-87ab-289c00d2a8f7-internal-tls-certs\") pod \"edfd7894-cb6d-43bb-87ab-289c00d2a8f7\" (UID: \"edfd7894-cb6d-43bb-87ab-289c00d2a8f7\") " Mar 20 08:46:52 crc kubenswrapper[4903]: I0320 08:46:52.576581 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/edfd7894-cb6d-43bb-87ab-289c00d2a8f7-logs\") pod \"edfd7894-cb6d-43bb-87ab-289c00d2a8f7\" (UID: \"edfd7894-cb6d-43bb-87ab-289c00d2a8f7\") " Mar 20 08:46:52 crc kubenswrapper[4903]: I0320 08:46:52.576860 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c4q9k\" (UniqueName: \"kubernetes.io/projected/edfd7894-cb6d-43bb-87ab-289c00d2a8f7-kube-api-access-c4q9k\") pod \"edfd7894-cb6d-43bb-87ab-289c00d2a8f7\" (UID: \"edfd7894-cb6d-43bb-87ab-289c00d2a8f7\") " Mar 20 08:46:52 crc kubenswrapper[4903]: I0320 08:46:52.576891 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/edfd7894-cb6d-43bb-87ab-289c00d2a8f7-scripts\") pod \"edfd7894-cb6d-43bb-87ab-289c00d2a8f7\" (UID: \"edfd7894-cb6d-43bb-87ab-289c00d2a8f7\") " Mar 20 08:46:52 crc kubenswrapper[4903]: I0320 08:46:52.576941 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/edfd7894-cb6d-43bb-87ab-289c00d2a8f7-config-data\") pod \"edfd7894-cb6d-43bb-87ab-289c00d2a8f7\" (UID: \"edfd7894-cb6d-43bb-87ab-289c00d2a8f7\") " Mar 20 08:46:52 crc kubenswrapper[4903]: I0320 08:46:52.577272 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/edfd7894-cb6d-43bb-87ab-289c00d2a8f7-logs" (OuterVolumeSpecName: "logs") pod "edfd7894-cb6d-43bb-87ab-289c00d2a8f7" (UID: "edfd7894-cb6d-43bb-87ab-289c00d2a8f7"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 08:46:52 crc kubenswrapper[4903]: I0320 08:46:52.577920 4903 scope.go:117] "RemoveContainer" containerID="cb4f4de67e2cde7b1eced841f79e2032becde10069a1a66d2e124e4fe96dfea5" Mar 20 08:46:52 crc kubenswrapper[4903]: I0320 08:46:52.577967 4903 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/edfd7894-cb6d-43bb-87ab-289c00d2a8f7-logs\") on node \"crc\" DevicePath \"\"" Mar 20 08:46:52 crc kubenswrapper[4903]: I0320 08:46:52.583258 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/edfd7894-cb6d-43bb-87ab-289c00d2a8f7-kube-api-access-c4q9k" (OuterVolumeSpecName: "kube-api-access-c4q9k") pod "edfd7894-cb6d-43bb-87ab-289c00d2a8f7" (UID: "edfd7894-cb6d-43bb-87ab-289c00d2a8f7"). InnerVolumeSpecName "kube-api-access-c4q9k". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:46:52 crc kubenswrapper[4903]: I0320 08:46:52.584345 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/edfd7894-cb6d-43bb-87ab-289c00d2a8f7-scripts" (OuterVolumeSpecName: "scripts") pod "edfd7894-cb6d-43bb-87ab-289c00d2a8f7" (UID: "edfd7894-cb6d-43bb-87ab-289c00d2a8f7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:46:52 crc kubenswrapper[4903]: I0320 08:46:52.647192 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/edfd7894-cb6d-43bb-87ab-289c00d2a8f7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "edfd7894-cb6d-43bb-87ab-289c00d2a8f7" (UID: "edfd7894-cb6d-43bb-87ab-289c00d2a8f7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:46:52 crc kubenswrapper[4903]: I0320 08:46:52.662766 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/edfd7894-cb6d-43bb-87ab-289c00d2a8f7-config-data" (OuterVolumeSpecName: "config-data") pod "edfd7894-cb6d-43bb-87ab-289c00d2a8f7" (UID: "edfd7894-cb6d-43bb-87ab-289c00d2a8f7"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:46:52 crc kubenswrapper[4903]: I0320 08:46:52.693329 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c4q9k\" (UniqueName: \"kubernetes.io/projected/edfd7894-cb6d-43bb-87ab-289c00d2a8f7-kube-api-access-c4q9k\") on node \"crc\" DevicePath \"\"" Mar 20 08:46:52 crc kubenswrapper[4903]: I0320 08:46:52.693364 4903 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/edfd7894-cb6d-43bb-87ab-289c00d2a8f7-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 08:46:52 crc kubenswrapper[4903]: I0320 08:46:52.693381 4903 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/edfd7894-cb6d-43bb-87ab-289c00d2a8f7-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 08:46:52 crc kubenswrapper[4903]: I0320 08:46:52.693393 4903 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/edfd7894-cb6d-43bb-87ab-289c00d2a8f7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 08:46:52 crc kubenswrapper[4903]: I0320 08:46:52.695088 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/edfd7894-cb6d-43bb-87ab-289c00d2a8f7-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "edfd7894-cb6d-43bb-87ab-289c00d2a8f7" (UID: "edfd7894-cb6d-43bb-87ab-289c00d2a8f7"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:46:52 crc kubenswrapper[4903]: I0320 08:46:52.695931 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/edfd7894-cb6d-43bb-87ab-289c00d2a8f7-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "edfd7894-cb6d-43bb-87ab-289c00d2a8f7" (UID: "edfd7894-cb6d-43bb-87ab-289c00d2a8f7"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:46:52 crc kubenswrapper[4903]: I0320 08:46:52.795563 4903 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/edfd7894-cb6d-43bb-87ab-289c00d2a8f7-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 08:46:52 crc kubenswrapper[4903]: I0320 08:46:52.795614 4903 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/edfd7894-cb6d-43bb-87ab-289c00d2a8f7-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 08:46:52 crc kubenswrapper[4903]: I0320 08:46:52.910617 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-78d46df76d-rh79h"] Mar 20 08:46:52 crc kubenswrapper[4903]: I0320 08:46:52.922479 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-78d46df76d-rh79h"] Mar 20 08:46:53 crc kubenswrapper[4903]: I0320 08:46:53.514833 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="edfd7894-cb6d-43bb-87ab-289c00d2a8f7" path="/var/lib/kubelet/pods/edfd7894-cb6d-43bb-87ab-289c00d2a8f7/volumes" Mar 20 08:46:54 crc kubenswrapper[4903]: I0320 08:46:54.579286 4903 generic.go:334] "Generic (PLEG): container finished" podID="ff89c989-6e0a-42bb-98b6-f1b7e0289604" containerID="5f1d0a1d934370382a9b94db03418abe63c879ba6699c7044da96e96f1689818" exitCode=0 Mar 20 08:46:54 crc kubenswrapper[4903]: I0320 08:46:54.579353 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ff89c989-6e0a-42bb-98b6-f1b7e0289604","Type":"ContainerDied","Data":"5f1d0a1d934370382a9b94db03418abe63c879ba6699c7044da96e96f1689818"} Mar 20 08:46:54 crc kubenswrapper[4903]: I0320 08:46:54.807934 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 08:46:54 crc kubenswrapper[4903]: I0320 08:46:54.945199 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff89c989-6e0a-42bb-98b6-f1b7e0289604-config-data\") pod \"ff89c989-6e0a-42bb-98b6-f1b7e0289604\" (UID: \"ff89c989-6e0a-42bb-98b6-f1b7e0289604\") " Mar 20 08:46:54 crc kubenswrapper[4903]: I0320 08:46:54.945311 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ff89c989-6e0a-42bb-98b6-f1b7e0289604-run-httpd\") pod \"ff89c989-6e0a-42bb-98b6-f1b7e0289604\" (UID: \"ff89c989-6e0a-42bb-98b6-f1b7e0289604\") " Mar 20 08:46:54 crc kubenswrapper[4903]: I0320 08:46:54.945366 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff89c989-6e0a-42bb-98b6-f1b7e0289604-combined-ca-bundle\") pod \"ff89c989-6e0a-42bb-98b6-f1b7e0289604\" (UID: \"ff89c989-6e0a-42bb-98b6-f1b7e0289604\") " Mar 20 08:46:54 crc kubenswrapper[4903]: I0320 08:46:54.945444 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff89c989-6e0a-42bb-98b6-f1b7e0289604-ceilometer-tls-certs\") pod \"ff89c989-6e0a-42bb-98b6-f1b7e0289604\" (UID: \"ff89c989-6e0a-42bb-98b6-f1b7e0289604\") " Mar 20 08:46:54 crc kubenswrapper[4903]: I0320 08:46:54.945515 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ff89c989-6e0a-42bb-98b6-f1b7e0289604-log-httpd\") pod \"ff89c989-6e0a-42bb-98b6-f1b7e0289604\" (UID: \"ff89c989-6e0a-42bb-98b6-f1b7e0289604\") " Mar 20 08:46:54 crc kubenswrapper[4903]: I0320 08:46:54.945557 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ff89c989-6e0a-42bb-98b6-f1b7e0289604-scripts\") pod \"ff89c989-6e0a-42bb-98b6-f1b7e0289604\" (UID: \"ff89c989-6e0a-42bb-98b6-f1b7e0289604\") " Mar 20 08:46:54 crc kubenswrapper[4903]: I0320 08:46:54.945696 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ff89c989-6e0a-42bb-98b6-f1b7e0289604-sg-core-conf-yaml\") pod \"ff89c989-6e0a-42bb-98b6-f1b7e0289604\" (UID: \"ff89c989-6e0a-42bb-98b6-f1b7e0289604\") " Mar 20 08:46:54 crc kubenswrapper[4903]: I0320 08:46:54.945749 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-txdxs\" (UniqueName: \"kubernetes.io/projected/ff89c989-6e0a-42bb-98b6-f1b7e0289604-kube-api-access-txdxs\") pod \"ff89c989-6e0a-42bb-98b6-f1b7e0289604\" (UID: \"ff89c989-6e0a-42bb-98b6-f1b7e0289604\") " Mar 20 08:46:54 crc kubenswrapper[4903]: I0320 08:46:54.945866 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ff89c989-6e0a-42bb-98b6-f1b7e0289604-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "ff89c989-6e0a-42bb-98b6-f1b7e0289604" (UID: "ff89c989-6e0a-42bb-98b6-f1b7e0289604"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 08:46:54 crc kubenswrapper[4903]: I0320 08:46:54.946144 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ff89c989-6e0a-42bb-98b6-f1b7e0289604-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "ff89c989-6e0a-42bb-98b6-f1b7e0289604" (UID: "ff89c989-6e0a-42bb-98b6-f1b7e0289604"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 08:46:54 crc kubenswrapper[4903]: I0320 08:46:54.946244 4903 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ff89c989-6e0a-42bb-98b6-f1b7e0289604-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 08:46:54 crc kubenswrapper[4903]: I0320 08:46:54.952158 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff89c989-6e0a-42bb-98b6-f1b7e0289604-scripts" (OuterVolumeSpecName: "scripts") pod "ff89c989-6e0a-42bb-98b6-f1b7e0289604" (UID: "ff89c989-6e0a-42bb-98b6-f1b7e0289604"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:46:54 crc kubenswrapper[4903]: I0320 08:46:54.952572 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff89c989-6e0a-42bb-98b6-f1b7e0289604-kube-api-access-txdxs" (OuterVolumeSpecName: "kube-api-access-txdxs") pod "ff89c989-6e0a-42bb-98b6-f1b7e0289604" (UID: "ff89c989-6e0a-42bb-98b6-f1b7e0289604"). InnerVolumeSpecName "kube-api-access-txdxs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:46:54 crc kubenswrapper[4903]: I0320 08:46:54.980780 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff89c989-6e0a-42bb-98b6-f1b7e0289604-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "ff89c989-6e0a-42bb-98b6-f1b7e0289604" (UID: "ff89c989-6e0a-42bb-98b6-f1b7e0289604"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:46:55 crc kubenswrapper[4903]: I0320 08:46:55.010657 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff89c989-6e0a-42bb-98b6-f1b7e0289604-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "ff89c989-6e0a-42bb-98b6-f1b7e0289604" (UID: "ff89c989-6e0a-42bb-98b6-f1b7e0289604"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:46:55 crc kubenswrapper[4903]: I0320 08:46:55.037803 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff89c989-6e0a-42bb-98b6-f1b7e0289604-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ff89c989-6e0a-42bb-98b6-f1b7e0289604" (UID: "ff89c989-6e0a-42bb-98b6-f1b7e0289604"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:46:55 crc kubenswrapper[4903]: I0320 08:46:55.048456 4903 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ff89c989-6e0a-42bb-98b6-f1b7e0289604-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 20 08:46:55 crc kubenswrapper[4903]: I0320 08:46:55.048496 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-txdxs\" (UniqueName: \"kubernetes.io/projected/ff89c989-6e0a-42bb-98b6-f1b7e0289604-kube-api-access-txdxs\") on node \"crc\" DevicePath \"\"" Mar 20 08:46:55 crc kubenswrapper[4903]: I0320 08:46:55.048509 4903 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff89c989-6e0a-42bb-98b6-f1b7e0289604-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 08:46:55 crc kubenswrapper[4903]: I0320 08:46:55.048517 4903 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff89c989-6e0a-42bb-98b6-f1b7e0289604-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 08:46:55 crc kubenswrapper[4903]: I0320 08:46:55.048527 4903 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ff89c989-6e0a-42bb-98b6-f1b7e0289604-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 08:46:55 crc kubenswrapper[4903]: I0320 08:46:55.048535 4903 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ff89c989-6e0a-42bb-98b6-f1b7e0289604-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 08:46:55 crc kubenswrapper[4903]: I0320 08:46:55.075483 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff89c989-6e0a-42bb-98b6-f1b7e0289604-config-data" (OuterVolumeSpecName: "config-data") pod "ff89c989-6e0a-42bb-98b6-f1b7e0289604" (UID: "ff89c989-6e0a-42bb-98b6-f1b7e0289604"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:46:55 crc kubenswrapper[4903]: I0320 08:46:55.149974 4903 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff89c989-6e0a-42bb-98b6-f1b7e0289604-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 08:46:55 crc kubenswrapper[4903]: I0320 08:46:55.593712 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ff89c989-6e0a-42bb-98b6-f1b7e0289604","Type":"ContainerDied","Data":"79bf7cb015ad041eec4b51c7ebb6ee03c794dd00024b9cb65a86ecbbbb23bf4d"} Mar 20 08:46:55 crc kubenswrapper[4903]: I0320 08:46:55.593869 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 08:46:55 crc kubenswrapper[4903]: I0320 08:46:55.594162 4903 scope.go:117] "RemoveContainer" containerID="aa24af5b8127749cccf8645f0fc9671e1a3b4731752ecc47e0f448864f20f650" Mar 20 08:46:55 crc kubenswrapper[4903]: I0320 08:46:55.646018 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 08:46:55 crc kubenswrapper[4903]: I0320 08:46:55.660607 4903 scope.go:117] "RemoveContainer" containerID="0099c5fd8858bed38df7e47718d27798aabf6dca737fb1eba2406be69c453b90" Mar 20 08:46:55 crc kubenswrapper[4903]: I0320 08:46:55.664385 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 20 08:46:55 crc kubenswrapper[4903]: I0320 08:46:55.708904 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 20 08:46:55 crc kubenswrapper[4903]: E0320 08:46:55.710177 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff89c989-6e0a-42bb-98b6-f1b7e0289604" containerName="ceilometer-notification-agent" Mar 20 08:46:55 crc kubenswrapper[4903]: I0320 08:46:55.710202 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff89c989-6e0a-42bb-98b6-f1b7e0289604" containerName="ceilometer-notification-agent" Mar 20 08:46:55 crc kubenswrapper[4903]: E0320 08:46:55.710228 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="edfd7894-cb6d-43bb-87ab-289c00d2a8f7" containerName="placement-log" Mar 20 08:46:55 crc kubenswrapper[4903]: I0320 08:46:55.710238 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="edfd7894-cb6d-43bb-87ab-289c00d2a8f7" containerName="placement-log" Mar 20 08:46:55 crc kubenswrapper[4903]: E0320 08:46:55.710264 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6453cb9a-76a4-412f-9cb8-964c20a217ca" containerName="neutron-api" Mar 20 08:46:55 crc kubenswrapper[4903]: I0320 08:46:55.710273 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="6453cb9a-76a4-412f-9cb8-964c20a217ca" containerName="neutron-api" Mar 20 08:46:55 crc kubenswrapper[4903]: E0320 08:46:55.710299 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="edfd7894-cb6d-43bb-87ab-289c00d2a8f7" containerName="placement-api" Mar 20 08:46:55 crc kubenswrapper[4903]: I0320 08:46:55.710309 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="edfd7894-cb6d-43bb-87ab-289c00d2a8f7" containerName="placement-api" Mar 20 08:46:55 crc kubenswrapper[4903]: E0320 08:46:55.710338 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff89c989-6e0a-42bb-98b6-f1b7e0289604" containerName="ceilometer-central-agent" Mar 20 08:46:55 crc kubenswrapper[4903]: I0320 08:46:55.710348 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff89c989-6e0a-42bb-98b6-f1b7e0289604" containerName="ceilometer-central-agent" Mar 20 08:46:55 crc kubenswrapper[4903]: E0320 08:46:55.710372 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6453cb9a-76a4-412f-9cb8-964c20a217ca" containerName="neutron-httpd" Mar 20 08:46:55 crc kubenswrapper[4903]: I0320 08:46:55.710380 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="6453cb9a-76a4-412f-9cb8-964c20a217ca" containerName="neutron-httpd" Mar 20 08:46:55 crc kubenswrapper[4903]: E0320 08:46:55.710393 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff89c989-6e0a-42bb-98b6-f1b7e0289604" containerName="sg-core" Mar 20 08:46:55 crc kubenswrapper[4903]: I0320 08:46:55.710403 4903 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="ff89c989-6e0a-42bb-98b6-f1b7e0289604" containerName="sg-core" Mar 20 08:46:55 crc kubenswrapper[4903]: E0320 08:46:55.710418 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff89c989-6e0a-42bb-98b6-f1b7e0289604" containerName="proxy-httpd" Mar 20 08:46:55 crc kubenswrapper[4903]: I0320 08:46:55.710426 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff89c989-6e0a-42bb-98b6-f1b7e0289604" containerName="proxy-httpd" Mar 20 08:46:55 crc kubenswrapper[4903]: I0320 08:46:55.710795 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff89c989-6e0a-42bb-98b6-f1b7e0289604" containerName="proxy-httpd" Mar 20 08:46:55 crc kubenswrapper[4903]: I0320 08:46:55.710822 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="edfd7894-cb6d-43bb-87ab-289c00d2a8f7" containerName="placement-api" Mar 20 08:46:55 crc kubenswrapper[4903]: I0320 08:46:55.710841 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff89c989-6e0a-42bb-98b6-f1b7e0289604" containerName="ceilometer-central-agent" Mar 20 08:46:55 crc kubenswrapper[4903]: I0320 08:46:55.710856 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="6453cb9a-76a4-412f-9cb8-964c20a217ca" containerName="neutron-httpd" Mar 20 08:46:55 crc kubenswrapper[4903]: I0320 08:46:55.710877 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff89c989-6e0a-42bb-98b6-f1b7e0289604" containerName="sg-core" Mar 20 08:46:55 crc kubenswrapper[4903]: I0320 08:46:55.710894 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff89c989-6e0a-42bb-98b6-f1b7e0289604" containerName="ceilometer-notification-agent" Mar 20 08:46:55 crc kubenswrapper[4903]: I0320 08:46:55.710906 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="6453cb9a-76a4-412f-9cb8-964c20a217ca" containerName="neutron-api" Mar 20 08:46:55 crc kubenswrapper[4903]: I0320 08:46:55.710928 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="edfd7894-cb6d-43bb-87ab-289c00d2a8f7" containerName="placement-log" Mar 20 08:46:55 crc kubenswrapper[4903]: I0320 08:46:55.714589 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 08:46:55 crc kubenswrapper[4903]: I0320 08:46:55.717529 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 20 08:46:55 crc kubenswrapper[4903]: I0320 08:46:55.717603 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 20 08:46:55 crc kubenswrapper[4903]: I0320 08:46:55.718435 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 20 08:46:55 crc kubenswrapper[4903]: I0320 08:46:55.730126 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 08:46:55 crc kubenswrapper[4903]: I0320 08:46:55.768015 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49906cb0-968b-403e-be6c-8c70d19149b1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"49906cb0-968b-403e-be6c-8c70d19149b1\") " pod="openstack/ceilometer-0" Mar 20 08:46:55 crc kubenswrapper[4903]: I0320 08:46:55.768101 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/49906cb0-968b-403e-be6c-8c70d19149b1-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"49906cb0-968b-403e-be6c-8c70d19149b1\") " pod="openstack/ceilometer-0" Mar 20 08:46:55 crc kubenswrapper[4903]: I0320 08:46:55.768149 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/49906cb0-968b-403e-be6c-8c70d19149b1-log-httpd\") pod \"ceilometer-0\" (UID: \"49906cb0-968b-403e-be6c-8c70d19149b1\") " pod="openstack/ceilometer-0" Mar 20 08:46:55 crc kubenswrapper[4903]: I0320 08:46:55.768335 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/49906cb0-968b-403e-be6c-8c70d19149b1-run-httpd\") pod \"ceilometer-0\" (UID: \"49906cb0-968b-403e-be6c-8c70d19149b1\") " pod="openstack/ceilometer-0" Mar 20 08:46:55 crc kubenswrapper[4903]: I0320 08:46:55.768428 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/49906cb0-968b-403e-be6c-8c70d19149b1-scripts\") pod \"ceilometer-0\" (UID: \"49906cb0-968b-403e-be6c-8c70d19149b1\") " pod="openstack/ceilometer-0" Mar 20 08:46:55 crc kubenswrapper[4903]: I0320 08:46:55.768503 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/49906cb0-968b-403e-be6c-8c70d19149b1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"49906cb0-968b-403e-be6c-8c70d19149b1\") " pod="openstack/ceilometer-0" Mar 20 08:46:55 crc kubenswrapper[4903]: I0320 08:46:55.768712 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49906cb0-968b-403e-be6c-8c70d19149b1-config-data\") pod \"ceilometer-0\" (UID: \"49906cb0-968b-403e-be6c-8c70d19149b1\") " pod="openstack/ceilometer-0" Mar 20 08:46:55 crc kubenswrapper[4903]: I0320 08:46:55.768855 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mql9b\" (UniqueName: 
\"kubernetes.io/projected/49906cb0-968b-403e-be6c-8c70d19149b1-kube-api-access-mql9b\") pod \"ceilometer-0\" (UID: \"49906cb0-968b-403e-be6c-8c70d19149b1\") " pod="openstack/ceilometer-0" Mar 20 08:46:55 crc kubenswrapper[4903]: I0320 08:46:55.871796 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/49906cb0-968b-403e-be6c-8c70d19149b1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"49906cb0-968b-403e-be6c-8c70d19149b1\") " pod="openstack/ceilometer-0" Mar 20 08:46:55 crc kubenswrapper[4903]: I0320 08:46:55.871866 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49906cb0-968b-403e-be6c-8c70d19149b1-config-data\") pod \"ceilometer-0\" (UID: \"49906cb0-968b-403e-be6c-8c70d19149b1\") " pod="openstack/ceilometer-0" Mar 20 08:46:55 crc kubenswrapper[4903]: I0320 08:46:55.871897 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mql9b\" (UniqueName: \"kubernetes.io/projected/49906cb0-968b-403e-be6c-8c70d19149b1-kube-api-access-mql9b\") pod \"ceilometer-0\" (UID: \"49906cb0-968b-403e-be6c-8c70d19149b1\") " pod="openstack/ceilometer-0" Mar 20 08:46:55 crc kubenswrapper[4903]: I0320 08:46:55.871936 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49906cb0-968b-403e-be6c-8c70d19149b1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"49906cb0-968b-403e-be6c-8c70d19149b1\") " pod="openstack/ceilometer-0" Mar 20 08:46:55 crc kubenswrapper[4903]: I0320 08:46:55.871966 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/49906cb0-968b-403e-be6c-8c70d19149b1-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"49906cb0-968b-403e-be6c-8c70d19149b1\") " pod="openstack/ceilometer-0" Mar 20 08:46:55 crc kubenswrapper[4903]: I0320 08:46:55.872000 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/49906cb0-968b-403e-be6c-8c70d19149b1-log-httpd\") pod \"ceilometer-0\" (UID: \"49906cb0-968b-403e-be6c-8c70d19149b1\") " pod="openstack/ceilometer-0" Mar 20 08:46:55 crc kubenswrapper[4903]: I0320 08:46:55.872047 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/49906cb0-968b-403e-be6c-8c70d19149b1-run-httpd\") pod \"ceilometer-0\" (UID: \"49906cb0-968b-403e-be6c-8c70d19149b1\") " pod="openstack/ceilometer-0" Mar 20 08:46:55 crc kubenswrapper[4903]: I0320 08:46:55.872068 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/49906cb0-968b-403e-be6c-8c70d19149b1-scripts\") pod \"ceilometer-0\" (UID: \"49906cb0-968b-403e-be6c-8c70d19149b1\") " pod="openstack/ceilometer-0" Mar 20 08:46:55 crc kubenswrapper[4903]: I0320 08:46:55.873385 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/49906cb0-968b-403e-be6c-8c70d19149b1-run-httpd\") pod \"ceilometer-0\" (UID: \"49906cb0-968b-403e-be6c-8c70d19149b1\") " pod="openstack/ceilometer-0" Mar 20 08:46:55 crc kubenswrapper[4903]: I0320 08:46:55.873649 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/49906cb0-968b-403e-be6c-8c70d19149b1-log-httpd\") pod \"ceilometer-0\" (UID: \"49906cb0-968b-403e-be6c-8c70d19149b1\") " pod="openstack/ceilometer-0" Mar 20 08:46:55 crc kubenswrapper[4903]: I0320 08:46:55.881020 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49906cb0-968b-403e-be6c-8c70d19149b1-config-data\") pod \"ceilometer-0\" (UID: \"49906cb0-968b-403e-be6c-8c70d19149b1\") " pod="openstack/ceilometer-0" Mar 20 08:46:55 crc kubenswrapper[4903]: I0320 08:46:55.883838 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/49906cb0-968b-403e-be6c-8c70d19149b1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"49906cb0-968b-403e-be6c-8c70d19149b1\") " pod="openstack/ceilometer-0" Mar 20 08:46:55 crc kubenswrapper[4903]: I0320 08:46:55.885079 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/49906cb0-968b-403e-be6c-8c70d19149b1-scripts\") pod \"ceilometer-0\" (UID: \"49906cb0-968b-403e-be6c-8c70d19149b1\") " pod="openstack/ceilometer-0" Mar 20 08:46:55 crc kubenswrapper[4903]: I0320 08:46:55.885234 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/49906cb0-968b-403e-be6c-8c70d19149b1-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"49906cb0-968b-403e-be6c-8c70d19149b1\") " pod="openstack/ceilometer-0" Mar 20 08:46:55 crc kubenswrapper[4903]: I0320 08:46:55.885322 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49906cb0-968b-403e-be6c-8c70d19149b1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"49906cb0-968b-403e-be6c-8c70d19149b1\") " pod="openstack/ceilometer-0" Mar 20 08:46:55 crc kubenswrapper[4903]: I0320 08:46:55.900781 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mql9b\" (UniqueName: \"kubernetes.io/projected/49906cb0-968b-403e-be6c-8c70d19149b1-kube-api-access-mql9b\") pod \"ceilometer-0\" (UID: \"49906cb0-968b-403e-be6c-8c70d19149b1\") " pod="openstack/ceilometer-0" Mar 20 08:46:56 crc kubenswrapper[4903]: I0320 08:46:56.044483 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 08:46:56 crc kubenswrapper[4903]: I0320 08:46:56.103591 4903 scope.go:117] "RemoveContainer" containerID="b801b65f5a48e40638a6375d4708c7a627f9d68aa9aaf2766e8b926fd2c07440" Mar 20 08:46:56 crc kubenswrapper[4903]: I0320 08:46:56.159138 4903 scope.go:117] "RemoveContainer" containerID="5f1d0a1d934370382a9b94db03418abe63c879ba6699c7044da96e96f1689818" Mar 20 08:46:56 crc kubenswrapper[4903]: W0320 08:46:56.669214 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod49906cb0_968b_403e_be6c_8c70d19149b1.slice/crio-be836dcefa03542138421b21f4075bc18f7f7843aeb89968f1ab3e6322310ef5 WatchSource:0}: Error finding container be836dcefa03542138421b21f4075bc18f7f7843aeb89968f1ab3e6322310ef5: Status 404 returned error can't find the container with id be836dcefa03542138421b21f4075bc18f7f7843aeb89968f1ab3e6322310ef5 Mar 20 08:46:56 crc kubenswrapper[4903]: I0320 08:46:56.678579 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 08:46:57 crc kubenswrapper[4903]: I0320 08:46:57.501992 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff89c989-6e0a-42bb-98b6-f1b7e0289604" path="/var/lib/kubelet/pods/ff89c989-6e0a-42bb-98b6-f1b7e0289604/volumes" Mar 20 08:46:57 crc kubenswrapper[4903]: I0320 08:46:57.615353 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"49906cb0-968b-403e-be6c-8c70d19149b1","Type":"ContainerStarted","Data":"fe2b934d226f8b0c4f2171f97f098b7e54c2fbb0b516781ded59427787e1f5a1"} Mar 20 08:46:57 crc kubenswrapper[4903]: I0320 08:46:57.615394 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"49906cb0-968b-403e-be6c-8c70d19149b1","Type":"ContainerStarted","Data":"be836dcefa03542138421b21f4075bc18f7f7843aeb89968f1ab3e6322310ef5"} Mar 20 08:46:58 crc kubenswrapper[4903]: I0320 08:46:58.026415 4903 scope.go:117] "RemoveContainer" containerID="a3f295ce1810316f363f8d39f44fa7b3cdd0c100d6a0e205aaefff5415b5eea8" Mar 20 08:46:58 crc kubenswrapper[4903]: I0320 08:46:58.628864 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"49906cb0-968b-403e-be6c-8c70d19149b1","Type":"ContainerStarted","Data":"f4ccdb0daad5518ec16ebef714bc1a3c7def637458eb65ff1b27be9e6647d62d"} Mar 20 08:46:59 crc kubenswrapper[4903]: I0320 08:46:59.642939 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"49906cb0-968b-403e-be6c-8c70d19149b1","Type":"ContainerStarted","Data":"ff92211852432bc51fb869a184777a9e8aa1cb92f3331f900b7c6d97fee53f0b"} Mar 20 08:47:02 crc kubenswrapper[4903]: I0320 08:47:02.691594 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"49906cb0-968b-403e-be6c-8c70d19149b1","Type":"ContainerStarted","Data":"dd8f5e06e7ee4c7922033c38fff6c1d798cc17941930fb8acc3887e1ed6ea8e4"} Mar 20 08:47:02 crc kubenswrapper[4903]: I0320 08:47:02.695725 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 20 08:47:02 crc kubenswrapper[4903]: I0320 08:47:02.721630 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.9397954840000002 podStartE2EDuration="7.721605667s" podCreationTimestamp="2026-03-20 08:46:55 +0000 UTC" firstStartedPulling="2026-03-20 08:46:56.671831516 +0000 UTC 
m=+1441.888731841" lastFinishedPulling="2026-03-20 08:47:01.453641699 +0000 UTC m=+1446.670542024" observedRunningTime="2026-03-20 08:47:02.719280513 +0000 UTC m=+1447.936180848" watchObservedRunningTime="2026-03-20 08:47:02.721605667 +0000 UTC m=+1447.938506002" Mar 20 08:47:04 crc kubenswrapper[4903]: I0320 08:47:04.720151 4903 generic.go:334] "Generic (PLEG): container finished" podID="4bb8fa20-aceb-4b12-b104-e1594993b20c" containerID="dc0246b5033cba2412fe3761cbb94c3af09040b7c4ae5b202375dede12eaaf53" exitCode=0 Mar 20 08:47:04 crc kubenswrapper[4903]: I0320 08:47:04.720218 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-sxx4b" event={"ID":"4bb8fa20-aceb-4b12-b104-e1594993b20c","Type":"ContainerDied","Data":"dc0246b5033cba2412fe3761cbb94c3af09040b7c4ae5b202375dede12eaaf53"} Mar 20 08:47:06 crc kubenswrapper[4903]: I0320 08:47:06.223541 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-sxx4b" Mar 20 08:47:06 crc kubenswrapper[4903]: I0320 08:47:06.340253 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4bb8fa20-aceb-4b12-b104-e1594993b20c-scripts\") pod \"4bb8fa20-aceb-4b12-b104-e1594993b20c\" (UID: \"4bb8fa20-aceb-4b12-b104-e1594993b20c\") " Mar 20 08:47:06 crc kubenswrapper[4903]: I0320 08:47:06.340373 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4bb8fa20-aceb-4b12-b104-e1594993b20c-config-data\") pod \"4bb8fa20-aceb-4b12-b104-e1594993b20c\" (UID: \"4bb8fa20-aceb-4b12-b104-e1594993b20c\") " Mar 20 08:47:06 crc kubenswrapper[4903]: I0320 08:47:06.340429 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rmmk5\" (UniqueName: \"kubernetes.io/projected/4bb8fa20-aceb-4b12-b104-e1594993b20c-kube-api-access-rmmk5\") pod \"4bb8fa20-aceb-4b12-b104-e1594993b20c\" (UID: \"4bb8fa20-aceb-4b12-b104-e1594993b20c\") " Mar 20 08:47:06 crc kubenswrapper[4903]: I0320 08:47:06.340549 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4bb8fa20-aceb-4b12-b104-e1594993b20c-combined-ca-bundle\") pod \"4bb8fa20-aceb-4b12-b104-e1594993b20c\" (UID: \"4bb8fa20-aceb-4b12-b104-e1594993b20c\") " Mar 20 08:47:06 crc kubenswrapper[4903]: I0320 08:47:06.350556 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb8fa20-aceb-4b12-b104-e1594993b20c-kube-api-access-rmmk5" (OuterVolumeSpecName: "kube-api-access-rmmk5") pod "4bb8fa20-aceb-4b12-b104-e1594993b20c" (UID: "4bb8fa20-aceb-4b12-b104-e1594993b20c"). InnerVolumeSpecName "kube-api-access-rmmk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:47:06 crc kubenswrapper[4903]: I0320 08:47:06.358513 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4bb8fa20-aceb-4b12-b104-e1594993b20c-scripts" (OuterVolumeSpecName: "scripts") pod "4bb8fa20-aceb-4b12-b104-e1594993b20c" (UID: "4bb8fa20-aceb-4b12-b104-e1594993b20c"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:47:06 crc kubenswrapper[4903]: I0320 08:47:06.374105 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4bb8fa20-aceb-4b12-b104-e1594993b20c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4bb8fa20-aceb-4b12-b104-e1594993b20c" (UID: "4bb8fa20-aceb-4b12-b104-e1594993b20c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:47:06 crc kubenswrapper[4903]: I0320 08:47:06.379149 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4bb8fa20-aceb-4b12-b104-e1594993b20c-config-data" (OuterVolumeSpecName: "config-data") pod "4bb8fa20-aceb-4b12-b104-e1594993b20c" (UID: "4bb8fa20-aceb-4b12-b104-e1594993b20c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:47:06 crc kubenswrapper[4903]: I0320 08:47:06.442960 4903 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4bb8fa20-aceb-4b12-b104-e1594993b20c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 08:47:06 crc kubenswrapper[4903]: I0320 08:47:06.443001 4903 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4bb8fa20-aceb-4b12-b104-e1594993b20c-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 08:47:06 crc kubenswrapper[4903]: I0320 08:47:06.443013 4903 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4bb8fa20-aceb-4b12-b104-e1594993b20c-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 08:47:06 crc kubenswrapper[4903]: I0320 08:47:06.443024 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rmmk5\" (UniqueName: \"kubernetes.io/projected/4bb8fa20-aceb-4b12-b104-e1594993b20c-kube-api-access-rmmk5\") on node \"crc\" DevicePath \"\"" Mar 20 08:47:06 crc kubenswrapper[4903]: I0320 08:47:06.744511 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-sxx4b" event={"ID":"4bb8fa20-aceb-4b12-b104-e1594993b20c","Type":"ContainerDied","Data":"c8c293b6abe5051bd8f0560c106394eacc6eb88a990972b00e89e128924e2787"} Mar 20 08:47:06 crc kubenswrapper[4903]: I0320 08:47:06.744555 4903 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c8c293b6abe5051bd8f0560c106394eacc6eb88a990972b00e89e128924e2787" Mar 20 08:47:06 crc kubenswrapper[4903]: I0320 08:47:06.744665 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-sxx4b" Mar 20 08:47:06 crc kubenswrapper[4903]: I0320 08:47:06.859893 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 20 08:47:06 crc kubenswrapper[4903]: E0320 08:47:06.860961 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4bb8fa20-aceb-4b12-b104-e1594993b20c" containerName="nova-cell0-conductor-db-sync" Mar 20 08:47:06 crc kubenswrapper[4903]: I0320 08:47:06.861094 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="4bb8fa20-aceb-4b12-b104-e1594993b20c" containerName="nova-cell0-conductor-db-sync" Mar 20 08:47:06 crc kubenswrapper[4903]: I0320 08:47:06.861789 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="4bb8fa20-aceb-4b12-b104-e1594993b20c" containerName="nova-cell0-conductor-db-sync" Mar 20 08:47:06 crc kubenswrapper[4903]: I0320 08:47:06.868723 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 20 08:47:06 crc kubenswrapper[4903]: I0320 08:47:06.871980 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Mar 20 08:47:06 crc kubenswrapper[4903]: I0320 08:47:06.875244 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-7wjch" Mar 20 08:47:06 crc kubenswrapper[4903]: I0320 08:47:06.902949 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 20 08:47:06 crc kubenswrapper[4903]: I0320 08:47:06.952272 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a0db460-181b-48cb-84dc-d4996e2280c2-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"7a0db460-181b-48cb-84dc-d4996e2280c2\") " pod="openstack/nova-cell0-conductor-0" Mar 20 08:47:06 crc kubenswrapper[4903]: I0320 08:47:06.952377 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5cjxr\" (UniqueName: \"kubernetes.io/projected/7a0db460-181b-48cb-84dc-d4996e2280c2-kube-api-access-5cjxr\") pod \"nova-cell0-conductor-0\" (UID: \"7a0db460-181b-48cb-84dc-d4996e2280c2\") " pod="openstack/nova-cell0-conductor-0" Mar 20 08:47:06 crc kubenswrapper[4903]: I0320 08:47:06.952408 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a0db460-181b-48cb-84dc-d4996e2280c2-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"7a0db460-181b-48cb-84dc-d4996e2280c2\") " pod="openstack/nova-cell0-conductor-0" Mar 20 08:47:07 crc kubenswrapper[4903]: I0320 08:47:07.053973 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a0db460-181b-48cb-84dc-d4996e2280c2-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"7a0db460-181b-48cb-84dc-d4996e2280c2\") " pod="openstack/nova-cell0-conductor-0" Mar 20 08:47:07 crc kubenswrapper[4903]: I0320 08:47:07.054122 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5cjxr\" (UniqueName: \"kubernetes.io/projected/7a0db460-181b-48cb-84dc-d4996e2280c2-kube-api-access-5cjxr\") pod \"nova-cell0-conductor-0\" (UID: \"7a0db460-181b-48cb-84dc-d4996e2280c2\") " pod="openstack/nova-cell0-conductor-0" Mar 20 08:47:07 crc kubenswrapper[4903]: 
I0320 08:47:07.054170 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a0db460-181b-48cb-84dc-d4996e2280c2-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"7a0db460-181b-48cb-84dc-d4996e2280c2\") " pod="openstack/nova-cell0-conductor-0" Mar 20 08:47:07 crc kubenswrapper[4903]: I0320 08:47:07.060223 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a0db460-181b-48cb-84dc-d4996e2280c2-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"7a0db460-181b-48cb-84dc-d4996e2280c2\") " pod="openstack/nova-cell0-conductor-0" Mar 20 08:47:07 crc kubenswrapper[4903]: I0320 08:47:07.068092 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a0db460-181b-48cb-84dc-d4996e2280c2-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"7a0db460-181b-48cb-84dc-d4996e2280c2\") " pod="openstack/nova-cell0-conductor-0" Mar 20 08:47:07 crc kubenswrapper[4903]: I0320 08:47:07.071898 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5cjxr\" (UniqueName: \"kubernetes.io/projected/7a0db460-181b-48cb-84dc-d4996e2280c2-kube-api-access-5cjxr\") pod \"nova-cell0-conductor-0\" (UID: \"7a0db460-181b-48cb-84dc-d4996e2280c2\") " pod="openstack/nova-cell0-conductor-0" Mar 20 08:47:07 crc kubenswrapper[4903]: I0320 08:47:07.205532 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 20 08:47:07 crc kubenswrapper[4903]: I0320 08:47:07.550138 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 20 08:47:07 crc kubenswrapper[4903]: I0320 08:47:07.703934 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 20 08:47:07 crc kubenswrapper[4903]: I0320 08:47:07.825168 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"7a0db460-181b-48cb-84dc-d4996e2280c2","Type":"ContainerStarted","Data":"97bbb38faa60be570fa67bf86bca83d765e69605fffb975aa303b718ccc2191f"} Mar 20 08:47:08 crc kubenswrapper[4903]: I0320 08:47:08.847426 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"7a0db460-181b-48cb-84dc-d4996e2280c2","Type":"ContainerStarted","Data":"06b7264263701089ff0811731615d7de0fc9a0fec45a45ab7f66ed14a32d1e55"} Mar 20 08:47:08 crc kubenswrapper[4903]: I0320 08:47:08.848082 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell0-conductor-0" podUID="7a0db460-181b-48cb-84dc-d4996e2280c2" containerName="nova-cell0-conductor-conductor" containerID="cri-o://06b7264263701089ff0811731615d7de0fc9a0fec45a45ab7f66ed14a32d1e55" gracePeriod=30 Mar 20 08:47:08 crc kubenswrapper[4903]: I0320 08:47:08.848489 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Mar 20 08:47:09 crc kubenswrapper[4903]: I0320 08:47:09.460721 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=3.460681368 podStartE2EDuration="3.460681368s" podCreationTimestamp="2026-03-20 08:47:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 
08:47:08.872127252 +0000 UTC m=+1454.089027757" watchObservedRunningTime="2026-03-20 08:47:09.460681368 +0000 UTC m=+1454.677581693" Mar 20 08:47:09 crc kubenswrapper[4903]: I0320 08:47:09.472909 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 08:47:09 crc kubenswrapper[4903]: I0320 08:47:09.475276 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="49906cb0-968b-403e-be6c-8c70d19149b1" containerName="ceilometer-central-agent" containerID="cri-o://fe2b934d226f8b0c4f2171f97f098b7e54c2fbb0b516781ded59427787e1f5a1" gracePeriod=30 Mar 20 08:47:09 crc kubenswrapper[4903]: I0320 08:47:09.475613 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="49906cb0-968b-403e-be6c-8c70d19149b1" containerName="sg-core" containerID="cri-o://ff92211852432bc51fb869a184777a9e8aa1cb92f3331f900b7c6d97fee53f0b" gracePeriod=30 Mar 20 08:47:09 crc kubenswrapper[4903]: I0320 08:47:09.475924 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="49906cb0-968b-403e-be6c-8c70d19149b1" containerName="ceilometer-notification-agent" containerID="cri-o://f4ccdb0daad5518ec16ebef714bc1a3c7def637458eb65ff1b27be9e6647d62d" gracePeriod=30 Mar 20 08:47:09 crc kubenswrapper[4903]: I0320 08:47:09.475931 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="49906cb0-968b-403e-be6c-8c70d19149b1" containerName="proxy-httpd" containerID="cri-o://dd8f5e06e7ee4c7922033c38fff6c1d798cc17941930fb8acc3887e1ed6ea8e4" gracePeriod=30 Mar 20 08:47:09 crc kubenswrapper[4903]: I0320 08:47:09.860389 4903 generic.go:334] "Generic (PLEG): container finished" podID="49906cb0-968b-403e-be6c-8c70d19149b1" containerID="dd8f5e06e7ee4c7922033c38fff6c1d798cc17941930fb8acc3887e1ed6ea8e4" exitCode=0 Mar 20 08:47:09 crc kubenswrapper[4903]: I0320 08:47:09.860455 4903 generic.go:334] "Generic (PLEG): container finished" podID="49906cb0-968b-403e-be6c-8c70d19149b1" containerID="ff92211852432bc51fb869a184777a9e8aa1cb92f3331f900b7c6d97fee53f0b" exitCode=2 Mar 20 08:47:09 crc kubenswrapper[4903]: I0320 08:47:09.860451 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"49906cb0-968b-403e-be6c-8c70d19149b1","Type":"ContainerDied","Data":"dd8f5e06e7ee4c7922033c38fff6c1d798cc17941930fb8acc3887e1ed6ea8e4"} Mar 20 08:47:09 crc kubenswrapper[4903]: I0320 08:47:09.860585 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"49906cb0-968b-403e-be6c-8c70d19149b1","Type":"ContainerDied","Data":"ff92211852432bc51fb869a184777a9e8aa1cb92f3331f900b7c6d97fee53f0b"} Mar 20 08:47:10 crc kubenswrapper[4903]: I0320 08:47:10.877448 4903 generic.go:334] "Generic (PLEG): container finished" podID="49906cb0-968b-403e-be6c-8c70d19149b1" containerID="fe2b934d226f8b0c4f2171f97f098b7e54c2fbb0b516781ded59427787e1f5a1" exitCode=0 Mar 20 08:47:10 crc kubenswrapper[4903]: I0320 08:47:10.877552 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"49906cb0-968b-403e-be6c-8c70d19149b1","Type":"ContainerDied","Data":"fe2b934d226f8b0c4f2171f97f098b7e54c2fbb0b516781ded59427787e1f5a1"} Mar 20 08:47:11 crc kubenswrapper[4903]: I0320 08:47:11.894424 4903 generic.go:334] "Generic (PLEG): container finished" podID="49906cb0-968b-403e-be6c-8c70d19149b1" 
containerID="f4ccdb0daad5518ec16ebef714bc1a3c7def637458eb65ff1b27be9e6647d62d" exitCode=0 Mar 20 08:47:11 crc kubenswrapper[4903]: I0320 08:47:11.894499 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"49906cb0-968b-403e-be6c-8c70d19149b1","Type":"ContainerDied","Data":"f4ccdb0daad5518ec16ebef714bc1a3c7def637458eb65ff1b27be9e6647d62d"} Mar 20 08:47:12 crc kubenswrapper[4903]: I0320 08:47:12.186146 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 08:47:12 crc kubenswrapper[4903]: I0320 08:47:12.288794 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/49906cb0-968b-403e-be6c-8c70d19149b1-scripts\") pod \"49906cb0-968b-403e-be6c-8c70d19149b1\" (UID: \"49906cb0-968b-403e-be6c-8c70d19149b1\") " Mar 20 08:47:12 crc kubenswrapper[4903]: I0320 08:47:12.288908 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/49906cb0-968b-403e-be6c-8c70d19149b1-ceilometer-tls-certs\") pod \"49906cb0-968b-403e-be6c-8c70d19149b1\" (UID: \"49906cb0-968b-403e-be6c-8c70d19149b1\") " Mar 20 08:47:12 crc kubenswrapper[4903]: I0320 08:47:12.289093 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/49906cb0-968b-403e-be6c-8c70d19149b1-run-httpd\") pod \"49906cb0-968b-403e-be6c-8c70d19149b1\" (UID: \"49906cb0-968b-403e-be6c-8c70d19149b1\") " Mar 20 08:47:12 crc kubenswrapper[4903]: I0320 08:47:12.289179 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/49906cb0-968b-403e-be6c-8c70d19149b1-log-httpd\") pod \"49906cb0-968b-403e-be6c-8c70d19149b1\" (UID: \"49906cb0-968b-403e-be6c-8c70d19149b1\") " Mar 20 08:47:12 crc kubenswrapper[4903]: I0320 08:47:12.289315 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49906cb0-968b-403e-be6c-8c70d19149b1-combined-ca-bundle\") pod \"49906cb0-968b-403e-be6c-8c70d19149b1\" (UID: \"49906cb0-968b-403e-be6c-8c70d19149b1\") " Mar 20 08:47:12 crc kubenswrapper[4903]: I0320 08:47:12.289393 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mql9b\" (UniqueName: \"kubernetes.io/projected/49906cb0-968b-403e-be6c-8c70d19149b1-kube-api-access-mql9b\") pod \"49906cb0-968b-403e-be6c-8c70d19149b1\" (UID: \"49906cb0-968b-403e-be6c-8c70d19149b1\") " Mar 20 08:47:12 crc kubenswrapper[4903]: I0320 08:47:12.289519 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/49906cb0-968b-403e-be6c-8c70d19149b1-sg-core-conf-yaml\") pod \"49906cb0-968b-403e-be6c-8c70d19149b1\" (UID: \"49906cb0-968b-403e-be6c-8c70d19149b1\") " Mar 20 08:47:12 crc kubenswrapper[4903]: I0320 08:47:12.289690 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49906cb0-968b-403e-be6c-8c70d19149b1-config-data\") pod \"49906cb0-968b-403e-be6c-8c70d19149b1\" (UID: \"49906cb0-968b-403e-be6c-8c70d19149b1\") " Mar 20 08:47:12 crc kubenswrapper[4903]: I0320 08:47:12.291117 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/49906cb0-968b-403e-be6c-8c70d19149b1-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "49906cb0-968b-403e-be6c-8c70d19149b1" (UID: "49906cb0-968b-403e-be6c-8c70d19149b1"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 08:47:12 crc kubenswrapper[4903]: I0320 08:47:12.291262 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/49906cb0-968b-403e-be6c-8c70d19149b1-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "49906cb0-968b-403e-be6c-8c70d19149b1" (UID: "49906cb0-968b-403e-be6c-8c70d19149b1"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 08:47:12 crc kubenswrapper[4903]: I0320 08:47:12.297312 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49906cb0-968b-403e-be6c-8c70d19149b1-kube-api-access-mql9b" (OuterVolumeSpecName: "kube-api-access-mql9b") pod "49906cb0-968b-403e-be6c-8c70d19149b1" (UID: "49906cb0-968b-403e-be6c-8c70d19149b1"). InnerVolumeSpecName "kube-api-access-mql9b". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:47:12 crc kubenswrapper[4903]: I0320 08:47:12.298343 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49906cb0-968b-403e-be6c-8c70d19149b1-scripts" (OuterVolumeSpecName: "scripts") pod "49906cb0-968b-403e-be6c-8c70d19149b1" (UID: "49906cb0-968b-403e-be6c-8c70d19149b1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:47:12 crc kubenswrapper[4903]: I0320 08:47:12.333153 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49906cb0-968b-403e-be6c-8c70d19149b1-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "49906cb0-968b-403e-be6c-8c70d19149b1" (UID: "49906cb0-968b-403e-be6c-8c70d19149b1"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:47:12 crc kubenswrapper[4903]: I0320 08:47:12.378979 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49906cb0-968b-403e-be6c-8c70d19149b1-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "49906cb0-968b-403e-be6c-8c70d19149b1" (UID: "49906cb0-968b-403e-be6c-8c70d19149b1"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:47:12 crc kubenswrapper[4903]: I0320 08:47:12.391849 4903 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/49906cb0-968b-403e-be6c-8c70d19149b1-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 08:47:12 crc kubenswrapper[4903]: I0320 08:47:12.391885 4903 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/49906cb0-968b-403e-be6c-8c70d19149b1-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 08:47:12 crc kubenswrapper[4903]: I0320 08:47:12.391897 4903 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/49906cb0-968b-403e-be6c-8c70d19149b1-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 08:47:12 crc kubenswrapper[4903]: I0320 08:47:12.391905 4903 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/49906cb0-968b-403e-be6c-8c70d19149b1-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 08:47:12 crc kubenswrapper[4903]: I0320 08:47:12.391968 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mql9b\" (UniqueName: \"kubernetes.io/projected/49906cb0-968b-403e-be6c-8c70d19149b1-kube-api-access-mql9b\") on node \"crc\" DevicePath \"\"" Mar 20 08:47:12 crc kubenswrapper[4903]: I0320 08:47:12.391981 4903 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/49906cb0-968b-403e-be6c-8c70d19149b1-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 20 08:47:12 crc kubenswrapper[4903]: I0320 08:47:12.411141 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49906cb0-968b-403e-be6c-8c70d19149b1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "49906cb0-968b-403e-be6c-8c70d19149b1" (UID: "49906cb0-968b-403e-be6c-8c70d19149b1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:47:12 crc kubenswrapper[4903]: I0320 08:47:12.425600 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49906cb0-968b-403e-be6c-8c70d19149b1-config-data" (OuterVolumeSpecName: "config-data") pod "49906cb0-968b-403e-be6c-8c70d19149b1" (UID: "49906cb0-968b-403e-be6c-8c70d19149b1"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:47:12 crc kubenswrapper[4903]: I0320 08:47:12.494284 4903 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49906cb0-968b-403e-be6c-8c70d19149b1-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 08:47:12 crc kubenswrapper[4903]: I0320 08:47:12.494338 4903 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49906cb0-968b-403e-be6c-8c70d19149b1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 08:47:12 crc kubenswrapper[4903]: I0320 08:47:12.905380 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"49906cb0-968b-403e-be6c-8c70d19149b1","Type":"ContainerDied","Data":"be836dcefa03542138421b21f4075bc18f7f7843aeb89968f1ab3e6322310ef5"} Mar 20 08:47:12 crc kubenswrapper[4903]: I0320 08:47:12.905660 4903 scope.go:117] "RemoveContainer" containerID="dd8f5e06e7ee4c7922033c38fff6c1d798cc17941930fb8acc3887e1ed6ea8e4" Mar 20 08:47:12 crc kubenswrapper[4903]: I0320 08:47:12.905469 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 08:47:12 crc kubenswrapper[4903]: I0320 08:47:12.926439 4903 scope.go:117] "RemoveContainer" containerID="ff92211852432bc51fb869a184777a9e8aa1cb92f3331f900b7c6d97fee53f0b" Mar 20 08:47:12 crc kubenswrapper[4903]: I0320 08:47:12.939296 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 08:47:12 crc kubenswrapper[4903]: I0320 08:47:12.945866 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 20 08:47:12 crc kubenswrapper[4903]: I0320 08:47:12.949560 4903 scope.go:117] "RemoveContainer" containerID="f4ccdb0daad5518ec16ebef714bc1a3c7def637458eb65ff1b27be9e6647d62d" Mar 20 08:47:12 crc kubenswrapper[4903]: I0320 08:47:12.969839 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 20 08:47:12 crc kubenswrapper[4903]: E0320 08:47:12.970315 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49906cb0-968b-403e-be6c-8c70d19149b1" containerName="ceilometer-central-agent" Mar 20 08:47:12 crc kubenswrapper[4903]: I0320 08:47:12.970346 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="49906cb0-968b-403e-be6c-8c70d19149b1" containerName="ceilometer-central-agent" Mar 20 08:47:12 crc kubenswrapper[4903]: E0320 08:47:12.970363 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49906cb0-968b-403e-be6c-8c70d19149b1" containerName="ceilometer-notification-agent" Mar 20 08:47:12 crc kubenswrapper[4903]: I0320 08:47:12.970372 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="49906cb0-968b-403e-be6c-8c70d19149b1" containerName="ceilometer-notification-agent" Mar 20 08:47:12 crc kubenswrapper[4903]: E0320 08:47:12.970391 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49906cb0-968b-403e-be6c-8c70d19149b1" containerName="proxy-httpd" Mar 20 08:47:12 crc kubenswrapper[4903]: I0320 08:47:12.970400 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="49906cb0-968b-403e-be6c-8c70d19149b1" containerName="proxy-httpd" Mar 20 08:47:12 crc kubenswrapper[4903]: E0320 08:47:12.970460 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49906cb0-968b-403e-be6c-8c70d19149b1" containerName="sg-core" Mar 20 08:47:12 crc kubenswrapper[4903]: I0320 08:47:12.970470 4903 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="49906cb0-968b-403e-be6c-8c70d19149b1" containerName="sg-core" Mar 20 08:47:12 crc kubenswrapper[4903]: I0320 08:47:12.970729 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="49906cb0-968b-403e-be6c-8c70d19149b1" containerName="proxy-httpd" Mar 20 08:47:12 crc kubenswrapper[4903]: I0320 08:47:12.970753 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="49906cb0-968b-403e-be6c-8c70d19149b1" containerName="ceilometer-central-agent" Mar 20 08:47:12 crc kubenswrapper[4903]: I0320 08:47:12.970771 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="49906cb0-968b-403e-be6c-8c70d19149b1" containerName="ceilometer-notification-agent" Mar 20 08:47:12 crc kubenswrapper[4903]: I0320 08:47:12.970785 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="49906cb0-968b-403e-be6c-8c70d19149b1" containerName="sg-core" Mar 20 08:47:12 crc kubenswrapper[4903]: I0320 08:47:12.983926 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 08:47:12 crc kubenswrapper[4903]: I0320 08:47:12.988749 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 20 08:47:12 crc kubenswrapper[4903]: I0320 08:47:12.989011 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 20 08:47:12 crc kubenswrapper[4903]: I0320 08:47:12.989255 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 20 08:47:13 crc kubenswrapper[4903]: I0320 08:47:13.016709 4903 scope.go:117] "RemoveContainer" containerID="fe2b934d226f8b0c4f2171f97f098b7e54c2fbb0b516781ded59427787e1f5a1" Mar 20 08:47:13 crc kubenswrapper[4903]: I0320 08:47:13.022002 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 08:47:13 crc kubenswrapper[4903]: I0320 08:47:13.124214 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5328486f-b5ae-4da3-85f6-b70555303408-run-httpd\") pod \"ceilometer-0\" (UID: \"5328486f-b5ae-4da3-85f6-b70555303408\") " pod="openstack/ceilometer-0" Mar 20 08:47:13 crc kubenswrapper[4903]: I0320 08:47:13.124291 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5328486f-b5ae-4da3-85f6-b70555303408-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5328486f-b5ae-4da3-85f6-b70555303408\") " pod="openstack/ceilometer-0" Mar 20 08:47:13 crc kubenswrapper[4903]: I0320 08:47:13.124335 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5328486f-b5ae-4da3-85f6-b70555303408-scripts\") pod \"ceilometer-0\" (UID: \"5328486f-b5ae-4da3-85f6-b70555303408\") " pod="openstack/ceilometer-0" Mar 20 08:47:13 crc kubenswrapper[4903]: I0320 08:47:13.124441 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5328486f-b5ae-4da3-85f6-b70555303408-log-httpd\") pod \"ceilometer-0\" (UID: \"5328486f-b5ae-4da3-85f6-b70555303408\") " pod="openstack/ceilometer-0" Mar 20 08:47:13 crc kubenswrapper[4903]: I0320 08:47:13.124484 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/5328486f-b5ae-4da3-85f6-b70555303408-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"5328486f-b5ae-4da3-85f6-b70555303408\") " pod="openstack/ceilometer-0" Mar 20 08:47:13 crc kubenswrapper[4903]: I0320 08:47:13.124538 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5328486f-b5ae-4da3-85f6-b70555303408-config-data\") pod \"ceilometer-0\" (UID: \"5328486f-b5ae-4da3-85f6-b70555303408\") " pod="openstack/ceilometer-0" Mar 20 08:47:13 crc kubenswrapper[4903]: I0320 08:47:13.124565 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5328486f-b5ae-4da3-85f6-b70555303408-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5328486f-b5ae-4da3-85f6-b70555303408\") " pod="openstack/ceilometer-0" Mar 20 08:47:13 crc kubenswrapper[4903]: I0320 08:47:13.124686 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wwcdc\" (UniqueName: \"kubernetes.io/projected/5328486f-b5ae-4da3-85f6-b70555303408-kube-api-access-wwcdc\") pod \"ceilometer-0\" (UID: \"5328486f-b5ae-4da3-85f6-b70555303408\") " pod="openstack/ceilometer-0" Mar 20 08:47:13 crc kubenswrapper[4903]: I0320 08:47:13.226193 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5328486f-b5ae-4da3-85f6-b70555303408-log-httpd\") pod \"ceilometer-0\" (UID: \"5328486f-b5ae-4da3-85f6-b70555303408\") " pod="openstack/ceilometer-0" Mar 20 08:47:13 crc kubenswrapper[4903]: I0320 08:47:13.226287 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/5328486f-b5ae-4da3-85f6-b70555303408-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"5328486f-b5ae-4da3-85f6-b70555303408\") " pod="openstack/ceilometer-0" Mar 20 08:47:13 crc kubenswrapper[4903]: I0320 08:47:13.226367 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5328486f-b5ae-4da3-85f6-b70555303408-config-data\") pod \"ceilometer-0\" (UID: \"5328486f-b5ae-4da3-85f6-b70555303408\") " pod="openstack/ceilometer-0" Mar 20 08:47:13 crc kubenswrapper[4903]: I0320 08:47:13.226402 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5328486f-b5ae-4da3-85f6-b70555303408-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5328486f-b5ae-4da3-85f6-b70555303408\") " pod="openstack/ceilometer-0" Mar 20 08:47:13 crc kubenswrapper[4903]: I0320 08:47:13.226534 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wwcdc\" (UniqueName: \"kubernetes.io/projected/5328486f-b5ae-4da3-85f6-b70555303408-kube-api-access-wwcdc\") pod \"ceilometer-0\" (UID: \"5328486f-b5ae-4da3-85f6-b70555303408\") " pod="openstack/ceilometer-0" Mar 20 08:47:13 crc kubenswrapper[4903]: I0320 08:47:13.226574 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5328486f-b5ae-4da3-85f6-b70555303408-run-httpd\") pod \"ceilometer-0\" (UID: \"5328486f-b5ae-4da3-85f6-b70555303408\") " pod="openstack/ceilometer-0" Mar 20 08:47:13 crc kubenswrapper[4903]: I0320 
08:47:13.226618 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5328486f-b5ae-4da3-85f6-b70555303408-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5328486f-b5ae-4da3-85f6-b70555303408\") " pod="openstack/ceilometer-0" Mar 20 08:47:13 crc kubenswrapper[4903]: I0320 08:47:13.226648 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5328486f-b5ae-4da3-85f6-b70555303408-scripts\") pod \"ceilometer-0\" (UID: \"5328486f-b5ae-4da3-85f6-b70555303408\") " pod="openstack/ceilometer-0" Mar 20 08:47:13 crc kubenswrapper[4903]: I0320 08:47:13.227984 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5328486f-b5ae-4da3-85f6-b70555303408-log-httpd\") pod \"ceilometer-0\" (UID: \"5328486f-b5ae-4da3-85f6-b70555303408\") " pod="openstack/ceilometer-0" Mar 20 08:47:13 crc kubenswrapper[4903]: I0320 08:47:13.228795 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5328486f-b5ae-4da3-85f6-b70555303408-run-httpd\") pod \"ceilometer-0\" (UID: \"5328486f-b5ae-4da3-85f6-b70555303408\") " pod="openstack/ceilometer-0" Mar 20 08:47:13 crc kubenswrapper[4903]: I0320 08:47:13.233201 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5328486f-b5ae-4da3-85f6-b70555303408-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5328486f-b5ae-4da3-85f6-b70555303408\") " pod="openstack/ceilometer-0" Mar 20 08:47:13 crc kubenswrapper[4903]: I0320 08:47:13.236219 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5328486f-b5ae-4da3-85f6-b70555303408-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5328486f-b5ae-4da3-85f6-b70555303408\") " pod="openstack/ceilometer-0" Mar 20 08:47:13 crc kubenswrapper[4903]: I0320 08:47:13.236719 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5328486f-b5ae-4da3-85f6-b70555303408-scripts\") pod \"ceilometer-0\" (UID: \"5328486f-b5ae-4da3-85f6-b70555303408\") " pod="openstack/ceilometer-0" Mar 20 08:47:13 crc kubenswrapper[4903]: I0320 08:47:13.237107 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/5328486f-b5ae-4da3-85f6-b70555303408-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"5328486f-b5ae-4da3-85f6-b70555303408\") " pod="openstack/ceilometer-0" Mar 20 08:47:13 crc kubenswrapper[4903]: I0320 08:47:13.247561 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5328486f-b5ae-4da3-85f6-b70555303408-config-data\") pod \"ceilometer-0\" (UID: \"5328486f-b5ae-4da3-85f6-b70555303408\") " pod="openstack/ceilometer-0" Mar 20 08:47:13 crc kubenswrapper[4903]: I0320 08:47:13.249603 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wwcdc\" (UniqueName: \"kubernetes.io/projected/5328486f-b5ae-4da3-85f6-b70555303408-kube-api-access-wwcdc\") pod \"ceilometer-0\" (UID: \"5328486f-b5ae-4da3-85f6-b70555303408\") " pod="openstack/ceilometer-0" Mar 20 08:47:13 crc kubenswrapper[4903]: I0320 08:47:13.315336 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 08:47:13 crc kubenswrapper[4903]: I0320 08:47:13.514857 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49906cb0-968b-403e-be6c-8c70d19149b1" path="/var/lib/kubelet/pods/49906cb0-968b-403e-be6c-8c70d19149b1/volumes" Mar 20 08:47:13 crc kubenswrapper[4903]: I0320 08:47:13.849885 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 08:47:13 crc kubenswrapper[4903]: W0320 08:47:13.853819 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5328486f_b5ae_4da3_85f6_b70555303408.slice/crio-b6f231603f05e477332c08bb45403cdf0cc95348973c80d198c9af0e061fd9cd WatchSource:0}: Error finding container b6f231603f05e477332c08bb45403cdf0cc95348973c80d198c9af0e061fd9cd: Status 404 returned error can't find the container with id b6f231603f05e477332c08bb45403cdf0cc95348973c80d198c9af0e061fd9cd Mar 20 08:47:13 crc kubenswrapper[4903]: I0320 08:47:13.919401 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5328486f-b5ae-4da3-85f6-b70555303408","Type":"ContainerStarted","Data":"b6f231603f05e477332c08bb45403cdf0cc95348973c80d198c9af0e061fd9cd"} Mar 20 08:47:14 crc kubenswrapper[4903]: I0320 08:47:14.933959 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5328486f-b5ae-4da3-85f6-b70555303408","Type":"ContainerStarted","Data":"e4a638edf350fe1afb3397fdebaae0ebb3dfe9a1ee679101ef18e91611509275"} Mar 20 08:47:15 crc kubenswrapper[4903]: I0320 08:47:15.953557 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5328486f-b5ae-4da3-85f6-b70555303408","Type":"ContainerStarted","Data":"9370ac6e5ebdc5f0f3527fc8efc56386d4c6b5f76e0743a0ed27251b010b544e"} Mar 20 08:47:16 crc kubenswrapper[4903]: I0320 08:47:16.968785 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5328486f-b5ae-4da3-85f6-b70555303408","Type":"ContainerStarted","Data":"3689fc59ace57fbe9423679424b534f88df2ca91f53220cb1de5ed5e1229be65"} Mar 20 08:47:17 crc kubenswrapper[4903]: E0320 08:47:17.208370 4903 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="06b7264263701089ff0811731615d7de0fc9a0fec45a45ab7f66ed14a32d1e55" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 20 08:47:17 crc kubenswrapper[4903]: E0320 08:47:17.210287 4903 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="06b7264263701089ff0811731615d7de0fc9a0fec45a45ab7f66ed14a32d1e55" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 20 08:47:17 crc kubenswrapper[4903]: E0320 08:47:17.211786 4903 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="06b7264263701089ff0811731615d7de0fc9a0fec45a45ab7f66ed14a32d1e55" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 20 08:47:17 crc kubenswrapper[4903]: E0320 08:47:17.211833 4903 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: 
container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="7a0db460-181b-48cb-84dc-d4996e2280c2" containerName="nova-cell0-conductor-conductor" Mar 20 08:47:20 crc kubenswrapper[4903]: I0320 08:47:20.001556 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5328486f-b5ae-4da3-85f6-b70555303408","Type":"ContainerStarted","Data":"fdc15e675640090ef1b1e1d826699d0ca27ed7e7f1c8b54cc3b5e18791c18d43"} Mar 20 08:47:20 crc kubenswrapper[4903]: I0320 08:47:20.002195 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 20 08:47:20 crc kubenswrapper[4903]: I0320 08:47:20.025503 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.055474901 podStartE2EDuration="8.025478912s" podCreationTimestamp="2026-03-20 08:47:12 +0000 UTC" firstStartedPulling="2026-03-20 08:47:13.856185391 +0000 UTC m=+1459.073085706" lastFinishedPulling="2026-03-20 08:47:18.826189402 +0000 UTC m=+1464.043089717" observedRunningTime="2026-03-20 08:47:20.02235285 +0000 UTC m=+1465.239253195" watchObservedRunningTime="2026-03-20 08:47:20.025478912 +0000 UTC m=+1465.242379257" Mar 20 08:47:22 crc kubenswrapper[4903]: E0320 08:47:22.208780 4903 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="06b7264263701089ff0811731615d7de0fc9a0fec45a45ab7f66ed14a32d1e55" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 20 08:47:22 crc kubenswrapper[4903]: E0320 08:47:22.211003 4903 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="06b7264263701089ff0811731615d7de0fc9a0fec45a45ab7f66ed14a32d1e55" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 20 08:47:22 crc kubenswrapper[4903]: E0320 08:47:22.212317 4903 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="06b7264263701089ff0811731615d7de0fc9a0fec45a45ab7f66ed14a32d1e55" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 20 08:47:22 crc kubenswrapper[4903]: E0320 08:47:22.212363 4903 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="7a0db460-181b-48cb-84dc-d4996e2280c2" containerName="nova-cell0-conductor-conductor" Mar 20 08:47:27 crc kubenswrapper[4903]: E0320 08:47:27.208919 4903 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="06b7264263701089ff0811731615d7de0fc9a0fec45a45ab7f66ed14a32d1e55" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 20 08:47:27 crc kubenswrapper[4903]: E0320 08:47:27.211455 4903 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" 
containerID="06b7264263701089ff0811731615d7de0fc9a0fec45a45ab7f66ed14a32d1e55" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 20 08:47:27 crc kubenswrapper[4903]: E0320 08:47:27.214544 4903 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="06b7264263701089ff0811731615d7de0fc9a0fec45a45ab7f66ed14a32d1e55" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 20 08:47:27 crc kubenswrapper[4903]: E0320 08:47:27.214588 4903 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="7a0db460-181b-48cb-84dc-d4996e2280c2" containerName="nova-cell0-conductor-conductor" Mar 20 08:47:32 crc kubenswrapper[4903]: E0320 08:47:32.209486 4903 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="06b7264263701089ff0811731615d7de0fc9a0fec45a45ab7f66ed14a32d1e55" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 20 08:47:32 crc kubenswrapper[4903]: E0320 08:47:32.212157 4903 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="06b7264263701089ff0811731615d7de0fc9a0fec45a45ab7f66ed14a32d1e55" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 20 08:47:32 crc kubenswrapper[4903]: E0320 08:47:32.214274 4903 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="06b7264263701089ff0811731615d7de0fc9a0fec45a45ab7f66ed14a32d1e55" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 20 08:47:32 crc kubenswrapper[4903]: E0320 08:47:32.214378 4903 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="7a0db460-181b-48cb-84dc-d4996e2280c2" containerName="nova-cell0-conductor-conductor" Mar 20 08:47:37 crc kubenswrapper[4903]: E0320 08:47:37.209235 4903 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="06b7264263701089ff0811731615d7de0fc9a0fec45a45ab7f66ed14a32d1e55" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 20 08:47:37 crc kubenswrapper[4903]: E0320 08:47:37.212227 4903 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="06b7264263701089ff0811731615d7de0fc9a0fec45a45ab7f66ed14a32d1e55" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 20 08:47:37 crc kubenswrapper[4903]: E0320 08:47:37.214008 4903 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" 
containerID="06b7264263701089ff0811731615d7de0fc9a0fec45a45ab7f66ed14a32d1e55" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 20 08:47:37 crc kubenswrapper[4903]: E0320 08:47:37.214083 4903 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="7a0db460-181b-48cb-84dc-d4996e2280c2" containerName="nova-cell0-conductor-conductor" Mar 20 08:47:39 crc kubenswrapper[4903]: I0320 08:47:39.261518 4903 generic.go:334] "Generic (PLEG): container finished" podID="7a0db460-181b-48cb-84dc-d4996e2280c2" containerID="06b7264263701089ff0811731615d7de0fc9a0fec45a45ab7f66ed14a32d1e55" exitCode=137 Mar 20 08:47:39 crc kubenswrapper[4903]: I0320 08:47:39.261836 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"7a0db460-181b-48cb-84dc-d4996e2280c2","Type":"ContainerDied","Data":"06b7264263701089ff0811731615d7de0fc9a0fec45a45ab7f66ed14a32d1e55"} Mar 20 08:47:39 crc kubenswrapper[4903]: I0320 08:47:39.336784 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 20 08:47:39 crc kubenswrapper[4903]: I0320 08:47:39.544516 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a0db460-181b-48cb-84dc-d4996e2280c2-combined-ca-bundle\") pod \"7a0db460-181b-48cb-84dc-d4996e2280c2\" (UID: \"7a0db460-181b-48cb-84dc-d4996e2280c2\") " Mar 20 08:47:39 crc kubenswrapper[4903]: I0320 08:47:39.544624 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a0db460-181b-48cb-84dc-d4996e2280c2-config-data\") pod \"7a0db460-181b-48cb-84dc-d4996e2280c2\" (UID: \"7a0db460-181b-48cb-84dc-d4996e2280c2\") " Mar 20 08:47:39 crc kubenswrapper[4903]: I0320 08:47:39.544713 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5cjxr\" (UniqueName: \"kubernetes.io/projected/7a0db460-181b-48cb-84dc-d4996e2280c2-kube-api-access-5cjxr\") pod \"7a0db460-181b-48cb-84dc-d4996e2280c2\" (UID: \"7a0db460-181b-48cb-84dc-d4996e2280c2\") " Mar 20 08:47:39 crc kubenswrapper[4903]: I0320 08:47:39.571189 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a0db460-181b-48cb-84dc-d4996e2280c2-kube-api-access-5cjxr" (OuterVolumeSpecName: "kube-api-access-5cjxr") pod "7a0db460-181b-48cb-84dc-d4996e2280c2" (UID: "7a0db460-181b-48cb-84dc-d4996e2280c2"). InnerVolumeSpecName "kube-api-access-5cjxr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:47:39 crc kubenswrapper[4903]: I0320 08:47:39.584478 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a0db460-181b-48cb-84dc-d4996e2280c2-config-data" (OuterVolumeSpecName: "config-data") pod "7a0db460-181b-48cb-84dc-d4996e2280c2" (UID: "7a0db460-181b-48cb-84dc-d4996e2280c2"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:47:39 crc kubenswrapper[4903]: I0320 08:47:39.601884 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a0db460-181b-48cb-84dc-d4996e2280c2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7a0db460-181b-48cb-84dc-d4996e2280c2" (UID: "7a0db460-181b-48cb-84dc-d4996e2280c2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:47:39 crc kubenswrapper[4903]: I0320 08:47:39.649095 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5cjxr\" (UniqueName: \"kubernetes.io/projected/7a0db460-181b-48cb-84dc-d4996e2280c2-kube-api-access-5cjxr\") on node \"crc\" DevicePath \"\"" Mar 20 08:47:39 crc kubenswrapper[4903]: I0320 08:47:39.649333 4903 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a0db460-181b-48cb-84dc-d4996e2280c2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 08:47:39 crc kubenswrapper[4903]: I0320 08:47:39.649434 4903 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a0db460-181b-48cb-84dc-d4996e2280c2-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 08:47:40 crc kubenswrapper[4903]: I0320 08:47:40.279184 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"7a0db460-181b-48cb-84dc-d4996e2280c2","Type":"ContainerDied","Data":"97bbb38faa60be570fa67bf86bca83d765e69605fffb975aa303b718ccc2191f"} Mar 20 08:47:40 crc kubenswrapper[4903]: I0320 08:47:40.279503 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 20 08:47:40 crc kubenswrapper[4903]: I0320 08:47:40.279904 4903 scope.go:117] "RemoveContainer" containerID="06b7264263701089ff0811731615d7de0fc9a0fec45a45ab7f66ed14a32d1e55" Mar 20 08:47:40 crc kubenswrapper[4903]: I0320 08:47:40.341773 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 20 08:47:40 crc kubenswrapper[4903]: I0320 08:47:40.362074 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 20 08:47:40 crc kubenswrapper[4903]: I0320 08:47:40.376636 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 20 08:47:40 crc kubenswrapper[4903]: E0320 08:47:40.377208 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a0db460-181b-48cb-84dc-d4996e2280c2" containerName="nova-cell0-conductor-conductor" Mar 20 08:47:40 crc kubenswrapper[4903]: I0320 08:47:40.377243 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a0db460-181b-48cb-84dc-d4996e2280c2" containerName="nova-cell0-conductor-conductor" Mar 20 08:47:40 crc kubenswrapper[4903]: I0320 08:47:40.377589 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a0db460-181b-48cb-84dc-d4996e2280c2" containerName="nova-cell0-conductor-conductor" Mar 20 08:47:40 crc kubenswrapper[4903]: I0320 08:47:40.378541 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 20 08:47:40 crc kubenswrapper[4903]: I0320 08:47:40.382471 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-7wjch" Mar 20 08:47:40 crc kubenswrapper[4903]: I0320 08:47:40.400855 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 20 08:47:40 crc kubenswrapper[4903]: I0320 08:47:40.402379 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Mar 20 08:47:40 crc kubenswrapper[4903]: I0320 08:47:40.570180 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d570ab6f-6c5f-4255-b2ae-1966da262a0d-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"d570ab6f-6c5f-4255-b2ae-1966da262a0d\") " pod="openstack/nova-cell0-conductor-0" Mar 20 08:47:40 crc kubenswrapper[4903]: I0320 08:47:40.570728 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d570ab6f-6c5f-4255-b2ae-1966da262a0d-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"d570ab6f-6c5f-4255-b2ae-1966da262a0d\") " pod="openstack/nova-cell0-conductor-0" Mar 20 08:47:40 crc kubenswrapper[4903]: I0320 08:47:40.570842 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ntdfg\" (UniqueName: \"kubernetes.io/projected/d570ab6f-6c5f-4255-b2ae-1966da262a0d-kube-api-access-ntdfg\") pod \"nova-cell0-conductor-0\" (UID: \"d570ab6f-6c5f-4255-b2ae-1966da262a0d\") " pod="openstack/nova-cell0-conductor-0" Mar 20 08:47:40 crc kubenswrapper[4903]: I0320 08:47:40.673229 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d570ab6f-6c5f-4255-b2ae-1966da262a0d-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"d570ab6f-6c5f-4255-b2ae-1966da262a0d\") " pod="openstack/nova-cell0-conductor-0" Mar 20 08:47:40 crc kubenswrapper[4903]: I0320 08:47:40.673321 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ntdfg\" (UniqueName: \"kubernetes.io/projected/d570ab6f-6c5f-4255-b2ae-1966da262a0d-kube-api-access-ntdfg\") pod \"nova-cell0-conductor-0\" (UID: \"d570ab6f-6c5f-4255-b2ae-1966da262a0d\") " pod="openstack/nova-cell0-conductor-0" Mar 20 08:47:40 crc kubenswrapper[4903]: I0320 08:47:40.673612 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d570ab6f-6c5f-4255-b2ae-1966da262a0d-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"d570ab6f-6c5f-4255-b2ae-1966da262a0d\") " pod="openstack/nova-cell0-conductor-0" Mar 20 08:47:40 crc kubenswrapper[4903]: I0320 08:47:40.680800 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d570ab6f-6c5f-4255-b2ae-1966da262a0d-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"d570ab6f-6c5f-4255-b2ae-1966da262a0d\") " pod="openstack/nova-cell0-conductor-0" Mar 20 08:47:40 crc kubenswrapper[4903]: I0320 08:47:40.681863 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d570ab6f-6c5f-4255-b2ae-1966da262a0d-config-data\") pod \"nova-cell0-conductor-0\" 
(UID: \"d570ab6f-6c5f-4255-b2ae-1966da262a0d\") " pod="openstack/nova-cell0-conductor-0" Mar 20 08:47:40 crc kubenswrapper[4903]: I0320 08:47:40.697575 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ntdfg\" (UniqueName: \"kubernetes.io/projected/d570ab6f-6c5f-4255-b2ae-1966da262a0d-kube-api-access-ntdfg\") pod \"nova-cell0-conductor-0\" (UID: \"d570ab6f-6c5f-4255-b2ae-1966da262a0d\") " pod="openstack/nova-cell0-conductor-0" Mar 20 08:47:40 crc kubenswrapper[4903]: I0320 08:47:40.717803 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 20 08:47:41 crc kubenswrapper[4903]: I0320 08:47:41.240510 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 20 08:47:41 crc kubenswrapper[4903]: I0320 08:47:41.295577 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"d570ab6f-6c5f-4255-b2ae-1966da262a0d","Type":"ContainerStarted","Data":"3b32e8e9be55850eb9afeb0b7dffe3030dac267dd2f1642dc8a6c6337513227e"} Mar 20 08:47:41 crc kubenswrapper[4903]: I0320 08:47:41.508990 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a0db460-181b-48cb-84dc-d4996e2280c2" path="/var/lib/kubelet/pods/7a0db460-181b-48cb-84dc-d4996e2280c2/volumes" Mar 20 08:47:42 crc kubenswrapper[4903]: I0320 08:47:42.311351 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"d570ab6f-6c5f-4255-b2ae-1966da262a0d","Type":"ContainerStarted","Data":"bdd5b2050c318bfa21071aa9a58547dc85552f9ed34b3d557a8244e9e4292bce"} Mar 20 08:47:42 crc kubenswrapper[4903]: I0320 08:47:42.311938 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Mar 20 08:47:42 crc kubenswrapper[4903]: I0320 08:47:42.350283 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.350256695 podStartE2EDuration="2.350256695s" podCreationTimestamp="2026-03-20 08:47:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:47:42.338698185 +0000 UTC m=+1487.555598540" watchObservedRunningTime="2026-03-20 08:47:42.350256695 +0000 UTC m=+1487.567157010" Mar 20 08:47:43 crc kubenswrapper[4903]: I0320 08:47:43.336173 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Mar 20 08:47:50 crc kubenswrapper[4903]: I0320 08:47:50.748247 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Mar 20 08:47:51 crc kubenswrapper[4903]: I0320 08:47:51.279189 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-72j48"] Mar 20 08:47:51 crc kubenswrapper[4903]: I0320 08:47:51.299972 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-72j48" Mar 20 08:47:51 crc kubenswrapper[4903]: I0320 08:47:51.304008 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Mar 20 08:47:51 crc kubenswrapper[4903]: I0320 08:47:51.304212 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Mar 20 08:47:51 crc kubenswrapper[4903]: I0320 08:47:51.316081 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-72j48"] Mar 20 08:47:51 crc kubenswrapper[4903]: I0320 08:47:51.392900 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7ab050e7-f7f7-4a4a-ab49-b2601b269b48-scripts\") pod \"nova-cell0-cell-mapping-72j48\" (UID: \"7ab050e7-f7f7-4a4a-ab49-b2601b269b48\") " pod="openstack/nova-cell0-cell-mapping-72j48" Mar 20 08:47:51 crc kubenswrapper[4903]: I0320 08:47:51.393375 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ab050e7-f7f7-4a4a-ab49-b2601b269b48-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-72j48\" (UID: \"7ab050e7-f7f7-4a4a-ab49-b2601b269b48\") " pod="openstack/nova-cell0-cell-mapping-72j48" Mar 20 08:47:51 crc kubenswrapper[4903]: I0320 08:47:51.393420 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ab050e7-f7f7-4a4a-ab49-b2601b269b48-config-data\") pod \"nova-cell0-cell-mapping-72j48\" (UID: \"7ab050e7-f7f7-4a4a-ab49-b2601b269b48\") " pod="openstack/nova-cell0-cell-mapping-72j48" Mar 20 08:47:51 crc kubenswrapper[4903]: I0320 08:47:51.393497 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g954z\" (UniqueName: \"kubernetes.io/projected/7ab050e7-f7f7-4a4a-ab49-b2601b269b48-kube-api-access-g954z\") pod \"nova-cell0-cell-mapping-72j48\" (UID: \"7ab050e7-f7f7-4a4a-ab49-b2601b269b48\") " pod="openstack/nova-cell0-cell-mapping-72j48" Mar 20 08:47:51 crc kubenswrapper[4903]: I0320 08:47:51.443947 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 20 08:47:51 crc kubenswrapper[4903]: I0320 08:47:51.445420 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 20 08:47:51 crc kubenswrapper[4903]: I0320 08:47:51.450020 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 20 08:47:51 crc kubenswrapper[4903]: I0320 08:47:51.474779 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 20 08:47:51 crc kubenswrapper[4903]: I0320 08:47:51.494457 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ab050e7-f7f7-4a4a-ab49-b2601b269b48-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-72j48\" (UID: \"7ab050e7-f7f7-4a4a-ab49-b2601b269b48\") " pod="openstack/nova-cell0-cell-mapping-72j48" Mar 20 08:47:51 crc kubenswrapper[4903]: I0320 08:47:51.494498 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ab050e7-f7f7-4a4a-ab49-b2601b269b48-config-data\") pod \"nova-cell0-cell-mapping-72j48\" (UID: \"7ab050e7-f7f7-4a4a-ab49-b2601b269b48\") " pod="openstack/nova-cell0-cell-mapping-72j48" Mar 20 08:47:51 crc kubenswrapper[4903]: I0320 08:47:51.494550 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g954z\" (UniqueName: \"kubernetes.io/projected/7ab050e7-f7f7-4a4a-ab49-b2601b269b48-kube-api-access-g954z\") pod \"nova-cell0-cell-mapping-72j48\" (UID: \"7ab050e7-f7f7-4a4a-ab49-b2601b269b48\") " pod="openstack/nova-cell0-cell-mapping-72j48" Mar 20 08:47:51 crc kubenswrapper[4903]: I0320 08:47:51.494584 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7ab050e7-f7f7-4a4a-ab49-b2601b269b48-scripts\") pod \"nova-cell0-cell-mapping-72j48\" (UID: \"7ab050e7-f7f7-4a4a-ab49-b2601b269b48\") " pod="openstack/nova-cell0-cell-mapping-72j48" Mar 20 08:47:51 crc kubenswrapper[4903]: I0320 08:47:51.510715 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7ab050e7-f7f7-4a4a-ab49-b2601b269b48-scripts\") pod \"nova-cell0-cell-mapping-72j48\" (UID: \"7ab050e7-f7f7-4a4a-ab49-b2601b269b48\") " pod="openstack/nova-cell0-cell-mapping-72j48" Mar 20 08:47:51 crc kubenswrapper[4903]: I0320 08:47:51.510870 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ab050e7-f7f7-4a4a-ab49-b2601b269b48-config-data\") pod \"nova-cell0-cell-mapping-72j48\" (UID: \"7ab050e7-f7f7-4a4a-ab49-b2601b269b48\") " pod="openstack/nova-cell0-cell-mapping-72j48" Mar 20 08:47:51 crc kubenswrapper[4903]: I0320 08:47:51.519984 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g954z\" (UniqueName: \"kubernetes.io/projected/7ab050e7-f7f7-4a4a-ab49-b2601b269b48-kube-api-access-g954z\") pod \"nova-cell0-cell-mapping-72j48\" (UID: \"7ab050e7-f7f7-4a4a-ab49-b2601b269b48\") " pod="openstack/nova-cell0-cell-mapping-72j48" Mar 20 08:47:51 crc kubenswrapper[4903]: I0320 08:47:51.525945 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ab050e7-f7f7-4a4a-ab49-b2601b269b48-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-72j48\" (UID: \"7ab050e7-f7f7-4a4a-ab49-b2601b269b48\") " pod="openstack/nova-cell0-cell-mapping-72j48" Mar 20 08:47:51 crc kubenswrapper[4903]: I0320 08:47:51.543804 4903 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 20 08:47:51 crc kubenswrapper[4903]: I0320 08:47:51.544751 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 20 08:47:51 crc kubenswrapper[4903]: I0320 08:47:51.544834 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 20 08:47:51 crc kubenswrapper[4903]: I0320 08:47:51.558253 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Mar 20 08:47:51 crc kubenswrapper[4903]: I0320 08:47:51.596353 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b5c26ba-26fd-4ad8-bd73-de0c5faf7a82-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"1b5c26ba-26fd-4ad8-bd73-de0c5faf7a82\") " pod="openstack/nova-api-0" Mar 20 08:47:51 crc kubenswrapper[4903]: I0320 08:47:51.596397 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1b5c26ba-26fd-4ad8-bd73-de0c5faf7a82-logs\") pod \"nova-api-0\" (UID: \"1b5c26ba-26fd-4ad8-bd73-de0c5faf7a82\") " pod="openstack/nova-api-0" Mar 20 08:47:51 crc kubenswrapper[4903]: I0320 08:47:51.596461 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b5c26ba-26fd-4ad8-bd73-de0c5faf7a82-config-data\") pod \"nova-api-0\" (UID: \"1b5c26ba-26fd-4ad8-bd73-de0c5faf7a82\") " pod="openstack/nova-api-0" Mar 20 08:47:51 crc kubenswrapper[4903]: I0320 08:47:51.596591 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wwmqh\" (UniqueName: \"kubernetes.io/projected/1b5c26ba-26fd-4ad8-bd73-de0c5faf7a82-kube-api-access-wwmqh\") pod \"nova-api-0\" (UID: \"1b5c26ba-26fd-4ad8-bd73-de0c5faf7a82\") " pod="openstack/nova-api-0" Mar 20 08:47:51 crc kubenswrapper[4903]: I0320 08:47:51.640640 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-72j48" Mar 20 08:47:51 crc kubenswrapper[4903]: I0320 08:47:51.685113 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 20 08:47:51 crc kubenswrapper[4903]: I0320 08:47:51.686884 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 08:47:51 crc kubenswrapper[4903]: I0320 08:47:51.697486 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 20 08:47:51 crc kubenswrapper[4903]: I0320 08:47:51.699200 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b5c26ba-26fd-4ad8-bd73-de0c5faf7a82-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"1b5c26ba-26fd-4ad8-bd73-de0c5faf7a82\") " pod="openstack/nova-api-0" Mar 20 08:47:51 crc kubenswrapper[4903]: I0320 08:47:51.699234 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1b5c26ba-26fd-4ad8-bd73-de0c5faf7a82-logs\") pod \"nova-api-0\" (UID: \"1b5c26ba-26fd-4ad8-bd73-de0c5faf7a82\") " pod="openstack/nova-api-0" Mar 20 08:47:51 crc kubenswrapper[4903]: I0320 08:47:51.699275 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b5c26ba-26fd-4ad8-bd73-de0c5faf7a82-config-data\") pod \"nova-api-0\" (UID: \"1b5c26ba-26fd-4ad8-bd73-de0c5faf7a82\") " pod="openstack/nova-api-0" Mar 20 08:47:51 crc kubenswrapper[4903]: I0320 08:47:51.699307 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbeea61a-87b9-4a74-ab2b-78e4eceff3e4-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"fbeea61a-87b9-4a74-ab2b-78e4eceff3e4\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 08:47:51 crc kubenswrapper[4903]: I0320 08:47:51.699336 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fbeea61a-87b9-4a74-ab2b-78e4eceff3e4-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"fbeea61a-87b9-4a74-ab2b-78e4eceff3e4\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 08:47:51 crc kubenswrapper[4903]: I0320 08:47:51.699381 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wwmqh\" (UniqueName: \"kubernetes.io/projected/1b5c26ba-26fd-4ad8-bd73-de0c5faf7a82-kube-api-access-wwmqh\") pod \"nova-api-0\" (UID: \"1b5c26ba-26fd-4ad8-bd73-de0c5faf7a82\") " pod="openstack/nova-api-0" Mar 20 08:47:51 crc kubenswrapper[4903]: I0320 08:47:51.699417 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9v5t4\" (UniqueName: \"kubernetes.io/projected/fbeea61a-87b9-4a74-ab2b-78e4eceff3e4-kube-api-access-9v5t4\") pod \"nova-cell1-novncproxy-0\" (UID: \"fbeea61a-87b9-4a74-ab2b-78e4eceff3e4\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 08:47:51 crc kubenswrapper[4903]: I0320 08:47:51.699939 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1b5c26ba-26fd-4ad8-bd73-de0c5faf7a82-logs\") pod \"nova-api-0\" (UID: \"1b5c26ba-26fd-4ad8-bd73-de0c5faf7a82\") " pod="openstack/nova-api-0" Mar 20 08:47:51 crc kubenswrapper[4903]: I0320 08:47:51.704612 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b5c26ba-26fd-4ad8-bd73-de0c5faf7a82-config-data\") pod \"nova-api-0\" (UID: \"1b5c26ba-26fd-4ad8-bd73-de0c5faf7a82\") " pod="openstack/nova-api-0" Mar 20 08:47:51 crc kubenswrapper[4903]: I0320 
08:47:51.714244 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b5c26ba-26fd-4ad8-bd73-de0c5faf7a82-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"1b5c26ba-26fd-4ad8-bd73-de0c5faf7a82\") " pod="openstack/nova-api-0" Mar 20 08:47:51 crc kubenswrapper[4903]: I0320 08:47:51.714918 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 08:47:51 crc kubenswrapper[4903]: I0320 08:47:51.750087 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 08:47:51 crc kubenswrapper[4903]: I0320 08:47:51.751751 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 20 08:47:51 crc kubenswrapper[4903]: I0320 08:47:51.760385 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wwmqh\" (UniqueName: \"kubernetes.io/projected/1b5c26ba-26fd-4ad8-bd73-de0c5faf7a82-kube-api-access-wwmqh\") pod \"nova-api-0\" (UID: \"1b5c26ba-26fd-4ad8-bd73-de0c5faf7a82\") " pod="openstack/nova-api-0" Mar 20 08:47:51 crc kubenswrapper[4903]: I0320 08:47:51.761401 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 20 08:47:51 crc kubenswrapper[4903]: I0320 08:47:51.782221 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 20 08:47:51 crc kubenswrapper[4903]: I0320 08:47:51.800969 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e5503e6-7030-40fa-b8f5-88476868ba0d-config-data\") pod \"nova-metadata-0\" (UID: \"6e5503e6-7030-40fa-b8f5-88476868ba0d\") " pod="openstack/nova-metadata-0" Mar 20 08:47:51 crc kubenswrapper[4903]: I0320 08:47:51.801073 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbeea61a-87b9-4a74-ab2b-78e4eceff3e4-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"fbeea61a-87b9-4a74-ab2b-78e4eceff3e4\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 08:47:51 crc kubenswrapper[4903]: I0320 08:47:51.801104 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fbeea61a-87b9-4a74-ab2b-78e4eceff3e4-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"fbeea61a-87b9-4a74-ab2b-78e4eceff3e4\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 08:47:51 crc kubenswrapper[4903]: I0320 08:47:51.801141 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dbklx\" (UniqueName: \"kubernetes.io/projected/6e5503e6-7030-40fa-b8f5-88476868ba0d-kube-api-access-dbklx\") pod \"nova-metadata-0\" (UID: \"6e5503e6-7030-40fa-b8f5-88476868ba0d\") " pod="openstack/nova-metadata-0" Mar 20 08:47:51 crc kubenswrapper[4903]: I0320 08:47:51.801166 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6e5503e6-7030-40fa-b8f5-88476868ba0d-logs\") pod \"nova-metadata-0\" (UID: \"6e5503e6-7030-40fa-b8f5-88476868ba0d\") " pod="openstack/nova-metadata-0" Mar 20 08:47:51 crc kubenswrapper[4903]: I0320 08:47:51.801187 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/6e5503e6-7030-40fa-b8f5-88476868ba0d-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"6e5503e6-7030-40fa-b8f5-88476868ba0d\") " pod="openstack/nova-metadata-0" Mar 20 08:47:51 crc kubenswrapper[4903]: I0320 08:47:51.801219 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9v5t4\" (UniqueName: \"kubernetes.io/projected/fbeea61a-87b9-4a74-ab2b-78e4eceff3e4-kube-api-access-9v5t4\") pod \"nova-cell1-novncproxy-0\" (UID: \"fbeea61a-87b9-4a74-ab2b-78e4eceff3e4\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 08:47:51 crc kubenswrapper[4903]: I0320 08:47:51.804622 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbeea61a-87b9-4a74-ab2b-78e4eceff3e4-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"fbeea61a-87b9-4a74-ab2b-78e4eceff3e4\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 08:47:51 crc kubenswrapper[4903]: I0320 08:47:51.813693 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fbeea61a-87b9-4a74-ab2b-78e4eceff3e4-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"fbeea61a-87b9-4a74-ab2b-78e4eceff3e4\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 08:47:51 crc kubenswrapper[4903]: I0320 08:47:51.828459 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 08:47:51 crc kubenswrapper[4903]: I0320 08:47:51.864866 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9v5t4\" (UniqueName: \"kubernetes.io/projected/fbeea61a-87b9-4a74-ab2b-78e4eceff3e4-kube-api-access-9v5t4\") pod \"nova-cell1-novncproxy-0\" (UID: \"fbeea61a-87b9-4a74-ab2b-78e4eceff3e4\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 08:47:51 crc kubenswrapper[4903]: I0320 08:47:51.883271 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-lrmtk"] Mar 20 08:47:51 crc kubenswrapper[4903]: I0320 08:47:51.884850 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-757b4f8459-lrmtk" Mar 20 08:47:51 crc kubenswrapper[4903]: I0320 08:47:51.894887 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-lrmtk"] Mar 20 08:47:51 crc kubenswrapper[4903]: I0320 08:47:51.908803 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ww58l\" (UniqueName: \"kubernetes.io/projected/49e6a095-3c2f-424c-9bd5-3b59e58550ea-kube-api-access-ww58l\") pod \"nova-scheduler-0\" (UID: \"49e6a095-3c2f-424c-9bd5-3b59e58550ea\") " pod="openstack/nova-scheduler-0" Mar 20 08:47:51 crc kubenswrapper[4903]: I0320 08:47:51.908928 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dbklx\" (UniqueName: \"kubernetes.io/projected/6e5503e6-7030-40fa-b8f5-88476868ba0d-kube-api-access-dbklx\") pod \"nova-metadata-0\" (UID: \"6e5503e6-7030-40fa-b8f5-88476868ba0d\") " pod="openstack/nova-metadata-0" Mar 20 08:47:51 crc kubenswrapper[4903]: I0320 08:47:51.908962 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6e5503e6-7030-40fa-b8f5-88476868ba0d-logs\") pod \"nova-metadata-0\" (UID: \"6e5503e6-7030-40fa-b8f5-88476868ba0d\") " pod="openstack/nova-metadata-0" Mar 20 08:47:51 crc kubenswrapper[4903]: I0320 08:47:51.908988 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e5503e6-7030-40fa-b8f5-88476868ba0d-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"6e5503e6-7030-40fa-b8f5-88476868ba0d\") " pod="openstack/nova-metadata-0" Mar 20 08:47:51 crc kubenswrapper[4903]: I0320 08:47:51.909047 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49e6a095-3c2f-424c-9bd5-3b59e58550ea-config-data\") pod \"nova-scheduler-0\" (UID: \"49e6a095-3c2f-424c-9bd5-3b59e58550ea\") " pod="openstack/nova-scheduler-0" Mar 20 08:47:51 crc kubenswrapper[4903]: I0320 08:47:51.909120 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e5503e6-7030-40fa-b8f5-88476868ba0d-config-data\") pod \"nova-metadata-0\" (UID: \"6e5503e6-7030-40fa-b8f5-88476868ba0d\") " pod="openstack/nova-metadata-0" Mar 20 08:47:51 crc kubenswrapper[4903]: I0320 08:47:51.909163 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49e6a095-3c2f-424c-9bd5-3b59e58550ea-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"49e6a095-3c2f-424c-9bd5-3b59e58550ea\") " pod="openstack/nova-scheduler-0" Mar 20 08:47:51 crc kubenswrapper[4903]: I0320 08:47:51.909992 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6e5503e6-7030-40fa-b8f5-88476868ba0d-logs\") pod \"nova-metadata-0\" (UID: \"6e5503e6-7030-40fa-b8f5-88476868ba0d\") " pod="openstack/nova-metadata-0" Mar 20 08:47:51 crc kubenswrapper[4903]: I0320 08:47:51.915093 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e5503e6-7030-40fa-b8f5-88476868ba0d-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"6e5503e6-7030-40fa-b8f5-88476868ba0d\") " pod="openstack/nova-metadata-0" Mar 20 08:47:51 crc 
kubenswrapper[4903]: I0320 08:47:51.934962 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 20 08:47:51 crc kubenswrapper[4903]: I0320 08:47:51.960275 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e5503e6-7030-40fa-b8f5-88476868ba0d-config-data\") pod \"nova-metadata-0\" (UID: \"6e5503e6-7030-40fa-b8f5-88476868ba0d\") " pod="openstack/nova-metadata-0" Mar 20 08:47:51 crc kubenswrapper[4903]: I0320 08:47:51.969368 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dbklx\" (UniqueName: \"kubernetes.io/projected/6e5503e6-7030-40fa-b8f5-88476868ba0d-kube-api-access-dbklx\") pod \"nova-metadata-0\" (UID: \"6e5503e6-7030-40fa-b8f5-88476868ba0d\") " pod="openstack/nova-metadata-0" Mar 20 08:47:52 crc kubenswrapper[4903]: I0320 08:47:52.012779 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49e6a095-3c2f-424c-9bd5-3b59e58550ea-config-data\") pod \"nova-scheduler-0\" (UID: \"49e6a095-3c2f-424c-9bd5-3b59e58550ea\") " pod="openstack/nova-scheduler-0" Mar 20 08:47:52 crc kubenswrapper[4903]: I0320 08:47:52.012871 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3610fd9b-fd0a-4b22-8a26-27a393bf92a6-config\") pod \"dnsmasq-dns-757b4f8459-lrmtk\" (UID: \"3610fd9b-fd0a-4b22-8a26-27a393bf92a6\") " pod="openstack/dnsmasq-dns-757b4f8459-lrmtk" Mar 20 08:47:52 crc kubenswrapper[4903]: I0320 08:47:52.012902 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3610fd9b-fd0a-4b22-8a26-27a393bf92a6-ovsdbserver-sb\") pod \"dnsmasq-dns-757b4f8459-lrmtk\" (UID: \"3610fd9b-fd0a-4b22-8a26-27a393bf92a6\") " pod="openstack/dnsmasq-dns-757b4f8459-lrmtk" Mar 20 08:47:52 crc kubenswrapper[4903]: I0320 08:47:52.012963 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49e6a095-3c2f-424c-9bd5-3b59e58550ea-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"49e6a095-3c2f-424c-9bd5-3b59e58550ea\") " pod="openstack/nova-scheduler-0" Mar 20 08:47:52 crc kubenswrapper[4903]: I0320 08:47:52.012995 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cn5gt\" (UniqueName: \"kubernetes.io/projected/3610fd9b-fd0a-4b22-8a26-27a393bf92a6-kube-api-access-cn5gt\") pod \"dnsmasq-dns-757b4f8459-lrmtk\" (UID: \"3610fd9b-fd0a-4b22-8a26-27a393bf92a6\") " pod="openstack/dnsmasq-dns-757b4f8459-lrmtk" Mar 20 08:47:52 crc kubenswrapper[4903]: I0320 08:47:52.013049 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ww58l\" (UniqueName: \"kubernetes.io/projected/49e6a095-3c2f-424c-9bd5-3b59e58550ea-kube-api-access-ww58l\") pod \"nova-scheduler-0\" (UID: \"49e6a095-3c2f-424c-9bd5-3b59e58550ea\") " pod="openstack/nova-scheduler-0" Mar 20 08:47:52 crc kubenswrapper[4903]: I0320 08:47:52.013075 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3610fd9b-fd0a-4b22-8a26-27a393bf92a6-ovsdbserver-nb\") pod \"dnsmasq-dns-757b4f8459-lrmtk\" (UID: 
\"3610fd9b-fd0a-4b22-8a26-27a393bf92a6\") " pod="openstack/dnsmasq-dns-757b4f8459-lrmtk" Mar 20 08:47:52 crc kubenswrapper[4903]: I0320 08:47:52.014658 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3610fd9b-fd0a-4b22-8a26-27a393bf92a6-dns-svc\") pod \"dnsmasq-dns-757b4f8459-lrmtk\" (UID: \"3610fd9b-fd0a-4b22-8a26-27a393bf92a6\") " pod="openstack/dnsmasq-dns-757b4f8459-lrmtk" Mar 20 08:47:52 crc kubenswrapper[4903]: I0320 08:47:52.014691 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3610fd9b-fd0a-4b22-8a26-27a393bf92a6-dns-swift-storage-0\") pod \"dnsmasq-dns-757b4f8459-lrmtk\" (UID: \"3610fd9b-fd0a-4b22-8a26-27a393bf92a6\") " pod="openstack/dnsmasq-dns-757b4f8459-lrmtk" Mar 20 08:47:52 crc kubenswrapper[4903]: I0320 08:47:52.029177 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49e6a095-3c2f-424c-9bd5-3b59e58550ea-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"49e6a095-3c2f-424c-9bd5-3b59e58550ea\") " pod="openstack/nova-scheduler-0" Mar 20 08:47:52 crc kubenswrapper[4903]: I0320 08:47:52.043864 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49e6a095-3c2f-424c-9bd5-3b59e58550ea-config-data\") pod \"nova-scheduler-0\" (UID: \"49e6a095-3c2f-424c-9bd5-3b59e58550ea\") " pod="openstack/nova-scheduler-0" Mar 20 08:47:52 crc kubenswrapper[4903]: I0320 08:47:52.046740 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ww58l\" (UniqueName: \"kubernetes.io/projected/49e6a095-3c2f-424c-9bd5-3b59e58550ea-kube-api-access-ww58l\") pod \"nova-scheduler-0\" (UID: \"49e6a095-3c2f-424c-9bd5-3b59e58550ea\") " pod="openstack/nova-scheduler-0" Mar 20 08:47:52 crc kubenswrapper[4903]: I0320 08:47:52.116862 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cn5gt\" (UniqueName: \"kubernetes.io/projected/3610fd9b-fd0a-4b22-8a26-27a393bf92a6-kube-api-access-cn5gt\") pod \"dnsmasq-dns-757b4f8459-lrmtk\" (UID: \"3610fd9b-fd0a-4b22-8a26-27a393bf92a6\") " pod="openstack/dnsmasq-dns-757b4f8459-lrmtk" Mar 20 08:47:52 crc kubenswrapper[4903]: I0320 08:47:52.116909 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3610fd9b-fd0a-4b22-8a26-27a393bf92a6-ovsdbserver-nb\") pod \"dnsmasq-dns-757b4f8459-lrmtk\" (UID: \"3610fd9b-fd0a-4b22-8a26-27a393bf92a6\") " pod="openstack/dnsmasq-dns-757b4f8459-lrmtk" Mar 20 08:47:52 crc kubenswrapper[4903]: I0320 08:47:52.116954 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3610fd9b-fd0a-4b22-8a26-27a393bf92a6-dns-svc\") pod \"dnsmasq-dns-757b4f8459-lrmtk\" (UID: \"3610fd9b-fd0a-4b22-8a26-27a393bf92a6\") " pod="openstack/dnsmasq-dns-757b4f8459-lrmtk" Mar 20 08:47:52 crc kubenswrapper[4903]: I0320 08:47:52.116978 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3610fd9b-fd0a-4b22-8a26-27a393bf92a6-dns-swift-storage-0\") pod \"dnsmasq-dns-757b4f8459-lrmtk\" (UID: \"3610fd9b-fd0a-4b22-8a26-27a393bf92a6\") " pod="openstack/dnsmasq-dns-757b4f8459-lrmtk" Mar 20 
08:47:52 crc kubenswrapper[4903]: I0320 08:47:52.117091 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3610fd9b-fd0a-4b22-8a26-27a393bf92a6-config\") pod \"dnsmasq-dns-757b4f8459-lrmtk\" (UID: \"3610fd9b-fd0a-4b22-8a26-27a393bf92a6\") " pod="openstack/dnsmasq-dns-757b4f8459-lrmtk" Mar 20 08:47:52 crc kubenswrapper[4903]: I0320 08:47:52.117111 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3610fd9b-fd0a-4b22-8a26-27a393bf92a6-ovsdbserver-sb\") pod \"dnsmasq-dns-757b4f8459-lrmtk\" (UID: \"3610fd9b-fd0a-4b22-8a26-27a393bf92a6\") " pod="openstack/dnsmasq-dns-757b4f8459-lrmtk" Mar 20 08:47:52 crc kubenswrapper[4903]: I0320 08:47:52.117920 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3610fd9b-fd0a-4b22-8a26-27a393bf92a6-ovsdbserver-sb\") pod \"dnsmasq-dns-757b4f8459-lrmtk\" (UID: \"3610fd9b-fd0a-4b22-8a26-27a393bf92a6\") " pod="openstack/dnsmasq-dns-757b4f8459-lrmtk" Mar 20 08:47:52 crc kubenswrapper[4903]: I0320 08:47:52.119099 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3610fd9b-fd0a-4b22-8a26-27a393bf92a6-dns-svc\") pod \"dnsmasq-dns-757b4f8459-lrmtk\" (UID: \"3610fd9b-fd0a-4b22-8a26-27a393bf92a6\") " pod="openstack/dnsmasq-dns-757b4f8459-lrmtk" Mar 20 08:47:52 crc kubenswrapper[4903]: I0320 08:47:52.119583 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3610fd9b-fd0a-4b22-8a26-27a393bf92a6-ovsdbserver-nb\") pod \"dnsmasq-dns-757b4f8459-lrmtk\" (UID: \"3610fd9b-fd0a-4b22-8a26-27a393bf92a6\") " pod="openstack/dnsmasq-dns-757b4f8459-lrmtk" Mar 20 08:47:52 crc kubenswrapper[4903]: I0320 08:47:52.119769 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3610fd9b-fd0a-4b22-8a26-27a393bf92a6-dns-swift-storage-0\") pod \"dnsmasq-dns-757b4f8459-lrmtk\" (UID: \"3610fd9b-fd0a-4b22-8a26-27a393bf92a6\") " pod="openstack/dnsmasq-dns-757b4f8459-lrmtk" Mar 20 08:47:52 crc kubenswrapper[4903]: I0320 08:47:52.120067 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3610fd9b-fd0a-4b22-8a26-27a393bf92a6-config\") pod \"dnsmasq-dns-757b4f8459-lrmtk\" (UID: \"3610fd9b-fd0a-4b22-8a26-27a393bf92a6\") " pod="openstack/dnsmasq-dns-757b4f8459-lrmtk" Mar 20 08:47:52 crc kubenswrapper[4903]: I0320 08:47:52.139511 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 08:47:52 crc kubenswrapper[4903]: I0320 08:47:52.146126 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cn5gt\" (UniqueName: \"kubernetes.io/projected/3610fd9b-fd0a-4b22-8a26-27a393bf92a6-kube-api-access-cn5gt\") pod \"dnsmasq-dns-757b4f8459-lrmtk\" (UID: \"3610fd9b-fd0a-4b22-8a26-27a393bf92a6\") " pod="openstack/dnsmasq-dns-757b4f8459-lrmtk" Mar 20 08:47:52 crc kubenswrapper[4903]: I0320 08:47:52.168803 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 20 08:47:52 crc kubenswrapper[4903]: I0320 08:47:52.234627 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-757b4f8459-lrmtk" Mar 20 08:47:52 crc kubenswrapper[4903]: I0320 08:47:52.469572 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 20 08:47:52 crc kubenswrapper[4903]: I0320 08:47:52.514952 4903 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 08:47:52 crc kubenswrapper[4903]: I0320 08:47:52.560121 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-72j48"] Mar 20 08:47:52 crc kubenswrapper[4903]: I0320 08:47:52.681143 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-4bdn7"] Mar 20 08:47:52 crc kubenswrapper[4903]: I0320 08:47:52.686464 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-4bdn7" Mar 20 08:47:52 crc kubenswrapper[4903]: I0320 08:47:52.689299 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Mar 20 08:47:52 crc kubenswrapper[4903]: I0320 08:47:52.689652 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Mar 20 08:47:52 crc kubenswrapper[4903]: I0320 08:47:52.718211 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-4bdn7"] Mar 20 08:47:52 crc kubenswrapper[4903]: I0320 08:47:52.741025 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 20 08:47:52 crc kubenswrapper[4903]: W0320 08:47:52.753612 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfbeea61a_87b9_4a74_ab2b_78e4eceff3e4.slice/crio-f97e9ab87926808ccd7d4f5e30afa73f826b84b5e0c50a9657f7aff6ba86d2e4 WatchSource:0}: Error finding container f97e9ab87926808ccd7d4f5e30afa73f826b84b5e0c50a9657f7aff6ba86d2e4: Status 404 returned error can't find the container with id f97e9ab87926808ccd7d4f5e30afa73f826b84b5e0c50a9657f7aff6ba86d2e4 Mar 20 08:47:52 crc kubenswrapper[4903]: I0320 08:47:52.814960 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 08:47:52 crc kubenswrapper[4903]: I0320 08:47:52.846847 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ntkrh\" (UniqueName: \"kubernetes.io/projected/bfaa735a-cdf6-40c4-ab7f-42605e13127c-kube-api-access-ntkrh\") pod \"nova-cell1-conductor-db-sync-4bdn7\" (UID: \"bfaa735a-cdf6-40c4-ab7f-42605e13127c\") " pod="openstack/nova-cell1-conductor-db-sync-4bdn7" Mar 20 08:47:52 crc kubenswrapper[4903]: I0320 08:47:52.848198 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bfaa735a-cdf6-40c4-ab7f-42605e13127c-scripts\") pod \"nova-cell1-conductor-db-sync-4bdn7\" (UID: \"bfaa735a-cdf6-40c4-ab7f-42605e13127c\") " pod="openstack/nova-cell1-conductor-db-sync-4bdn7" Mar 20 08:47:52 crc kubenswrapper[4903]: I0320 08:47:52.848475 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bfaa735a-cdf6-40c4-ab7f-42605e13127c-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-4bdn7\" (UID: \"bfaa735a-cdf6-40c4-ab7f-42605e13127c\") " pod="openstack/nova-cell1-conductor-db-sync-4bdn7" Mar 20 08:47:52 crc 
kubenswrapper[4903]: I0320 08:47:52.848534 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bfaa735a-cdf6-40c4-ab7f-42605e13127c-config-data\") pod \"nova-cell1-conductor-db-sync-4bdn7\" (UID: \"bfaa735a-cdf6-40c4-ab7f-42605e13127c\") " pod="openstack/nova-cell1-conductor-db-sync-4bdn7" Mar 20 08:47:52 crc kubenswrapper[4903]: W0320 08:47:52.872947 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod49e6a095_3c2f_424c_9bd5_3b59e58550ea.slice/crio-fd9263165ffef2ed52947676f95a8683d4b64bb577b75a55fb92319127e4d603 WatchSource:0}: Error finding container fd9263165ffef2ed52947676f95a8683d4b64bb577b75a55fb92319127e4d603: Status 404 returned error can't find the container with id fd9263165ffef2ed52947676f95a8683d4b64bb577b75a55fb92319127e4d603 Mar 20 08:47:52 crc kubenswrapper[4903]: I0320 08:47:52.882247 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 08:47:52 crc kubenswrapper[4903]: I0320 08:47:52.891611 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-lrmtk"] Mar 20 08:47:52 crc kubenswrapper[4903]: I0320 08:47:52.950730 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ntkrh\" (UniqueName: \"kubernetes.io/projected/bfaa735a-cdf6-40c4-ab7f-42605e13127c-kube-api-access-ntkrh\") pod \"nova-cell1-conductor-db-sync-4bdn7\" (UID: \"bfaa735a-cdf6-40c4-ab7f-42605e13127c\") " pod="openstack/nova-cell1-conductor-db-sync-4bdn7" Mar 20 08:47:52 crc kubenswrapper[4903]: I0320 08:47:52.950833 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bfaa735a-cdf6-40c4-ab7f-42605e13127c-scripts\") pod \"nova-cell1-conductor-db-sync-4bdn7\" (UID: \"bfaa735a-cdf6-40c4-ab7f-42605e13127c\") " pod="openstack/nova-cell1-conductor-db-sync-4bdn7" Mar 20 08:47:52 crc kubenswrapper[4903]: I0320 08:47:52.950895 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bfaa735a-cdf6-40c4-ab7f-42605e13127c-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-4bdn7\" (UID: \"bfaa735a-cdf6-40c4-ab7f-42605e13127c\") " pod="openstack/nova-cell1-conductor-db-sync-4bdn7" Mar 20 08:47:52 crc kubenswrapper[4903]: I0320 08:47:52.950918 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bfaa735a-cdf6-40c4-ab7f-42605e13127c-config-data\") pod \"nova-cell1-conductor-db-sync-4bdn7\" (UID: \"bfaa735a-cdf6-40c4-ab7f-42605e13127c\") " pod="openstack/nova-cell1-conductor-db-sync-4bdn7" Mar 20 08:47:52 crc kubenswrapper[4903]: I0320 08:47:52.955908 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bfaa735a-cdf6-40c4-ab7f-42605e13127c-config-data\") pod \"nova-cell1-conductor-db-sync-4bdn7\" (UID: \"bfaa735a-cdf6-40c4-ab7f-42605e13127c\") " pod="openstack/nova-cell1-conductor-db-sync-4bdn7" Mar 20 08:47:52 crc kubenswrapper[4903]: I0320 08:47:52.960884 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bfaa735a-cdf6-40c4-ab7f-42605e13127c-scripts\") pod \"nova-cell1-conductor-db-sync-4bdn7\" (UID: \"bfaa735a-cdf6-40c4-ab7f-42605e13127c\") " 
pod="openstack/nova-cell1-conductor-db-sync-4bdn7" Mar 20 08:47:52 crc kubenswrapper[4903]: I0320 08:47:52.961649 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bfaa735a-cdf6-40c4-ab7f-42605e13127c-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-4bdn7\" (UID: \"bfaa735a-cdf6-40c4-ab7f-42605e13127c\") " pod="openstack/nova-cell1-conductor-db-sync-4bdn7" Mar 20 08:47:52 crc kubenswrapper[4903]: I0320 08:47:52.969694 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ntkrh\" (UniqueName: \"kubernetes.io/projected/bfaa735a-cdf6-40c4-ab7f-42605e13127c-kube-api-access-ntkrh\") pod \"nova-cell1-conductor-db-sync-4bdn7\" (UID: \"bfaa735a-cdf6-40c4-ab7f-42605e13127c\") " pod="openstack/nova-cell1-conductor-db-sync-4bdn7" Mar 20 08:47:53 crc kubenswrapper[4903]: I0320 08:47:53.126469 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-4bdn7" Mar 20 08:47:53 crc kubenswrapper[4903]: I0320 08:47:53.489162 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-72j48" event={"ID":"7ab050e7-f7f7-4a4a-ab49-b2601b269b48","Type":"ContainerStarted","Data":"ee90dea62199fe22bb435b923f62eb94b35ac85058152fc088a3cfbcfbe6f805"} Mar 20 08:47:53 crc kubenswrapper[4903]: I0320 08:47:53.489560 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-72j48" event={"ID":"7ab050e7-f7f7-4a4a-ab49-b2601b269b48","Type":"ContainerStarted","Data":"1a358934a556be5fc0711f183bf93cd8b9a1de2469d08dcda0239d0afbf63abd"} Mar 20 08:47:53 crc kubenswrapper[4903]: I0320 08:47:53.519392 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1b5c26ba-26fd-4ad8-bd73-de0c5faf7a82","Type":"ContainerStarted","Data":"86f51b773891445939dfe718ff1ba6015122b275456c6cdab0e3e719734c16fe"} Mar 20 08:47:53 crc kubenswrapper[4903]: I0320 08:47:53.519444 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"49e6a095-3c2f-424c-9bd5-3b59e58550ea","Type":"ContainerStarted","Data":"fd9263165ffef2ed52947676f95a8683d4b64bb577b75a55fb92319127e4d603"} Mar 20 08:47:53 crc kubenswrapper[4903]: I0320 08:47:53.519467 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"fbeea61a-87b9-4a74-ab2b-78e4eceff3e4","Type":"ContainerStarted","Data":"f97e9ab87926808ccd7d4f5e30afa73f826b84b5e0c50a9657f7aff6ba86d2e4"} Mar 20 08:47:53 crc kubenswrapper[4903]: I0320 08:47:53.519893 4903 generic.go:334] "Generic (PLEG): container finished" podID="3610fd9b-fd0a-4b22-8a26-27a393bf92a6" containerID="f9119b30af39e74f4c4c5de9e7696181e19478c65c222d393b4dcce841741426" exitCode=0 Mar 20 08:47:53 crc kubenswrapper[4903]: I0320 08:47:53.519994 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-lrmtk" event={"ID":"3610fd9b-fd0a-4b22-8a26-27a393bf92a6","Type":"ContainerDied","Data":"f9119b30af39e74f4c4c5de9e7696181e19478c65c222d393b4dcce841741426"} Mar 20 08:47:53 crc kubenswrapper[4903]: I0320 08:47:53.520109 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-lrmtk" event={"ID":"3610fd9b-fd0a-4b22-8a26-27a393bf92a6","Type":"ContainerStarted","Data":"00784d5ee33314c79f8dbef53df595dea5427963b8de41b191ff04c375b7f027"} Mar 20 08:47:53 crc kubenswrapper[4903]: I0320 08:47:53.523222 4903 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6e5503e6-7030-40fa-b8f5-88476868ba0d","Type":"ContainerStarted","Data":"150f7e8fd339b7e30774738595f07278cbd963a7a2db3827f0fc5c95f4f569f0"} Mar 20 08:47:53 crc kubenswrapper[4903]: I0320 08:47:53.545407 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-72j48" podStartSLOduration=2.545388549 podStartE2EDuration="2.545388549s" podCreationTimestamp="2026-03-20 08:47:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:47:53.517449894 +0000 UTC m=+1498.734350209" watchObservedRunningTime="2026-03-20 08:47:53.545388549 +0000 UTC m=+1498.762288864" Mar 20 08:47:53 crc kubenswrapper[4903]: I0320 08:47:53.627019 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-4bdn7"] Mar 20 08:47:54 crc kubenswrapper[4903]: I0320 08:47:54.546278 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-4bdn7" event={"ID":"bfaa735a-cdf6-40c4-ab7f-42605e13127c","Type":"ContainerStarted","Data":"d66dd250aac30527302661a2ae22a91d93dd0d6f8cc19055dd8957b946b6d63d"} Mar 20 08:47:54 crc kubenswrapper[4903]: I0320 08:47:54.547434 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-4bdn7" event={"ID":"bfaa735a-cdf6-40c4-ab7f-42605e13127c","Type":"ContainerStarted","Data":"e83a1c36cc7839da0c98e728bad997852477c03faf1d320fb46c91348c9afad7"} Mar 20 08:47:54 crc kubenswrapper[4903]: I0320 08:47:54.561449 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-lrmtk" event={"ID":"3610fd9b-fd0a-4b22-8a26-27a393bf92a6","Type":"ContainerStarted","Data":"b6c4df4e12e3e3f6595273ba5c82b0cc28aa6465d6ecf7368ce865b9f1105f1c"} Mar 20 08:47:54 crc kubenswrapper[4903]: I0320 08:47:54.561901 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-757b4f8459-lrmtk" Mar 20 08:47:54 crc kubenswrapper[4903]: I0320 08:47:54.579905 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-4bdn7" podStartSLOduration=2.579877979 podStartE2EDuration="2.579877979s" podCreationTimestamp="2026-03-20 08:47:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:47:54.563493437 +0000 UTC m=+1499.780393762" watchObservedRunningTime="2026-03-20 08:47:54.579877979 +0000 UTC m=+1499.796778304" Mar 20 08:47:55 crc kubenswrapper[4903]: I0320 08:47:55.544373 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-757b4f8459-lrmtk" podStartSLOduration=4.544350254 podStartE2EDuration="4.544350254s" podCreationTimestamp="2026-03-20 08:47:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:47:54.596913437 +0000 UTC m=+1499.813813782" watchObservedRunningTime="2026-03-20 08:47:55.544350254 +0000 UTC m=+1500.761250569" Mar 20 08:47:55 crc kubenswrapper[4903]: I0320 08:47:55.594021 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 08:47:55 crc kubenswrapper[4903]: I0320 08:47:55.638009 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/nova-cell1-novncproxy-0"] Mar 20 08:47:56 crc kubenswrapper[4903]: I0320 08:47:56.617919 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1b5c26ba-26fd-4ad8-bd73-de0c5faf7a82","Type":"ContainerStarted","Data":"67d0b1a192bf9b0cb84d48b1f38b29ccf08db31018b839819bf5bd66728a6189"} Mar 20 08:47:56 crc kubenswrapper[4903]: I0320 08:47:56.620295 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"49e6a095-3c2f-424c-9bd5-3b59e58550ea","Type":"ContainerStarted","Data":"b26c794e72d66a4bd3e16c59e65cb95bd12e7b96ab3cb5149c5daa4ecf5aa5a6"} Mar 20 08:47:56 crc kubenswrapper[4903]: I0320 08:47:56.632116 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"fbeea61a-87b9-4a74-ab2b-78e4eceff3e4","Type":"ContainerStarted","Data":"c983d1eeaa2a317147c78e9b95de74cb47d6f3e9a657819edf2a16feae798230"} Mar 20 08:47:56 crc kubenswrapper[4903]: I0320 08:47:56.633116 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="fbeea61a-87b9-4a74-ab2b-78e4eceff3e4" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://c983d1eeaa2a317147c78e9b95de74cb47d6f3e9a657819edf2a16feae798230" gracePeriod=30 Mar 20 08:47:56 crc kubenswrapper[4903]: I0320 08:47:56.639170 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6e5503e6-7030-40fa-b8f5-88476868ba0d","Type":"ContainerStarted","Data":"08a894ad7cfd6c3a1c29cd812a76be8bb5b596f7d5bfd720ad60a0e168fd7a96"} Mar 20 08:47:56 crc kubenswrapper[4903]: I0320 08:47:56.639365 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="6e5503e6-7030-40fa-b8f5-88476868ba0d" containerName="nova-metadata-log" containerID="cri-o://08a894ad7cfd6c3a1c29cd812a76be8bb5b596f7d5bfd720ad60a0e168fd7a96" gracePeriod=30 Mar 20 08:47:56 crc kubenswrapper[4903]: I0320 08:47:56.639481 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="6e5503e6-7030-40fa-b8f5-88476868ba0d" containerName="nova-metadata-metadata" containerID="cri-o://e4335e94975560c47e4da01c0e8e142b299cf426257328cfb82a22a9f7f75330" gracePeriod=30 Mar 20 08:47:56 crc kubenswrapper[4903]: I0320 08:47:56.657758 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.476256154 podStartE2EDuration="5.657729798s" podCreationTimestamp="2026-03-20 08:47:51 +0000 UTC" firstStartedPulling="2026-03-20 08:47:52.875764203 +0000 UTC m=+1498.092664518" lastFinishedPulling="2026-03-20 08:47:56.057237847 +0000 UTC m=+1501.274138162" observedRunningTime="2026-03-20 08:47:56.647378865 +0000 UTC m=+1501.864279180" watchObservedRunningTime="2026-03-20 08:47:56.657729798 +0000 UTC m=+1501.874630123" Mar 20 08:47:56 crc kubenswrapper[4903]: I0320 08:47:56.697391 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.466665388 podStartE2EDuration="5.69736933s" podCreationTimestamp="2026-03-20 08:47:51 +0000 UTC" firstStartedPulling="2026-03-20 08:47:52.833931577 +0000 UTC m=+1498.050831892" lastFinishedPulling="2026-03-20 08:47:56.064635519 +0000 UTC m=+1501.281535834" observedRunningTime="2026-03-20 08:47:56.696007766 +0000 UTC m=+1501.912908081" watchObservedRunningTime="2026-03-20 08:47:56.69736933 +0000 UTC 
m=+1501.914269645" Mar 20 08:47:56 crc kubenswrapper[4903]: I0320 08:47:56.699167 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.400777783 podStartE2EDuration="5.699160154s" podCreationTimestamp="2026-03-20 08:47:51 +0000 UTC" firstStartedPulling="2026-03-20 08:47:52.759265417 +0000 UTC m=+1497.976165722" lastFinishedPulling="2026-03-20 08:47:56.057647778 +0000 UTC m=+1501.274548093" observedRunningTime="2026-03-20 08:47:56.67045454 +0000 UTC m=+1501.887354865" watchObservedRunningTime="2026-03-20 08:47:56.699160154 +0000 UTC m=+1501.916060469" Mar 20 08:47:56 crc kubenswrapper[4903]: I0320 08:47:56.935789 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Mar 20 08:47:57 crc kubenswrapper[4903]: I0320 08:47:57.170307 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 20 08:47:57 crc kubenswrapper[4903]: I0320 08:47:57.652380 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1b5c26ba-26fd-4ad8-bd73-de0c5faf7a82","Type":"ContainerStarted","Data":"74f32b262e8212c4e2f8e83ecb47fdfcc1ab663e70ea9fdf0514f8ae55bdd4d5"} Mar 20 08:47:57 crc kubenswrapper[4903]: I0320 08:47:57.658163 4903 generic.go:334] "Generic (PLEG): container finished" podID="6e5503e6-7030-40fa-b8f5-88476868ba0d" containerID="08a894ad7cfd6c3a1c29cd812a76be8bb5b596f7d5bfd720ad60a0e168fd7a96" exitCode=143 Mar 20 08:47:57 crc kubenswrapper[4903]: I0320 08:47:57.658205 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6e5503e6-7030-40fa-b8f5-88476868ba0d","Type":"ContainerStarted","Data":"e4335e94975560c47e4da01c0e8e142b299cf426257328cfb82a22a9f7f75330"} Mar 20 08:47:57 crc kubenswrapper[4903]: I0320 08:47:57.658250 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6e5503e6-7030-40fa-b8f5-88476868ba0d","Type":"ContainerDied","Data":"08a894ad7cfd6c3a1c29cd812a76be8bb5b596f7d5bfd720ad60a0e168fd7a96"} Mar 20 08:47:57 crc kubenswrapper[4903]: I0320 08:47:57.675672 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.126183167 podStartE2EDuration="6.675646223s" podCreationTimestamp="2026-03-20 08:47:51 +0000 UTC" firstStartedPulling="2026-03-20 08:47:52.514696201 +0000 UTC m=+1497.731596516" lastFinishedPulling="2026-03-20 08:47:56.064159267 +0000 UTC m=+1501.281059572" observedRunningTime="2026-03-20 08:47:57.670460815 +0000 UTC m=+1502.887361190" watchObservedRunningTime="2026-03-20 08:47:57.675646223 +0000 UTC m=+1502.892546538" Mar 20 08:48:00 crc kubenswrapper[4903]: I0320 08:48:00.155890 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566608-982dt"] Mar 20 08:48:00 crc kubenswrapper[4903]: I0320 08:48:00.158568 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566608-982dt" Mar 20 08:48:00 crc kubenswrapper[4903]: I0320 08:48:00.162554 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-smc2q" Mar 20 08:48:00 crc kubenswrapper[4903]: I0320 08:48:00.162781 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 08:48:00 crc kubenswrapper[4903]: I0320 08:48:00.163702 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 08:48:00 crc kubenswrapper[4903]: I0320 08:48:00.170705 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566608-982dt"] Mar 20 08:48:00 crc kubenswrapper[4903]: I0320 08:48:00.249553 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5cxpg\" (UniqueName: \"kubernetes.io/projected/403ddea3-182d-4078-ae29-8bf03ce54cb5-kube-api-access-5cxpg\") pod \"auto-csr-approver-29566608-982dt\" (UID: \"403ddea3-182d-4078-ae29-8bf03ce54cb5\") " pod="openshift-infra/auto-csr-approver-29566608-982dt" Mar 20 08:48:00 crc kubenswrapper[4903]: I0320 08:48:00.351512 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5cxpg\" (UniqueName: \"kubernetes.io/projected/403ddea3-182d-4078-ae29-8bf03ce54cb5-kube-api-access-5cxpg\") pod \"auto-csr-approver-29566608-982dt\" (UID: \"403ddea3-182d-4078-ae29-8bf03ce54cb5\") " pod="openshift-infra/auto-csr-approver-29566608-982dt" Mar 20 08:48:00 crc kubenswrapper[4903]: I0320 08:48:00.380565 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5cxpg\" (UniqueName: \"kubernetes.io/projected/403ddea3-182d-4078-ae29-8bf03ce54cb5-kube-api-access-5cxpg\") pod \"auto-csr-approver-29566608-982dt\" (UID: \"403ddea3-182d-4078-ae29-8bf03ce54cb5\") " pod="openshift-infra/auto-csr-approver-29566608-982dt" Mar 20 08:48:00 crc kubenswrapper[4903]: I0320 08:48:00.482662 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566608-982dt" Mar 20 08:48:00 crc kubenswrapper[4903]: I0320 08:48:00.976814 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566608-982dt"] Mar 20 08:48:01 crc kubenswrapper[4903]: I0320 08:48:01.730596 4903 generic.go:334] "Generic (PLEG): container finished" podID="bfaa735a-cdf6-40c4-ab7f-42605e13127c" containerID="d66dd250aac30527302661a2ae22a91d93dd0d6f8cc19055dd8957b946b6d63d" exitCode=0 Mar 20 08:48:01 crc kubenswrapper[4903]: I0320 08:48:01.730727 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-4bdn7" event={"ID":"bfaa735a-cdf6-40c4-ab7f-42605e13127c","Type":"ContainerDied","Data":"d66dd250aac30527302661a2ae22a91d93dd0d6f8cc19055dd8957b946b6d63d"} Mar 20 08:48:01 crc kubenswrapper[4903]: I0320 08:48:01.734238 4903 generic.go:334] "Generic (PLEG): container finished" podID="7ab050e7-f7f7-4a4a-ab49-b2601b269b48" containerID="ee90dea62199fe22bb435b923f62eb94b35ac85058152fc088a3cfbcfbe6f805" exitCode=0 Mar 20 08:48:01 crc kubenswrapper[4903]: I0320 08:48:01.734320 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-72j48" event={"ID":"7ab050e7-f7f7-4a4a-ab49-b2601b269b48","Type":"ContainerDied","Data":"ee90dea62199fe22bb435b923f62eb94b35ac85058152fc088a3cfbcfbe6f805"} Mar 20 08:48:01 crc kubenswrapper[4903]: I0320 08:48:01.737534 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566608-982dt" event={"ID":"403ddea3-182d-4078-ae29-8bf03ce54cb5","Type":"ContainerStarted","Data":"ad023f5968cf9328cd2aa1f693702748a6de0da6d8eba237227da00fa0206f6e"} Mar 20 08:48:01 crc kubenswrapper[4903]: I0320 08:48:01.785289 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 20 08:48:01 crc kubenswrapper[4903]: I0320 08:48:01.785394 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 20 08:48:02 crc kubenswrapper[4903]: I0320 08:48:02.170075 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 20 08:48:02 crc kubenswrapper[4903]: I0320 08:48:02.211870 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 20 08:48:02 crc kubenswrapper[4903]: I0320 08:48:02.236269 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-757b4f8459-lrmtk" Mar 20 08:48:02 crc kubenswrapper[4903]: I0320 08:48:02.341661 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-9g5kv"] Mar 20 08:48:02 crc kubenswrapper[4903]: I0320 08:48:02.345675 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5c9776ccc5-9g5kv" podUID="6f94af42-d7a9-437c-a74b-5d63fcd63a50" containerName="dnsmasq-dns" containerID="cri-o://4a1e2756c85ed1d55a1644e0e4b58a25a84cb2a386791c2b75b8030cee67d6ac" gracePeriod=10 Mar 20 08:48:02 crc kubenswrapper[4903]: I0320 08:48:02.754344 4903 generic.go:334] "Generic (PLEG): container finished" podID="6f94af42-d7a9-437c-a74b-5d63fcd63a50" containerID="4a1e2756c85ed1d55a1644e0e4b58a25a84cb2a386791c2b75b8030cee67d6ac" exitCode=0 Mar 20 08:48:02 crc kubenswrapper[4903]: I0320 08:48:02.754431 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-9g5kv" 
event={"ID":"6f94af42-d7a9-437c-a74b-5d63fcd63a50","Type":"ContainerDied","Data":"4a1e2756c85ed1d55a1644e0e4b58a25a84cb2a386791c2b75b8030cee67d6ac"} Mar 20 08:48:02 crc kubenswrapper[4903]: I0320 08:48:02.762350 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566608-982dt" event={"ID":"403ddea3-182d-4078-ae29-8bf03ce54cb5","Type":"ContainerStarted","Data":"b3af3565bac51cdf7cfe196ddf488821fbf4fd69b2581ca4ee2090864d020c97"} Mar 20 08:48:02 crc kubenswrapper[4903]: I0320 08:48:02.784620 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29566608-982dt" podStartSLOduration=1.528473547 podStartE2EDuration="2.78459421s" podCreationTimestamp="2026-03-20 08:48:00 +0000 UTC" firstStartedPulling="2026-03-20 08:48:00.990877637 +0000 UTC m=+1506.207777962" lastFinishedPulling="2026-03-20 08:48:02.24699829 +0000 UTC m=+1507.463898625" observedRunningTime="2026-03-20 08:48:02.783924804 +0000 UTC m=+1508.000825119" watchObservedRunningTime="2026-03-20 08:48:02.78459421 +0000 UTC m=+1508.001494525" Mar 20 08:48:02 crc kubenswrapper[4903]: I0320 08:48:02.827755 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 20 08:48:02 crc kubenswrapper[4903]: I0320 08:48:02.868174 4903 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="1b5c26ba-26fd-4ad8-bd73-de0c5faf7a82" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.194:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 20 08:48:02 crc kubenswrapper[4903]: I0320 08:48:02.868176 4903 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="1b5c26ba-26fd-4ad8-bd73-de0c5faf7a82" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.194:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 20 08:48:02 crc kubenswrapper[4903]: I0320 08:48:02.970935 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-9g5kv" Mar 20 08:48:03 crc kubenswrapper[4903]: I0320 08:48:03.123608 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6f94af42-d7a9-437c-a74b-5d63fcd63a50-ovsdbserver-nb\") pod \"6f94af42-d7a9-437c-a74b-5d63fcd63a50\" (UID: \"6f94af42-d7a9-437c-a74b-5d63fcd63a50\") " Mar 20 08:48:03 crc kubenswrapper[4903]: I0320 08:48:03.123939 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-49828\" (UniqueName: \"kubernetes.io/projected/6f94af42-d7a9-437c-a74b-5d63fcd63a50-kube-api-access-49828\") pod \"6f94af42-d7a9-437c-a74b-5d63fcd63a50\" (UID: \"6f94af42-d7a9-437c-a74b-5d63fcd63a50\") " Mar 20 08:48:03 crc kubenswrapper[4903]: I0320 08:48:03.123962 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6f94af42-d7a9-437c-a74b-5d63fcd63a50-dns-swift-storage-0\") pod \"6f94af42-d7a9-437c-a74b-5d63fcd63a50\" (UID: \"6f94af42-d7a9-437c-a74b-5d63fcd63a50\") " Mar 20 08:48:03 crc kubenswrapper[4903]: I0320 08:48:03.124007 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f94af42-d7a9-437c-a74b-5d63fcd63a50-config\") pod \"6f94af42-d7a9-437c-a74b-5d63fcd63a50\" (UID: \"6f94af42-d7a9-437c-a74b-5d63fcd63a50\") " Mar 20 08:48:03 crc kubenswrapper[4903]: I0320 08:48:03.124105 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6f94af42-d7a9-437c-a74b-5d63fcd63a50-dns-svc\") pod \"6f94af42-d7a9-437c-a74b-5d63fcd63a50\" (UID: \"6f94af42-d7a9-437c-a74b-5d63fcd63a50\") " Mar 20 08:48:03 crc kubenswrapper[4903]: I0320 08:48:03.124133 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6f94af42-d7a9-437c-a74b-5d63fcd63a50-ovsdbserver-sb\") pod \"6f94af42-d7a9-437c-a74b-5d63fcd63a50\" (UID: \"6f94af42-d7a9-437c-a74b-5d63fcd63a50\") " Mar 20 08:48:03 crc kubenswrapper[4903]: I0320 08:48:03.138022 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-9qstg"] Mar 20 08:48:03 crc kubenswrapper[4903]: E0320 08:48:03.138464 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f94af42-d7a9-437c-a74b-5d63fcd63a50" containerName="init" Mar 20 08:48:03 crc kubenswrapper[4903]: I0320 08:48:03.138477 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f94af42-d7a9-437c-a74b-5d63fcd63a50" containerName="init" Mar 20 08:48:03 crc kubenswrapper[4903]: E0320 08:48:03.138501 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f94af42-d7a9-437c-a74b-5d63fcd63a50" containerName="dnsmasq-dns" Mar 20 08:48:03 crc kubenswrapper[4903]: I0320 08:48:03.138507 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f94af42-d7a9-437c-a74b-5d63fcd63a50" containerName="dnsmasq-dns" Mar 20 08:48:03 crc kubenswrapper[4903]: I0320 08:48:03.138678 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f94af42-d7a9-437c-a74b-5d63fcd63a50" containerName="dnsmasq-dns" Mar 20 08:48:03 crc kubenswrapper[4903]: I0320 08:48:03.140139 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9qstg" Mar 20 08:48:03 crc kubenswrapper[4903]: I0320 08:48:03.143026 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f94af42-d7a9-437c-a74b-5d63fcd63a50-kube-api-access-49828" (OuterVolumeSpecName: "kube-api-access-49828") pod "6f94af42-d7a9-437c-a74b-5d63fcd63a50" (UID: "6f94af42-d7a9-437c-a74b-5d63fcd63a50"). InnerVolumeSpecName "kube-api-access-49828". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:48:03 crc kubenswrapper[4903]: I0320 08:48:03.187954 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9qstg"] Mar 20 08:48:03 crc kubenswrapper[4903]: I0320 08:48:03.227248 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-49828\" (UniqueName: \"kubernetes.io/projected/6f94af42-d7a9-437c-a74b-5d63fcd63a50-kube-api-access-49828\") on node \"crc\" DevicePath \"\"" Mar 20 08:48:03 crc kubenswrapper[4903]: I0320 08:48:03.319273 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-4bdn7" Mar 20 08:48:03 crc kubenswrapper[4903]: I0320 08:48:03.328614 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d574q\" (UniqueName: \"kubernetes.io/projected/1abf248d-f62b-4f0c-974c-8d724250e196-kube-api-access-d574q\") pod \"redhat-marketplace-9qstg\" (UID: \"1abf248d-f62b-4f0c-974c-8d724250e196\") " pod="openshift-marketplace/redhat-marketplace-9qstg" Mar 20 08:48:03 crc kubenswrapper[4903]: I0320 08:48:03.328797 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1abf248d-f62b-4f0c-974c-8d724250e196-catalog-content\") pod \"redhat-marketplace-9qstg\" (UID: \"1abf248d-f62b-4f0c-974c-8d724250e196\") " pod="openshift-marketplace/redhat-marketplace-9qstg" Mar 20 08:48:03 crc kubenswrapper[4903]: I0320 08:48:03.328827 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1abf248d-f62b-4f0c-974c-8d724250e196-utilities\") pod \"redhat-marketplace-9qstg\" (UID: \"1abf248d-f62b-4f0c-974c-8d724250e196\") " pod="openshift-marketplace/redhat-marketplace-9qstg" Mar 20 08:48:03 crc kubenswrapper[4903]: I0320 08:48:03.414913 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6f94af42-d7a9-437c-a74b-5d63fcd63a50-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "6f94af42-d7a9-437c-a74b-5d63fcd63a50" (UID: "6f94af42-d7a9-437c-a74b-5d63fcd63a50"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:48:03 crc kubenswrapper[4903]: I0320 08:48:03.423995 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6f94af42-d7a9-437c-a74b-5d63fcd63a50-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "6f94af42-d7a9-437c-a74b-5d63fcd63a50" (UID: "6f94af42-d7a9-437c-a74b-5d63fcd63a50"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:48:03 crc kubenswrapper[4903]: I0320 08:48:03.429880 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ntkrh\" (UniqueName: \"kubernetes.io/projected/bfaa735a-cdf6-40c4-ab7f-42605e13127c-kube-api-access-ntkrh\") pod \"bfaa735a-cdf6-40c4-ab7f-42605e13127c\" (UID: \"bfaa735a-cdf6-40c4-ab7f-42605e13127c\") " Mar 20 08:48:03 crc kubenswrapper[4903]: I0320 08:48:03.430064 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bfaa735a-cdf6-40c4-ab7f-42605e13127c-config-data\") pod \"bfaa735a-cdf6-40c4-ab7f-42605e13127c\" (UID: \"bfaa735a-cdf6-40c4-ab7f-42605e13127c\") " Mar 20 08:48:03 crc kubenswrapper[4903]: I0320 08:48:03.430156 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bfaa735a-cdf6-40c4-ab7f-42605e13127c-scripts\") pod \"bfaa735a-cdf6-40c4-ab7f-42605e13127c\" (UID: \"bfaa735a-cdf6-40c4-ab7f-42605e13127c\") " Mar 20 08:48:03 crc kubenswrapper[4903]: I0320 08:48:03.430236 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bfaa735a-cdf6-40c4-ab7f-42605e13127c-combined-ca-bundle\") pod \"bfaa735a-cdf6-40c4-ab7f-42605e13127c\" (UID: \"bfaa735a-cdf6-40c4-ab7f-42605e13127c\") " Mar 20 08:48:03 crc kubenswrapper[4903]: I0320 08:48:03.430576 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1abf248d-f62b-4f0c-974c-8d724250e196-utilities\") pod \"redhat-marketplace-9qstg\" (UID: \"1abf248d-f62b-4f0c-974c-8d724250e196\") " pod="openshift-marketplace/redhat-marketplace-9qstg" Mar 20 08:48:03 crc kubenswrapper[4903]: I0320 08:48:03.430628 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d574q\" (UniqueName: \"kubernetes.io/projected/1abf248d-f62b-4f0c-974c-8d724250e196-kube-api-access-d574q\") pod \"redhat-marketplace-9qstg\" (UID: \"1abf248d-f62b-4f0c-974c-8d724250e196\") " pod="openshift-marketplace/redhat-marketplace-9qstg" Mar 20 08:48:03 crc kubenswrapper[4903]: I0320 08:48:03.431350 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1abf248d-f62b-4f0c-974c-8d724250e196-catalog-content\") pod \"redhat-marketplace-9qstg\" (UID: \"1abf248d-f62b-4f0c-974c-8d724250e196\") " pod="openshift-marketplace/redhat-marketplace-9qstg" Mar 20 08:48:03 crc kubenswrapper[4903]: I0320 08:48:03.431424 4903 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6f94af42-d7a9-437c-a74b-5d63fcd63a50-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 20 08:48:03 crc kubenswrapper[4903]: I0320 08:48:03.431436 4903 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6f94af42-d7a9-437c-a74b-5d63fcd63a50-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 20 08:48:03 crc kubenswrapper[4903]: I0320 08:48:03.431875 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1abf248d-f62b-4f0c-974c-8d724250e196-catalog-content\") pod \"redhat-marketplace-9qstg\" (UID: \"1abf248d-f62b-4f0c-974c-8d724250e196\") " 
pod="openshift-marketplace/redhat-marketplace-9qstg" Mar 20 08:48:03 crc kubenswrapper[4903]: I0320 08:48:03.450283 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bfaa735a-cdf6-40c4-ab7f-42605e13127c-kube-api-access-ntkrh" (OuterVolumeSpecName: "kube-api-access-ntkrh") pod "bfaa735a-cdf6-40c4-ab7f-42605e13127c" (UID: "bfaa735a-cdf6-40c4-ab7f-42605e13127c"). InnerVolumeSpecName "kube-api-access-ntkrh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:48:03 crc kubenswrapper[4903]: I0320 08:48:03.450398 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1abf248d-f62b-4f0c-974c-8d724250e196-utilities\") pod \"redhat-marketplace-9qstg\" (UID: \"1abf248d-f62b-4f0c-974c-8d724250e196\") " pod="openshift-marketplace/redhat-marketplace-9qstg" Mar 20 08:48:03 crc kubenswrapper[4903]: I0320 08:48:03.455661 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6f94af42-d7a9-437c-a74b-5d63fcd63a50-config" (OuterVolumeSpecName: "config") pod "6f94af42-d7a9-437c-a74b-5d63fcd63a50" (UID: "6f94af42-d7a9-437c-a74b-5d63fcd63a50"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:48:03 crc kubenswrapper[4903]: I0320 08:48:03.461881 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bfaa735a-cdf6-40c4-ab7f-42605e13127c-scripts" (OuterVolumeSpecName: "scripts") pod "bfaa735a-cdf6-40c4-ab7f-42605e13127c" (UID: "bfaa735a-cdf6-40c4-ab7f-42605e13127c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:48:03 crc kubenswrapper[4903]: I0320 08:48:03.479154 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d574q\" (UniqueName: \"kubernetes.io/projected/1abf248d-f62b-4f0c-974c-8d724250e196-kube-api-access-d574q\") pod \"redhat-marketplace-9qstg\" (UID: \"1abf248d-f62b-4f0c-974c-8d724250e196\") " pod="openshift-marketplace/redhat-marketplace-9qstg" Mar 20 08:48:03 crc kubenswrapper[4903]: I0320 08:48:03.497600 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bfaa735a-cdf6-40c4-ab7f-42605e13127c-config-data" (OuterVolumeSpecName: "config-data") pod "bfaa735a-cdf6-40c4-ab7f-42605e13127c" (UID: "bfaa735a-cdf6-40c4-ab7f-42605e13127c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:48:03 crc kubenswrapper[4903]: I0320 08:48:03.499881 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6f94af42-d7a9-437c-a74b-5d63fcd63a50-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "6f94af42-d7a9-437c-a74b-5d63fcd63a50" (UID: "6f94af42-d7a9-437c-a74b-5d63fcd63a50"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:48:03 crc kubenswrapper[4903]: I0320 08:48:03.500517 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6f94af42-d7a9-437c-a74b-5d63fcd63a50-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6f94af42-d7a9-437c-a74b-5d63fcd63a50" (UID: "6f94af42-d7a9-437c-a74b-5d63fcd63a50"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:48:03 crc kubenswrapper[4903]: I0320 08:48:03.513753 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9qstg" Mar 20 08:48:03 crc kubenswrapper[4903]: I0320 08:48:03.516838 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bfaa735a-cdf6-40c4-ab7f-42605e13127c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bfaa735a-cdf6-40c4-ab7f-42605e13127c" (UID: "bfaa735a-cdf6-40c4-ab7f-42605e13127c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:48:03 crc kubenswrapper[4903]: I0320 08:48:03.534361 4903 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6f94af42-d7a9-437c-a74b-5d63fcd63a50-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 08:48:03 crc kubenswrapper[4903]: I0320 08:48:03.534390 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ntkrh\" (UniqueName: \"kubernetes.io/projected/bfaa735a-cdf6-40c4-ab7f-42605e13127c-kube-api-access-ntkrh\") on node \"crc\" DevicePath \"\"" Mar 20 08:48:03 crc kubenswrapper[4903]: I0320 08:48:03.534401 4903 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bfaa735a-cdf6-40c4-ab7f-42605e13127c-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 08:48:03 crc kubenswrapper[4903]: I0320 08:48:03.534412 4903 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bfaa735a-cdf6-40c4-ab7f-42605e13127c-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 08:48:03 crc kubenswrapper[4903]: I0320 08:48:03.534420 4903 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bfaa735a-cdf6-40c4-ab7f-42605e13127c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 08:48:03 crc kubenswrapper[4903]: I0320 08:48:03.534431 4903 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6f94af42-d7a9-437c-a74b-5d63fcd63a50-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 20 08:48:03 crc kubenswrapper[4903]: I0320 08:48:03.534441 4903 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f94af42-d7a9-437c-a74b-5d63fcd63a50-config\") on node \"crc\" DevicePath \"\"" Mar 20 08:48:03 crc kubenswrapper[4903]: I0320 08:48:03.602309 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-72j48" Mar 20 08:48:03 crc kubenswrapper[4903]: I0320 08:48:03.737380 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7ab050e7-f7f7-4a4a-ab49-b2601b269b48-scripts\") pod \"7ab050e7-f7f7-4a4a-ab49-b2601b269b48\" (UID: \"7ab050e7-f7f7-4a4a-ab49-b2601b269b48\") " Mar 20 08:48:03 crc kubenswrapper[4903]: I0320 08:48:03.737928 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ab050e7-f7f7-4a4a-ab49-b2601b269b48-config-data\") pod \"7ab050e7-f7f7-4a4a-ab49-b2601b269b48\" (UID: \"7ab050e7-f7f7-4a4a-ab49-b2601b269b48\") " Mar 20 08:48:03 crc kubenswrapper[4903]: I0320 08:48:03.737978 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ab050e7-f7f7-4a4a-ab49-b2601b269b48-combined-ca-bundle\") pod \"7ab050e7-f7f7-4a4a-ab49-b2601b269b48\" (UID: \"7ab050e7-f7f7-4a4a-ab49-b2601b269b48\") " Mar 20 08:48:03 crc kubenswrapper[4903]: I0320 08:48:03.738282 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g954z\" (UniqueName: \"kubernetes.io/projected/7ab050e7-f7f7-4a4a-ab49-b2601b269b48-kube-api-access-g954z\") pod \"7ab050e7-f7f7-4a4a-ab49-b2601b269b48\" (UID: \"7ab050e7-f7f7-4a4a-ab49-b2601b269b48\") " Mar 20 08:48:03 crc kubenswrapper[4903]: I0320 08:48:03.749021 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ab050e7-f7f7-4a4a-ab49-b2601b269b48-scripts" (OuterVolumeSpecName: "scripts") pod "7ab050e7-f7f7-4a4a-ab49-b2601b269b48" (UID: "7ab050e7-f7f7-4a4a-ab49-b2601b269b48"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:48:03 crc kubenswrapper[4903]: I0320 08:48:03.752332 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ab050e7-f7f7-4a4a-ab49-b2601b269b48-kube-api-access-g954z" (OuterVolumeSpecName: "kube-api-access-g954z") pod "7ab050e7-f7f7-4a4a-ab49-b2601b269b48" (UID: "7ab050e7-f7f7-4a4a-ab49-b2601b269b48"). InnerVolumeSpecName "kube-api-access-g954z". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:48:03 crc kubenswrapper[4903]: I0320 08:48:03.797673 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ab050e7-f7f7-4a4a-ab49-b2601b269b48-config-data" (OuterVolumeSpecName: "config-data") pod "7ab050e7-f7f7-4a4a-ab49-b2601b269b48" (UID: "7ab050e7-f7f7-4a4a-ab49-b2601b269b48"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:48:03 crc kubenswrapper[4903]: I0320 08:48:03.808010 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-4bdn7" event={"ID":"bfaa735a-cdf6-40c4-ab7f-42605e13127c","Type":"ContainerDied","Data":"e83a1c36cc7839da0c98e728bad997852477c03faf1d320fb46c91348c9afad7"} Mar 20 08:48:03 crc kubenswrapper[4903]: I0320 08:48:03.808082 4903 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e83a1c36cc7839da0c98e728bad997852477c03faf1d320fb46c91348c9afad7" Mar 20 08:48:03 crc kubenswrapper[4903]: I0320 08:48:03.808189 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-4bdn7" Mar 20 08:48:03 crc kubenswrapper[4903]: I0320 08:48:03.815374 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ab050e7-f7f7-4a4a-ab49-b2601b269b48-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7ab050e7-f7f7-4a4a-ab49-b2601b269b48" (UID: "7ab050e7-f7f7-4a4a-ab49-b2601b269b48"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:48:03 crc kubenswrapper[4903]: I0320 08:48:03.817328 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-9g5kv" event={"ID":"6f94af42-d7a9-437c-a74b-5d63fcd63a50","Type":"ContainerDied","Data":"b4f7c2f895a5fe2ca3dbcc01938495afa9e17dc1301bad958c681edcc5c8a140"} Mar 20 08:48:03 crc kubenswrapper[4903]: I0320 08:48:03.817406 4903 scope.go:117] "RemoveContainer" containerID="4a1e2756c85ed1d55a1644e0e4b58a25a84cb2a386791c2b75b8030cee67d6ac" Mar 20 08:48:03 crc kubenswrapper[4903]: I0320 08:48:03.817670 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-9g5kv" Mar 20 08:48:03 crc kubenswrapper[4903]: I0320 08:48:03.823557 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-72j48" event={"ID":"7ab050e7-f7f7-4a4a-ab49-b2601b269b48","Type":"ContainerDied","Data":"1a358934a556be5fc0711f183bf93cd8b9a1de2469d08dcda0239d0afbf63abd"} Mar 20 08:48:03 crc kubenswrapper[4903]: I0320 08:48:03.823586 4903 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1a358934a556be5fc0711f183bf93cd8b9a1de2469d08dcda0239d0afbf63abd" Mar 20 08:48:03 crc kubenswrapper[4903]: I0320 08:48:03.823670 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-72j48" Mar 20 08:48:03 crc kubenswrapper[4903]: I0320 08:48:03.841272 4903 generic.go:334] "Generic (PLEG): container finished" podID="403ddea3-182d-4078-ae29-8bf03ce54cb5" containerID="b3af3565bac51cdf7cfe196ddf488821fbf4fd69b2581ca4ee2090864d020c97" exitCode=0 Mar 20 08:48:03 crc kubenswrapper[4903]: I0320 08:48:03.844842 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566608-982dt" event={"ID":"403ddea3-182d-4078-ae29-8bf03ce54cb5","Type":"ContainerDied","Data":"b3af3565bac51cdf7cfe196ddf488821fbf4fd69b2581ca4ee2090864d020c97"} Mar 20 08:48:03 crc kubenswrapper[4903]: I0320 08:48:03.846881 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g954z\" (UniqueName: \"kubernetes.io/projected/7ab050e7-f7f7-4a4a-ab49-b2601b269b48-kube-api-access-g954z\") on node \"crc\" DevicePath \"\"" Mar 20 08:48:03 crc kubenswrapper[4903]: I0320 08:48:03.846910 4903 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7ab050e7-f7f7-4a4a-ab49-b2601b269b48-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 08:48:03 crc kubenswrapper[4903]: I0320 08:48:03.846920 4903 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ab050e7-f7f7-4a4a-ab49-b2601b269b48-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 08:48:03 crc kubenswrapper[4903]: I0320 08:48:03.846930 4903 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ab050e7-f7f7-4a4a-ab49-b2601b269b48-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 08:48:03 crc kubenswrapper[4903]: I0320 08:48:03.893347 4903 scope.go:117] "RemoveContainer" containerID="097edbddde97da45f7e0b48e947c34e3ca361dc532c950791e2d51a9a9cd496d" Mar 20 08:48:03 crc kubenswrapper[4903]: I0320 08:48:03.903192 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 20 08:48:03 crc kubenswrapper[4903]: E0320 08:48:03.911715 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfaa735a-cdf6-40c4-ab7f-42605e13127c" containerName="nova-cell1-conductor-db-sync" Mar 20 08:48:03 crc kubenswrapper[4903]: I0320 08:48:03.911749 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfaa735a-cdf6-40c4-ab7f-42605e13127c" containerName="nova-cell1-conductor-db-sync" Mar 20 08:48:03 crc kubenswrapper[4903]: E0320 08:48:03.911762 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ab050e7-f7f7-4a4a-ab49-b2601b269b48" containerName="nova-manage" Mar 20 08:48:03 crc kubenswrapper[4903]: I0320 08:48:03.911768 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ab050e7-f7f7-4a4a-ab49-b2601b269b48" containerName="nova-manage" Mar 20 08:48:03 crc kubenswrapper[4903]: I0320 08:48:03.929188 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ab050e7-f7f7-4a4a-ab49-b2601b269b48" containerName="nova-manage" Mar 20 08:48:03 crc kubenswrapper[4903]: I0320 08:48:03.929258 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="bfaa735a-cdf6-40c4-ab7f-42605e13127c" containerName="nova-cell1-conductor-db-sync" Mar 20 08:48:03 crc kubenswrapper[4903]: I0320 08:48:03.930786 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 20 08:48:03 crc kubenswrapper[4903]: I0320 08:48:03.930908 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 20 08:48:03 crc kubenswrapper[4903]: I0320 08:48:03.938260 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Mar 20 08:48:03 crc kubenswrapper[4903]: I0320 08:48:03.949195 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-9g5kv"] Mar 20 08:48:03 crc kubenswrapper[4903]: I0320 08:48:03.994259 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-9g5kv"] Mar 20 08:48:04 crc kubenswrapper[4903]: I0320 08:48:04.020210 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 20 08:48:04 crc kubenswrapper[4903]: I0320 08:48:04.020472 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="1b5c26ba-26fd-4ad8-bd73-de0c5faf7a82" containerName="nova-api-log" containerID="cri-o://67d0b1a192bf9b0cb84d48b1f38b29ccf08db31018b839819bf5bd66728a6189" gracePeriod=30 Mar 20 08:48:04 crc kubenswrapper[4903]: I0320 08:48:04.020997 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="1b5c26ba-26fd-4ad8-bd73-de0c5faf7a82" containerName="nova-api-api" containerID="cri-o://74f32b262e8212c4e2f8e83ecb47fdfcc1ab663e70ea9fdf0514f8ae55bdd4d5" gracePeriod=30 Mar 20 08:48:04 crc kubenswrapper[4903]: I0320 08:48:04.056894 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9l2hb\" (UniqueName: \"kubernetes.io/projected/5e072c5e-0f44-4d24-bccc-b14bf61fa192-kube-api-access-9l2hb\") pod \"nova-cell1-conductor-0\" (UID: \"5e072c5e-0f44-4d24-bccc-b14bf61fa192\") " pod="openstack/nova-cell1-conductor-0" Mar 20 08:48:04 crc kubenswrapper[4903]: I0320 08:48:04.056950 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e072c5e-0f44-4d24-bccc-b14bf61fa192-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"5e072c5e-0f44-4d24-bccc-b14bf61fa192\") " pod="openstack/nova-cell1-conductor-0" Mar 20 08:48:04 crc kubenswrapper[4903]: I0320 08:48:04.057397 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e072c5e-0f44-4d24-bccc-b14bf61fa192-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"5e072c5e-0f44-4d24-bccc-b14bf61fa192\") " pod="openstack/nova-cell1-conductor-0" Mar 20 08:48:04 crc kubenswrapper[4903]: I0320 08:48:04.072524 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 08:48:04 crc kubenswrapper[4903]: I0320 08:48:04.105720 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9qstg"] Mar 20 08:48:04 crc kubenswrapper[4903]: W0320 08:48:04.112104 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1abf248d_f62b_4f0c_974c_8d724250e196.slice/crio-e65d481f3847e5823ea83434a5bd490d07f17c2d35999e8a352035fb490e6dc0 WatchSource:0}: Error finding container e65d481f3847e5823ea83434a5bd490d07f17c2d35999e8a352035fb490e6dc0: Status 404 returned error can't find the container with id e65d481f3847e5823ea83434a5bd490d07f17c2d35999e8a352035fb490e6dc0 Mar 20 08:48:04 crc kubenswrapper[4903]: I0320 08:48:04.159617 4903 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-9l2hb\" (UniqueName: \"kubernetes.io/projected/5e072c5e-0f44-4d24-bccc-b14bf61fa192-kube-api-access-9l2hb\") pod \"nova-cell1-conductor-0\" (UID: \"5e072c5e-0f44-4d24-bccc-b14bf61fa192\") " pod="openstack/nova-cell1-conductor-0" Mar 20 08:48:04 crc kubenswrapper[4903]: I0320 08:48:04.159680 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e072c5e-0f44-4d24-bccc-b14bf61fa192-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"5e072c5e-0f44-4d24-bccc-b14bf61fa192\") " pod="openstack/nova-cell1-conductor-0" Mar 20 08:48:04 crc kubenswrapper[4903]: I0320 08:48:04.159786 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e072c5e-0f44-4d24-bccc-b14bf61fa192-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"5e072c5e-0f44-4d24-bccc-b14bf61fa192\") " pod="openstack/nova-cell1-conductor-0" Mar 20 08:48:04 crc kubenswrapper[4903]: I0320 08:48:04.168099 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e072c5e-0f44-4d24-bccc-b14bf61fa192-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"5e072c5e-0f44-4d24-bccc-b14bf61fa192\") " pod="openstack/nova-cell1-conductor-0" Mar 20 08:48:04 crc kubenswrapper[4903]: I0320 08:48:04.170558 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e072c5e-0f44-4d24-bccc-b14bf61fa192-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"5e072c5e-0f44-4d24-bccc-b14bf61fa192\") " pod="openstack/nova-cell1-conductor-0" Mar 20 08:48:04 crc kubenswrapper[4903]: I0320 08:48:04.186071 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9l2hb\" (UniqueName: \"kubernetes.io/projected/5e072c5e-0f44-4d24-bccc-b14bf61fa192-kube-api-access-9l2hb\") pod \"nova-cell1-conductor-0\" (UID: \"5e072c5e-0f44-4d24-bccc-b14bf61fa192\") " pod="openstack/nova-cell1-conductor-0" Mar 20 08:48:04 crc kubenswrapper[4903]: I0320 08:48:04.264237 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 20 08:48:04 crc kubenswrapper[4903]: W0320 08:48:04.777202 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5e072c5e_0f44_4d24_bccc_b14bf61fa192.slice/crio-6a80a96d7215a6220aaaf7ae272d2f3f0136c00db333d81e2c40d6243b21fbee WatchSource:0}: Error finding container 6a80a96d7215a6220aaaf7ae272d2f3f0136c00db333d81e2c40d6243b21fbee: Status 404 returned error can't find the container with id 6a80a96d7215a6220aaaf7ae272d2f3f0136c00db333d81e2c40d6243b21fbee Mar 20 08:48:04 crc kubenswrapper[4903]: I0320 08:48:04.794342 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 20 08:48:04 crc kubenswrapper[4903]: I0320 08:48:04.852536 4903 generic.go:334] "Generic (PLEG): container finished" podID="1abf248d-f62b-4f0c-974c-8d724250e196" containerID="f191d10b820d123124cb6bae2c0139a8505e7671cebb21299855e22baad4c371" exitCode=0 Mar 20 08:48:04 crc kubenswrapper[4903]: I0320 08:48:04.852594 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9qstg" event={"ID":"1abf248d-f62b-4f0c-974c-8d724250e196","Type":"ContainerDied","Data":"f191d10b820d123124cb6bae2c0139a8505e7671cebb21299855e22baad4c371"} Mar 20 08:48:04 crc kubenswrapper[4903]: I0320 08:48:04.852620 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9qstg" event={"ID":"1abf248d-f62b-4f0c-974c-8d724250e196","Type":"ContainerStarted","Data":"e65d481f3847e5823ea83434a5bd490d07f17c2d35999e8a352035fb490e6dc0"} Mar 20 08:48:04 crc kubenswrapper[4903]: I0320 08:48:04.871213 4903 generic.go:334] "Generic (PLEG): container finished" podID="1b5c26ba-26fd-4ad8-bd73-de0c5faf7a82" containerID="67d0b1a192bf9b0cb84d48b1f38b29ccf08db31018b839819bf5bd66728a6189" exitCode=143 Mar 20 08:48:04 crc kubenswrapper[4903]: I0320 08:48:04.871335 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1b5c26ba-26fd-4ad8-bd73-de0c5faf7a82","Type":"ContainerDied","Data":"67d0b1a192bf9b0cb84d48b1f38b29ccf08db31018b839819bf5bd66728a6189"} Mar 20 08:48:04 crc kubenswrapper[4903]: I0320 08:48:04.887067 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"5e072c5e-0f44-4d24-bccc-b14bf61fa192","Type":"ContainerStarted","Data":"6a80a96d7215a6220aaaf7ae272d2f3f0136c00db333d81e2c40d6243b21fbee"} Mar 20 08:48:04 crc kubenswrapper[4903]: I0320 08:48:04.887133 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="49e6a095-3c2f-424c-9bd5-3b59e58550ea" containerName="nova-scheduler-scheduler" containerID="cri-o://b26c794e72d66a4bd3e16c59e65cb95bd12e7b96ab3cb5149c5daa4ecf5aa5a6" gracePeriod=30 Mar 20 08:48:05 crc kubenswrapper[4903]: I0320 08:48:05.260157 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566608-982dt" Mar 20 08:48:05 crc kubenswrapper[4903]: I0320 08:48:05.389902 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5cxpg\" (UniqueName: \"kubernetes.io/projected/403ddea3-182d-4078-ae29-8bf03ce54cb5-kube-api-access-5cxpg\") pod \"403ddea3-182d-4078-ae29-8bf03ce54cb5\" (UID: \"403ddea3-182d-4078-ae29-8bf03ce54cb5\") " Mar 20 08:48:05 crc kubenswrapper[4903]: I0320 08:48:05.401050 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/403ddea3-182d-4078-ae29-8bf03ce54cb5-kube-api-access-5cxpg" (OuterVolumeSpecName: "kube-api-access-5cxpg") pod "403ddea3-182d-4078-ae29-8bf03ce54cb5" (UID: "403ddea3-182d-4078-ae29-8bf03ce54cb5"). InnerVolumeSpecName "kube-api-access-5cxpg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:48:05 crc kubenswrapper[4903]: I0320 08:48:05.495339 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5cxpg\" (UniqueName: \"kubernetes.io/projected/403ddea3-182d-4078-ae29-8bf03ce54cb5-kube-api-access-5cxpg\") on node \"crc\" DevicePath \"\"" Mar 20 08:48:05 crc kubenswrapper[4903]: I0320 08:48:05.501393 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6f94af42-d7a9-437c-a74b-5d63fcd63a50" path="/var/lib/kubelet/pods/6f94af42-d7a9-437c-a74b-5d63fcd63a50/volumes" Mar 20 08:48:05 crc kubenswrapper[4903]: I0320 08:48:05.890648 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566602-66tvt"] Mar 20 08:48:05 crc kubenswrapper[4903]: I0320 08:48:05.905369 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566608-982dt" event={"ID":"403ddea3-182d-4078-ae29-8bf03ce54cb5","Type":"ContainerDied","Data":"ad023f5968cf9328cd2aa1f693702748a6de0da6d8eba237227da00fa0206f6e"} Mar 20 08:48:05 crc kubenswrapper[4903]: I0320 08:48:05.905436 4903 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ad023f5968cf9328cd2aa1f693702748a6de0da6d8eba237227da00fa0206f6e" Mar 20 08:48:05 crc kubenswrapper[4903]: I0320 08:48:05.905391 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566608-982dt" Mar 20 08:48:05 crc kubenswrapper[4903]: I0320 08:48:05.909482 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"5e072c5e-0f44-4d24-bccc-b14bf61fa192","Type":"ContainerStarted","Data":"9148a8a0458280fb77a8371f10cf7fabff0ae90ccd681be1905e0e67a3249152"} Mar 20 08:48:05 crc kubenswrapper[4903]: I0320 08:48:05.909588 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Mar 20 08:48:05 crc kubenswrapper[4903]: I0320 08:48:05.914223 4903 generic.go:334] "Generic (PLEG): container finished" podID="1abf248d-f62b-4f0c-974c-8d724250e196" containerID="34c6f37b2632d6c4216a7bd9baf41728c9cde6351df93ff34efb424ea0d77999" exitCode=0 Mar 20 08:48:05 crc kubenswrapper[4903]: I0320 08:48:05.914278 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9qstg" event={"ID":"1abf248d-f62b-4f0c-974c-8d724250e196","Type":"ContainerDied","Data":"34c6f37b2632d6c4216a7bd9baf41728c9cde6351df93ff34efb424ea0d77999"} Mar 20 08:48:05 crc kubenswrapper[4903]: I0320 08:48:05.922739 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566602-66tvt"] Mar 20 08:48:05 crc kubenswrapper[4903]: I0320 08:48:05.974695 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.974664925 podStartE2EDuration="2.974664925s" podCreationTimestamp="2026-03-20 08:48:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:48:05.932345197 +0000 UTC m=+1511.149245532" watchObservedRunningTime="2026-03-20 08:48:05.974664925 +0000 UTC m=+1511.191565240" Mar 20 08:48:06 crc kubenswrapper[4903]: I0320 08:48:06.927561 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9qstg" event={"ID":"1abf248d-f62b-4f0c-974c-8d724250e196","Type":"ContainerStarted","Data":"96af21da30826328e1be16f242c3e9b74dd4e1f62e910536ca0dcaa58a288b0e"} Mar 20 08:48:06 crc kubenswrapper[4903]: I0320 08:48:06.945866 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-9qstg" podStartSLOduration=2.48171509 podStartE2EDuration="3.945849574s" podCreationTimestamp="2026-03-20 08:48:03 +0000 UTC" firstStartedPulling="2026-03-20 08:48:04.855963049 +0000 UTC m=+1510.072863364" lastFinishedPulling="2026-03-20 08:48:06.320097523 +0000 UTC m=+1511.536997848" observedRunningTime="2026-03-20 08:48:06.942655706 +0000 UTC m=+1512.159556021" watchObservedRunningTime="2026-03-20 08:48:06.945849574 +0000 UTC m=+1512.162749889" Mar 20 08:48:07 crc kubenswrapper[4903]: E0320 08:48:07.182707 4903 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="b26c794e72d66a4bd3e16c59e65cb95bd12e7b96ab3cb5149c5daa4ecf5aa5a6" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 20 08:48:07 crc kubenswrapper[4903]: E0320 08:48:07.185254 4903 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" 
containerID="b26c794e72d66a4bd3e16c59e65cb95bd12e7b96ab3cb5149c5daa4ecf5aa5a6" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 20 08:48:07 crc kubenswrapper[4903]: E0320 08:48:07.187321 4903 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="b26c794e72d66a4bd3e16c59e65cb95bd12e7b96ab3cb5149c5daa4ecf5aa5a6" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 20 08:48:07 crc kubenswrapper[4903]: E0320 08:48:07.187384 4903 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="49e6a095-3c2f-424c-9bd5-3b59e58550ea" containerName="nova-scheduler-scheduler" Mar 20 08:48:07 crc kubenswrapper[4903]: I0320 08:48:07.516254 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="39ab3243-abaa-4705-88ec-4b998774e880" path="/var/lib/kubelet/pods/39ab3243-abaa-4705-88ec-4b998774e880/volumes" Mar 20 08:48:07 crc kubenswrapper[4903]: I0320 08:48:07.941557 4903 generic.go:334] "Generic (PLEG): container finished" podID="49e6a095-3c2f-424c-9bd5-3b59e58550ea" containerID="b26c794e72d66a4bd3e16c59e65cb95bd12e7b96ab3cb5149c5daa4ecf5aa5a6" exitCode=0 Mar 20 08:48:07 crc kubenswrapper[4903]: I0320 08:48:07.942063 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"49e6a095-3c2f-424c-9bd5-3b59e58550ea","Type":"ContainerDied","Data":"b26c794e72d66a4bd3e16c59e65cb95bd12e7b96ab3cb5149c5daa4ecf5aa5a6"} Mar 20 08:48:08 crc kubenswrapper[4903]: I0320 08:48:08.296767 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 20 08:48:08 crc kubenswrapper[4903]: I0320 08:48:08.408937 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49e6a095-3c2f-424c-9bd5-3b59e58550ea-config-data\") pod \"49e6a095-3c2f-424c-9bd5-3b59e58550ea\" (UID: \"49e6a095-3c2f-424c-9bd5-3b59e58550ea\") " Mar 20 08:48:08 crc kubenswrapper[4903]: I0320 08:48:08.409498 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ww58l\" (UniqueName: \"kubernetes.io/projected/49e6a095-3c2f-424c-9bd5-3b59e58550ea-kube-api-access-ww58l\") pod \"49e6a095-3c2f-424c-9bd5-3b59e58550ea\" (UID: \"49e6a095-3c2f-424c-9bd5-3b59e58550ea\") " Mar 20 08:48:08 crc kubenswrapper[4903]: I0320 08:48:08.409618 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49e6a095-3c2f-424c-9bd5-3b59e58550ea-combined-ca-bundle\") pod \"49e6a095-3c2f-424c-9bd5-3b59e58550ea\" (UID: \"49e6a095-3c2f-424c-9bd5-3b59e58550ea\") " Mar 20 08:48:08 crc kubenswrapper[4903]: I0320 08:48:08.416301 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49e6a095-3c2f-424c-9bd5-3b59e58550ea-kube-api-access-ww58l" (OuterVolumeSpecName: "kube-api-access-ww58l") pod "49e6a095-3c2f-424c-9bd5-3b59e58550ea" (UID: "49e6a095-3c2f-424c-9bd5-3b59e58550ea"). InnerVolumeSpecName "kube-api-access-ww58l". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:48:08 crc kubenswrapper[4903]: I0320 08:48:08.446242 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49e6a095-3c2f-424c-9bd5-3b59e58550ea-config-data" (OuterVolumeSpecName: "config-data") pod "49e6a095-3c2f-424c-9bd5-3b59e58550ea" (UID: "49e6a095-3c2f-424c-9bd5-3b59e58550ea"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:48:08 crc kubenswrapper[4903]: I0320 08:48:08.459219 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49e6a095-3c2f-424c-9bd5-3b59e58550ea-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "49e6a095-3c2f-424c-9bd5-3b59e58550ea" (UID: "49e6a095-3c2f-424c-9bd5-3b59e58550ea"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:48:08 crc kubenswrapper[4903]: I0320 08:48:08.512511 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ww58l\" (UniqueName: \"kubernetes.io/projected/49e6a095-3c2f-424c-9bd5-3b59e58550ea-kube-api-access-ww58l\") on node \"crc\" DevicePath \"\"" Mar 20 08:48:08 crc kubenswrapper[4903]: I0320 08:48:08.512572 4903 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49e6a095-3c2f-424c-9bd5-3b59e58550ea-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 08:48:08 crc kubenswrapper[4903]: I0320 08:48:08.512591 4903 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49e6a095-3c2f-424c-9bd5-3b59e58550ea-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 08:48:08 crc kubenswrapper[4903]: I0320 08:48:08.956516 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"49e6a095-3c2f-424c-9bd5-3b59e58550ea","Type":"ContainerDied","Data":"fd9263165ffef2ed52947676f95a8683d4b64bb577b75a55fb92319127e4d603"} Mar 20 08:48:08 crc kubenswrapper[4903]: I0320 08:48:08.956582 4903 scope.go:117] "RemoveContainer" containerID="b26c794e72d66a4bd3e16c59e65cb95bd12e7b96ab3cb5149c5daa4ecf5aa5a6" Mar 20 08:48:08 crc kubenswrapper[4903]: I0320 08:48:08.956618 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 20 08:48:09 crc kubenswrapper[4903]: I0320 08:48:09.018514 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 08:48:09 crc kubenswrapper[4903]: I0320 08:48:09.029603 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 08:48:09 crc kubenswrapper[4903]: I0320 08:48:09.049279 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 08:48:09 crc kubenswrapper[4903]: E0320 08:48:09.049746 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49e6a095-3c2f-424c-9bd5-3b59e58550ea" containerName="nova-scheduler-scheduler" Mar 20 08:48:09 crc kubenswrapper[4903]: I0320 08:48:09.049761 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="49e6a095-3c2f-424c-9bd5-3b59e58550ea" containerName="nova-scheduler-scheduler" Mar 20 08:48:09 crc kubenswrapper[4903]: E0320 08:48:09.049791 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="403ddea3-182d-4078-ae29-8bf03ce54cb5" containerName="oc" Mar 20 08:48:09 crc kubenswrapper[4903]: I0320 08:48:09.049799 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="403ddea3-182d-4078-ae29-8bf03ce54cb5" containerName="oc" Mar 20 08:48:09 crc kubenswrapper[4903]: I0320 08:48:09.050020 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="403ddea3-182d-4078-ae29-8bf03ce54cb5" containerName="oc" Mar 20 08:48:09 crc kubenswrapper[4903]: I0320 08:48:09.050071 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="49e6a095-3c2f-424c-9bd5-3b59e58550ea" containerName="nova-scheduler-scheduler" Mar 20 08:48:09 crc kubenswrapper[4903]: I0320 08:48:09.050816 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 20 08:48:09 crc kubenswrapper[4903]: I0320 08:48:09.059395 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 20 08:48:09 crc kubenswrapper[4903]: I0320 08:48:09.068628 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 08:48:09 crc kubenswrapper[4903]: I0320 08:48:09.225704 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/700d5965-30b6-4fad-b808-f7a4ed433b9b-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"700d5965-30b6-4fad-b808-f7a4ed433b9b\") " pod="openstack/nova-scheduler-0" Mar 20 08:48:09 crc kubenswrapper[4903]: I0320 08:48:09.226582 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/700d5965-30b6-4fad-b808-f7a4ed433b9b-config-data\") pod \"nova-scheduler-0\" (UID: \"700d5965-30b6-4fad-b808-f7a4ed433b9b\") " pod="openstack/nova-scheduler-0" Mar 20 08:48:09 crc kubenswrapper[4903]: I0320 08:48:09.226720 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fnl4s\" (UniqueName: \"kubernetes.io/projected/700d5965-30b6-4fad-b808-f7a4ed433b9b-kube-api-access-fnl4s\") pod \"nova-scheduler-0\" (UID: \"700d5965-30b6-4fad-b808-f7a4ed433b9b\") " pod="openstack/nova-scheduler-0" Mar 20 08:48:09 crc kubenswrapper[4903]: I0320 08:48:09.328517 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fnl4s\" (UniqueName: \"kubernetes.io/projected/700d5965-30b6-4fad-b808-f7a4ed433b9b-kube-api-access-fnl4s\") pod \"nova-scheduler-0\" (UID: \"700d5965-30b6-4fad-b808-f7a4ed433b9b\") " pod="openstack/nova-scheduler-0" Mar 20 08:48:09 crc kubenswrapper[4903]: I0320 08:48:09.328673 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/700d5965-30b6-4fad-b808-f7a4ed433b9b-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"700d5965-30b6-4fad-b808-f7a4ed433b9b\") " pod="openstack/nova-scheduler-0" Mar 20 08:48:09 crc kubenswrapper[4903]: I0320 08:48:09.328720 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/700d5965-30b6-4fad-b808-f7a4ed433b9b-config-data\") pod \"nova-scheduler-0\" (UID: \"700d5965-30b6-4fad-b808-f7a4ed433b9b\") " pod="openstack/nova-scheduler-0" Mar 20 08:48:09 crc kubenswrapper[4903]: I0320 08:48:09.334403 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/700d5965-30b6-4fad-b808-f7a4ed433b9b-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"700d5965-30b6-4fad-b808-f7a4ed433b9b\") " pod="openstack/nova-scheduler-0" Mar 20 08:48:09 crc kubenswrapper[4903]: I0320 08:48:09.334634 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/700d5965-30b6-4fad-b808-f7a4ed433b9b-config-data\") pod \"nova-scheduler-0\" (UID: \"700d5965-30b6-4fad-b808-f7a4ed433b9b\") " pod="openstack/nova-scheduler-0" Mar 20 08:48:09 crc kubenswrapper[4903]: I0320 08:48:09.366837 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fnl4s\" (UniqueName: 
\"kubernetes.io/projected/700d5965-30b6-4fad-b808-f7a4ed433b9b-kube-api-access-fnl4s\") pod \"nova-scheduler-0\" (UID: \"700d5965-30b6-4fad-b808-f7a4ed433b9b\") " pod="openstack/nova-scheduler-0" Mar 20 08:48:09 crc kubenswrapper[4903]: I0320 08:48:09.367985 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 20 08:48:09 crc kubenswrapper[4903]: I0320 08:48:09.513022 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49e6a095-3c2f-424c-9bd5-3b59e58550ea" path="/var/lib/kubelet/pods/49e6a095-3c2f-424c-9bd5-3b59e58550ea/volumes" Mar 20 08:48:09 crc kubenswrapper[4903]: I0320 08:48:09.674523 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 08:48:09 crc kubenswrapper[4903]: W0320 08:48:09.687495 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod700d5965_30b6_4fad_b808_f7a4ed433b9b.slice/crio-7ffee58bae19f421994d8e8ea67b213e6fc2ddc1324a4d6fc04ad49a8ff92238 WatchSource:0}: Error finding container 7ffee58bae19f421994d8e8ea67b213e6fc2ddc1324a4d6fc04ad49a8ff92238: Status 404 returned error can't find the container with id 7ffee58bae19f421994d8e8ea67b213e6fc2ddc1324a4d6fc04ad49a8ff92238 Mar 20 08:48:09 crc kubenswrapper[4903]: I0320 08:48:09.783349 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 20 08:48:09 crc kubenswrapper[4903]: I0320 08:48:09.783409 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 20 08:48:09 crc kubenswrapper[4903]: I0320 08:48:09.882880 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 20 08:48:09 crc kubenswrapper[4903]: I0320 08:48:09.978551 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"700d5965-30b6-4fad-b808-f7a4ed433b9b","Type":"ContainerStarted","Data":"2f93dc3100cd795c4bd2b908b58ffbe7ef35750a55a2c66ca4362ceb9135ce69"} Mar 20 08:48:09 crc kubenswrapper[4903]: I0320 08:48:09.978593 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"700d5965-30b6-4fad-b808-f7a4ed433b9b","Type":"ContainerStarted","Data":"7ffee58bae19f421994d8e8ea67b213e6fc2ddc1324a4d6fc04ad49a8ff92238"} Mar 20 08:48:09 crc kubenswrapper[4903]: I0320 08:48:09.984294 4903 generic.go:334] "Generic (PLEG): container finished" podID="1b5c26ba-26fd-4ad8-bd73-de0c5faf7a82" containerID="74f32b262e8212c4e2f8e83ecb47fdfcc1ab663e70ea9fdf0514f8ae55bdd4d5" exitCode=0 Mar 20 08:48:09 crc kubenswrapper[4903]: I0320 08:48:09.984460 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1b5c26ba-26fd-4ad8-bd73-de0c5faf7a82","Type":"ContainerDied","Data":"74f32b262e8212c4e2f8e83ecb47fdfcc1ab663e70ea9fdf0514f8ae55bdd4d5"} Mar 20 08:48:09 crc kubenswrapper[4903]: I0320 08:48:09.984672 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1b5c26ba-26fd-4ad8-bd73-de0c5faf7a82","Type":"ContainerDied","Data":"86f51b773891445939dfe718ff1ba6015122b275456c6cdab0e3e719734c16fe"} Mar 20 08:48:09 crc kubenswrapper[4903]: I0320 08:48:09.984699 4903 scope.go:117] "RemoveContainer" containerID="74f32b262e8212c4e2f8e83ecb47fdfcc1ab663e70ea9fdf0514f8ae55bdd4d5" Mar 20 08:48:09 crc kubenswrapper[4903]: I0320 08:48:09.984516 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 20 08:48:10 crc kubenswrapper[4903]: I0320 08:48:10.007073 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=1.00705673 podStartE2EDuration="1.00705673s" podCreationTimestamp="2026-03-20 08:48:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:48:09.99930508 +0000 UTC m=+1515.216205405" watchObservedRunningTime="2026-03-20 08:48:10.00705673 +0000 UTC m=+1515.223957045" Mar 20 08:48:10 crc kubenswrapper[4903]: I0320 08:48:10.019440 4903 scope.go:117] "RemoveContainer" containerID="67d0b1a192bf9b0cb84d48b1f38b29ccf08db31018b839819bf5bd66728a6189" Mar 20 08:48:10 crc kubenswrapper[4903]: I0320 08:48:10.043187 4903 scope.go:117] "RemoveContainer" containerID="74f32b262e8212c4e2f8e83ecb47fdfcc1ab663e70ea9fdf0514f8ae55bdd4d5" Mar 20 08:48:10 crc kubenswrapper[4903]: E0320 08:48:10.043613 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"74f32b262e8212c4e2f8e83ecb47fdfcc1ab663e70ea9fdf0514f8ae55bdd4d5\": container with ID starting with 74f32b262e8212c4e2f8e83ecb47fdfcc1ab663e70ea9fdf0514f8ae55bdd4d5 not found: ID does not exist" containerID="74f32b262e8212c4e2f8e83ecb47fdfcc1ab663e70ea9fdf0514f8ae55bdd4d5" Mar 20 08:48:10 crc kubenswrapper[4903]: I0320 08:48:10.043650 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74f32b262e8212c4e2f8e83ecb47fdfcc1ab663e70ea9fdf0514f8ae55bdd4d5"} err="failed to get container status \"74f32b262e8212c4e2f8e83ecb47fdfcc1ab663e70ea9fdf0514f8ae55bdd4d5\": rpc error: code = NotFound desc = could not find container \"74f32b262e8212c4e2f8e83ecb47fdfcc1ab663e70ea9fdf0514f8ae55bdd4d5\": container with ID starting with 74f32b262e8212c4e2f8e83ecb47fdfcc1ab663e70ea9fdf0514f8ae55bdd4d5 not found: ID does not exist" Mar 20 08:48:10 crc kubenswrapper[4903]: I0320 08:48:10.043676 4903 scope.go:117] "RemoveContainer" containerID="67d0b1a192bf9b0cb84d48b1f38b29ccf08db31018b839819bf5bd66728a6189" Mar 20 08:48:10 crc kubenswrapper[4903]: E0320 08:48:10.044305 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"67d0b1a192bf9b0cb84d48b1f38b29ccf08db31018b839819bf5bd66728a6189\": container with ID starting with 67d0b1a192bf9b0cb84d48b1f38b29ccf08db31018b839819bf5bd66728a6189 not found: ID does not exist" containerID="67d0b1a192bf9b0cb84d48b1f38b29ccf08db31018b839819bf5bd66728a6189" Mar 20 08:48:10 crc kubenswrapper[4903]: I0320 08:48:10.044356 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"67d0b1a192bf9b0cb84d48b1f38b29ccf08db31018b839819bf5bd66728a6189"} err="failed to get container status \"67d0b1a192bf9b0cb84d48b1f38b29ccf08db31018b839819bf5bd66728a6189\": rpc error: code = NotFound desc = could not find container \"67d0b1a192bf9b0cb84d48b1f38b29ccf08db31018b839819bf5bd66728a6189\": container with ID starting with 67d0b1a192bf9b0cb84d48b1f38b29ccf08db31018b839819bf5bd66728a6189 not found: ID does not exist" Mar 20 08:48:10 crc kubenswrapper[4903]: I0320 08:48:10.068356 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wwmqh\" (UniqueName: \"kubernetes.io/projected/1b5c26ba-26fd-4ad8-bd73-de0c5faf7a82-kube-api-access-wwmqh\") pod 
\"1b5c26ba-26fd-4ad8-bd73-de0c5faf7a82\" (UID: \"1b5c26ba-26fd-4ad8-bd73-de0c5faf7a82\") " Mar 20 08:48:10 crc kubenswrapper[4903]: I0320 08:48:10.069430 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b5c26ba-26fd-4ad8-bd73-de0c5faf7a82-config-data\") pod \"1b5c26ba-26fd-4ad8-bd73-de0c5faf7a82\" (UID: \"1b5c26ba-26fd-4ad8-bd73-de0c5faf7a82\") " Mar 20 08:48:10 crc kubenswrapper[4903]: I0320 08:48:10.070053 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1b5c26ba-26fd-4ad8-bd73-de0c5faf7a82-logs\") pod \"1b5c26ba-26fd-4ad8-bd73-de0c5faf7a82\" (UID: \"1b5c26ba-26fd-4ad8-bd73-de0c5faf7a82\") " Mar 20 08:48:10 crc kubenswrapper[4903]: I0320 08:48:10.070094 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b5c26ba-26fd-4ad8-bd73-de0c5faf7a82-combined-ca-bundle\") pod \"1b5c26ba-26fd-4ad8-bd73-de0c5faf7a82\" (UID: \"1b5c26ba-26fd-4ad8-bd73-de0c5faf7a82\") " Mar 20 08:48:10 crc kubenswrapper[4903]: I0320 08:48:10.070827 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1b5c26ba-26fd-4ad8-bd73-de0c5faf7a82-logs" (OuterVolumeSpecName: "logs") pod "1b5c26ba-26fd-4ad8-bd73-de0c5faf7a82" (UID: "1b5c26ba-26fd-4ad8-bd73-de0c5faf7a82"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 08:48:10 crc kubenswrapper[4903]: I0320 08:48:10.072131 4903 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1b5c26ba-26fd-4ad8-bd73-de0c5faf7a82-logs\") on node \"crc\" DevicePath \"\"" Mar 20 08:48:10 crc kubenswrapper[4903]: I0320 08:48:10.075068 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b5c26ba-26fd-4ad8-bd73-de0c5faf7a82-kube-api-access-wwmqh" (OuterVolumeSpecName: "kube-api-access-wwmqh") pod "1b5c26ba-26fd-4ad8-bd73-de0c5faf7a82" (UID: "1b5c26ba-26fd-4ad8-bd73-de0c5faf7a82"). InnerVolumeSpecName "kube-api-access-wwmqh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:48:10 crc kubenswrapper[4903]: I0320 08:48:10.100869 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b5c26ba-26fd-4ad8-bd73-de0c5faf7a82-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1b5c26ba-26fd-4ad8-bd73-de0c5faf7a82" (UID: "1b5c26ba-26fd-4ad8-bd73-de0c5faf7a82"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:48:10 crc kubenswrapper[4903]: I0320 08:48:10.119438 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b5c26ba-26fd-4ad8-bd73-de0c5faf7a82-config-data" (OuterVolumeSpecName: "config-data") pod "1b5c26ba-26fd-4ad8-bd73-de0c5faf7a82" (UID: "1b5c26ba-26fd-4ad8-bd73-de0c5faf7a82"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:48:10 crc kubenswrapper[4903]: I0320 08:48:10.140760 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 20 08:48:10 crc kubenswrapper[4903]: I0320 08:48:10.140831 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 20 08:48:10 crc kubenswrapper[4903]: I0320 08:48:10.174376 4903 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b5c26ba-26fd-4ad8-bd73-de0c5faf7a82-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 08:48:10 crc kubenswrapper[4903]: I0320 08:48:10.174410 4903 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b5c26ba-26fd-4ad8-bd73-de0c5faf7a82-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 08:48:10 crc kubenswrapper[4903]: I0320 08:48:10.174421 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wwmqh\" (UniqueName: \"kubernetes.io/projected/1b5c26ba-26fd-4ad8-bd73-de0c5faf7a82-kube-api-access-wwmqh\") on node \"crc\" DevicePath \"\"" Mar 20 08:48:10 crc kubenswrapper[4903]: I0320 08:48:10.342737 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 20 08:48:10 crc kubenswrapper[4903]: I0320 08:48:10.353131 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 20 08:48:10 crc kubenswrapper[4903]: I0320 08:48:10.378837 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 20 08:48:10 crc kubenswrapper[4903]: E0320 08:48:10.379270 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b5c26ba-26fd-4ad8-bd73-de0c5faf7a82" containerName="nova-api-log" Mar 20 08:48:10 crc kubenswrapper[4903]: I0320 08:48:10.379292 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b5c26ba-26fd-4ad8-bd73-de0c5faf7a82" containerName="nova-api-log" Mar 20 08:48:10 crc kubenswrapper[4903]: E0320 08:48:10.379323 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b5c26ba-26fd-4ad8-bd73-de0c5faf7a82" containerName="nova-api-api" Mar 20 08:48:10 crc kubenswrapper[4903]: I0320 08:48:10.379332 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b5c26ba-26fd-4ad8-bd73-de0c5faf7a82" containerName="nova-api-api" Mar 20 08:48:10 crc kubenswrapper[4903]: I0320 08:48:10.380787 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b5c26ba-26fd-4ad8-bd73-de0c5faf7a82" containerName="nova-api-api" Mar 20 08:48:10 crc kubenswrapper[4903]: I0320 08:48:10.380849 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b5c26ba-26fd-4ad8-bd73-de0c5faf7a82" containerName="nova-api-log" Mar 20 08:48:10 crc kubenswrapper[4903]: I0320 08:48:10.382194 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 20 08:48:10 crc kubenswrapper[4903]: I0320 08:48:10.398381 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 20 08:48:10 crc kubenswrapper[4903]: I0320 08:48:10.413564 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 20 08:48:10 crc kubenswrapper[4903]: I0320 08:48:10.487024 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hr6p8\" (UniqueName: \"kubernetes.io/projected/a1539d60-53fa-4562-a356-83060d4f6bd7-kube-api-access-hr6p8\") pod \"nova-api-0\" (UID: \"a1539d60-53fa-4562-a356-83060d4f6bd7\") " pod="openstack/nova-api-0" Mar 20 08:48:10 crc kubenswrapper[4903]: I0320 08:48:10.487134 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1539d60-53fa-4562-a356-83060d4f6bd7-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a1539d60-53fa-4562-a356-83060d4f6bd7\") " pod="openstack/nova-api-0" Mar 20 08:48:10 crc kubenswrapper[4903]: I0320 08:48:10.487232 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a1539d60-53fa-4562-a356-83060d4f6bd7-config-data\") pod \"nova-api-0\" (UID: \"a1539d60-53fa-4562-a356-83060d4f6bd7\") " pod="openstack/nova-api-0" Mar 20 08:48:10 crc kubenswrapper[4903]: I0320 08:48:10.487260 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a1539d60-53fa-4562-a356-83060d4f6bd7-logs\") pod \"nova-api-0\" (UID: \"a1539d60-53fa-4562-a356-83060d4f6bd7\") " pod="openstack/nova-api-0" Mar 20 08:48:10 crc kubenswrapper[4903]: I0320 08:48:10.588640 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hr6p8\" (UniqueName: \"kubernetes.io/projected/a1539d60-53fa-4562-a356-83060d4f6bd7-kube-api-access-hr6p8\") pod \"nova-api-0\" (UID: \"a1539d60-53fa-4562-a356-83060d4f6bd7\") " pod="openstack/nova-api-0" Mar 20 08:48:10 crc kubenswrapper[4903]: I0320 08:48:10.589641 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1539d60-53fa-4562-a356-83060d4f6bd7-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a1539d60-53fa-4562-a356-83060d4f6bd7\") " pod="openstack/nova-api-0" Mar 20 08:48:10 crc kubenswrapper[4903]: I0320 08:48:10.591117 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a1539d60-53fa-4562-a356-83060d4f6bd7-config-data\") pod \"nova-api-0\" (UID: \"a1539d60-53fa-4562-a356-83060d4f6bd7\") " pod="openstack/nova-api-0" Mar 20 08:48:10 crc kubenswrapper[4903]: I0320 08:48:10.591631 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a1539d60-53fa-4562-a356-83060d4f6bd7-logs\") pod \"nova-api-0\" (UID: \"a1539d60-53fa-4562-a356-83060d4f6bd7\") " pod="openstack/nova-api-0" Mar 20 08:48:10 crc kubenswrapper[4903]: I0320 08:48:10.607174 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a1539d60-53fa-4562-a356-83060d4f6bd7-logs\") pod \"nova-api-0\" (UID: \"a1539d60-53fa-4562-a356-83060d4f6bd7\") " 
pod="openstack/nova-api-0" Mar 20 08:48:10 crc kubenswrapper[4903]: I0320 08:48:10.608403 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a1539d60-53fa-4562-a356-83060d4f6bd7-config-data\") pod \"nova-api-0\" (UID: \"a1539d60-53fa-4562-a356-83060d4f6bd7\") " pod="openstack/nova-api-0" Mar 20 08:48:10 crc kubenswrapper[4903]: I0320 08:48:10.608397 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1539d60-53fa-4562-a356-83060d4f6bd7-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a1539d60-53fa-4562-a356-83060d4f6bd7\") " pod="openstack/nova-api-0" Mar 20 08:48:10 crc kubenswrapper[4903]: I0320 08:48:10.613338 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hr6p8\" (UniqueName: \"kubernetes.io/projected/a1539d60-53fa-4562-a356-83060d4f6bd7-kube-api-access-hr6p8\") pod \"nova-api-0\" (UID: \"a1539d60-53fa-4562-a356-83060d4f6bd7\") " pod="openstack/nova-api-0" Mar 20 08:48:10 crc kubenswrapper[4903]: I0320 08:48:10.706246 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 20 08:48:11 crc kubenswrapper[4903]: I0320 08:48:11.511306 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b5c26ba-26fd-4ad8-bd73-de0c5faf7a82" path="/var/lib/kubelet/pods/1b5c26ba-26fd-4ad8-bd73-de0c5faf7a82/volumes" Mar 20 08:48:12 crc kubenswrapper[4903]: I0320 08:48:12.097528 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 20 08:48:13 crc kubenswrapper[4903]: I0320 08:48:13.028326 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a1539d60-53fa-4562-a356-83060d4f6bd7","Type":"ContainerStarted","Data":"4bd143470e9e9d2aafb3e8563bf4776f7153e511f05b78f431490e0e5a7f2037"} Mar 20 08:48:13 crc kubenswrapper[4903]: I0320 08:48:13.028943 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a1539d60-53fa-4562-a356-83060d4f6bd7","Type":"ContainerStarted","Data":"3164db999ab3f287862cebc8b53b9d8e2b543bbbb115a647ca01e915360191da"} Mar 20 08:48:13 crc kubenswrapper[4903]: I0320 08:48:13.028957 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a1539d60-53fa-4562-a356-83060d4f6bd7","Type":"ContainerStarted","Data":"f9fec86750738444ea32f5616e67f20857935fd72709962a1ff734dff2430998"} Mar 20 08:48:13 crc kubenswrapper[4903]: I0320 08:48:13.514566 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-9qstg" Mar 20 08:48:13 crc kubenswrapper[4903]: I0320 08:48:13.514710 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-9qstg" Mar 20 08:48:13 crc kubenswrapper[4903]: I0320 08:48:13.587264 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-9qstg" Mar 20 08:48:13 crc kubenswrapper[4903]: I0320 08:48:13.627077 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.627001104 podStartE2EDuration="3.627001104s" podCreationTimestamp="2026-03-20 08:48:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:48:13.060750081 +0000 UTC 
m=+1518.277650406" watchObservedRunningTime="2026-03-20 08:48:13.627001104 +0000 UTC m=+1518.843901459" Mar 20 08:48:14 crc kubenswrapper[4903]: I0320 08:48:14.128057 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-9qstg" Mar 20 08:48:14 crc kubenswrapper[4903]: I0320 08:48:14.200673 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-9qstg"] Mar 20 08:48:14 crc kubenswrapper[4903]: I0320 08:48:14.307080 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Mar 20 08:48:14 crc kubenswrapper[4903]: I0320 08:48:14.368864 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 20 08:48:16 crc kubenswrapper[4903]: I0320 08:48:16.067020 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-9qstg" podUID="1abf248d-f62b-4f0c-974c-8d724250e196" containerName="registry-server" containerID="cri-o://96af21da30826328e1be16f242c3e9b74dd4e1f62e910536ca0dcaa58a288b0e" gracePeriod=2 Mar 20 08:48:17 crc kubenswrapper[4903]: I0320 08:48:17.079844 4903 generic.go:334] "Generic (PLEG): container finished" podID="1abf248d-f62b-4f0c-974c-8d724250e196" containerID="96af21da30826328e1be16f242c3e9b74dd4e1f62e910536ca0dcaa58a288b0e" exitCode=0 Mar 20 08:48:17 crc kubenswrapper[4903]: I0320 08:48:17.079931 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9qstg" event={"ID":"1abf248d-f62b-4f0c-974c-8d724250e196","Type":"ContainerDied","Data":"96af21da30826328e1be16f242c3e9b74dd4e1f62e910536ca0dcaa58a288b0e"} Mar 20 08:48:17 crc kubenswrapper[4903]: I0320 08:48:17.303581 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9qstg" Mar 20 08:48:17 crc kubenswrapper[4903]: I0320 08:48:17.372569 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d574q\" (UniqueName: \"kubernetes.io/projected/1abf248d-f62b-4f0c-974c-8d724250e196-kube-api-access-d574q\") pod \"1abf248d-f62b-4f0c-974c-8d724250e196\" (UID: \"1abf248d-f62b-4f0c-974c-8d724250e196\") " Mar 20 08:48:17 crc kubenswrapper[4903]: I0320 08:48:17.373140 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1abf248d-f62b-4f0c-974c-8d724250e196-catalog-content\") pod \"1abf248d-f62b-4f0c-974c-8d724250e196\" (UID: \"1abf248d-f62b-4f0c-974c-8d724250e196\") " Mar 20 08:48:17 crc kubenswrapper[4903]: I0320 08:48:17.373406 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1abf248d-f62b-4f0c-974c-8d724250e196-utilities\") pod \"1abf248d-f62b-4f0c-974c-8d724250e196\" (UID: \"1abf248d-f62b-4f0c-974c-8d724250e196\") " Mar 20 08:48:17 crc kubenswrapper[4903]: I0320 08:48:17.375069 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1abf248d-f62b-4f0c-974c-8d724250e196-utilities" (OuterVolumeSpecName: "utilities") pod "1abf248d-f62b-4f0c-974c-8d724250e196" (UID: "1abf248d-f62b-4f0c-974c-8d724250e196"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 08:48:17 crc kubenswrapper[4903]: I0320 08:48:17.381618 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1abf248d-f62b-4f0c-974c-8d724250e196-kube-api-access-d574q" (OuterVolumeSpecName: "kube-api-access-d574q") pod "1abf248d-f62b-4f0c-974c-8d724250e196" (UID: "1abf248d-f62b-4f0c-974c-8d724250e196"). InnerVolumeSpecName "kube-api-access-d574q". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:48:17 crc kubenswrapper[4903]: I0320 08:48:17.418183 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1abf248d-f62b-4f0c-974c-8d724250e196-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1abf248d-f62b-4f0c-974c-8d724250e196" (UID: "1abf248d-f62b-4f0c-974c-8d724250e196"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 08:48:17 crc kubenswrapper[4903]: I0320 08:48:17.476643 4903 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1abf248d-f62b-4f0c-974c-8d724250e196-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 08:48:17 crc kubenswrapper[4903]: I0320 08:48:17.476677 4903 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1abf248d-f62b-4f0c-974c-8d724250e196-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 08:48:17 crc kubenswrapper[4903]: I0320 08:48:17.476724 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d574q\" (UniqueName: \"kubernetes.io/projected/1abf248d-f62b-4f0c-974c-8d724250e196-kube-api-access-d574q\") on node \"crc\" DevicePath \"\"" Mar 20 08:48:18 crc kubenswrapper[4903]: I0320 08:48:18.092291 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9qstg" event={"ID":"1abf248d-f62b-4f0c-974c-8d724250e196","Type":"ContainerDied","Data":"e65d481f3847e5823ea83434a5bd490d07f17c2d35999e8a352035fb490e6dc0"} Mar 20 08:48:18 crc kubenswrapper[4903]: I0320 08:48:18.092357 4903 scope.go:117] "RemoveContainer" containerID="96af21da30826328e1be16f242c3e9b74dd4e1f62e910536ca0dcaa58a288b0e" Mar 20 08:48:18 crc kubenswrapper[4903]: I0320 08:48:18.092422 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9qstg" Mar 20 08:48:18 crc kubenswrapper[4903]: I0320 08:48:18.118992 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-9qstg"] Mar 20 08:48:18 crc kubenswrapper[4903]: I0320 08:48:18.127282 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-9qstg"] Mar 20 08:48:18 crc kubenswrapper[4903]: I0320 08:48:18.132660 4903 scope.go:117] "RemoveContainer" containerID="34c6f37b2632d6c4216a7bd9baf41728c9cde6351df93ff34efb424ea0d77999" Mar 20 08:48:18 crc kubenswrapper[4903]: I0320 08:48:18.156239 4903 scope.go:117] "RemoveContainer" containerID="f191d10b820d123124cb6bae2c0139a8505e7671cebb21299855e22baad4c371" Mar 20 08:48:19 crc kubenswrapper[4903]: I0320 08:48:19.368662 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 20 08:48:19 crc kubenswrapper[4903]: I0320 08:48:19.402467 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 20 08:48:19 crc kubenswrapper[4903]: I0320 08:48:19.506022 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1abf248d-f62b-4f0c-974c-8d724250e196" path="/var/lib/kubelet/pods/1abf248d-f62b-4f0c-974c-8d724250e196/volumes" Mar 20 08:48:20 crc kubenswrapper[4903]: I0320 08:48:20.154711 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 20 08:48:20 crc kubenswrapper[4903]: I0320 08:48:20.707303 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 20 08:48:20 crc kubenswrapper[4903]: I0320 08:48:20.707668 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 20 08:48:20 crc kubenswrapper[4903]: I0320 08:48:20.837451 4903 patch_prober.go:28] interesting pod/machine-config-daemon-2ndsj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 08:48:20 crc kubenswrapper[4903]: I0320 08:48:20.837554 4903 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2ndsj" podUID="0e67af70-4211-4077-8f2b-0a00b8069e5a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 08:48:21 crc kubenswrapper[4903]: I0320 08:48:21.790282 4903 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="a1539d60-53fa-4562-a356-83060d4f6bd7" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.204:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 20 08:48:21 crc kubenswrapper[4903]: I0320 08:48:21.790318 4903 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="a1539d60-53fa-4562-a356-83060d4f6bd7" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.204:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 20 08:48:27 crc kubenswrapper[4903]: I0320 08:48:27.172867 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 08:48:27 crc kubenswrapper[4903]: I0320 08:48:27.178506 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 20 08:48:27 crc kubenswrapper[4903]: I0320 08:48:27.202361 4903 generic.go:334] "Generic (PLEG): container finished" podID="fbeea61a-87b9-4a74-ab2b-78e4eceff3e4" containerID="c983d1eeaa2a317147c78e9b95de74cb47d6f3e9a657819edf2a16feae798230" exitCode=137 Mar 20 08:48:27 crc kubenswrapper[4903]: I0320 08:48:27.202437 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"fbeea61a-87b9-4a74-ab2b-78e4eceff3e4","Type":"ContainerDied","Data":"c983d1eeaa2a317147c78e9b95de74cb47d6f3e9a657819edf2a16feae798230"} Mar 20 08:48:27 crc kubenswrapper[4903]: I0320 08:48:27.202470 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"fbeea61a-87b9-4a74-ab2b-78e4eceff3e4","Type":"ContainerDied","Data":"f97e9ab87926808ccd7d4f5e30afa73f826b84b5e0c50a9657f7aff6ba86d2e4"} Mar 20 08:48:27 crc kubenswrapper[4903]: I0320 08:48:27.202493 4903 scope.go:117] "RemoveContainer" containerID="c983d1eeaa2a317147c78e9b95de74cb47d6f3e9a657819edf2a16feae798230" Mar 20 08:48:27 crc kubenswrapper[4903]: I0320 08:48:27.202630 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 20 08:48:27 crc kubenswrapper[4903]: I0320 08:48:27.207847 4903 generic.go:334] "Generic (PLEG): container finished" podID="6e5503e6-7030-40fa-b8f5-88476868ba0d" containerID="e4335e94975560c47e4da01c0e8e142b299cf426257328cfb82a22a9f7f75330" exitCode=137 Mar 20 08:48:27 crc kubenswrapper[4903]: I0320 08:48:27.207888 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6e5503e6-7030-40fa-b8f5-88476868ba0d","Type":"ContainerDied","Data":"e4335e94975560c47e4da01c0e8e142b299cf426257328cfb82a22a9f7f75330"} Mar 20 08:48:27 crc kubenswrapper[4903]: I0320 08:48:27.207930 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 08:48:27 crc kubenswrapper[4903]: I0320 08:48:27.207948 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6e5503e6-7030-40fa-b8f5-88476868ba0d","Type":"ContainerDied","Data":"150f7e8fd339b7e30774738595f07278cbd963a7a2db3827f0fc5c95f4f569f0"} Mar 20 08:48:27 crc kubenswrapper[4903]: I0320 08:48:27.236805 4903 scope.go:117] "RemoveContainer" containerID="c983d1eeaa2a317147c78e9b95de74cb47d6f3e9a657819edf2a16feae798230" Mar 20 08:48:27 crc kubenswrapper[4903]: E0320 08:48:27.237270 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c983d1eeaa2a317147c78e9b95de74cb47d6f3e9a657819edf2a16feae798230\": container with ID starting with c983d1eeaa2a317147c78e9b95de74cb47d6f3e9a657819edf2a16feae798230 not found: ID does not exist" containerID="c983d1eeaa2a317147c78e9b95de74cb47d6f3e9a657819edf2a16feae798230" Mar 20 08:48:27 crc kubenswrapper[4903]: I0320 08:48:27.237367 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c983d1eeaa2a317147c78e9b95de74cb47d6f3e9a657819edf2a16feae798230"} err="failed to get container status \"c983d1eeaa2a317147c78e9b95de74cb47d6f3e9a657819edf2a16feae798230\": rpc error: code = NotFound desc = could not find container \"c983d1eeaa2a317147c78e9b95de74cb47d6f3e9a657819edf2a16feae798230\": container with ID starting with c983d1eeaa2a317147c78e9b95de74cb47d6f3e9a657819edf2a16feae798230 not found: ID does not exist" Mar 20 08:48:27 crc kubenswrapper[4903]: I0320 08:48:27.237397 4903 scope.go:117] "RemoveContainer" containerID="e4335e94975560c47e4da01c0e8e142b299cf426257328cfb82a22a9f7f75330" Mar 20 08:48:27 crc kubenswrapper[4903]: I0320 08:48:27.263930 4903 scope.go:117] "RemoveContainer" containerID="08a894ad7cfd6c3a1c29cd812a76be8bb5b596f7d5bfd720ad60a0e168fd7a96" Mar 20 08:48:27 crc kubenswrapper[4903]: I0320 08:48:27.281983 4903 scope.go:117] "RemoveContainer" containerID="e4335e94975560c47e4da01c0e8e142b299cf426257328cfb82a22a9f7f75330" Mar 20 08:48:27 crc kubenswrapper[4903]: E0320 08:48:27.282678 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e4335e94975560c47e4da01c0e8e142b299cf426257328cfb82a22a9f7f75330\": container with ID starting with e4335e94975560c47e4da01c0e8e142b299cf426257328cfb82a22a9f7f75330 not found: ID does not exist" containerID="e4335e94975560c47e4da01c0e8e142b299cf426257328cfb82a22a9f7f75330" Mar 20 08:48:27 crc kubenswrapper[4903]: I0320 08:48:27.282767 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e4335e94975560c47e4da01c0e8e142b299cf426257328cfb82a22a9f7f75330"} err="failed to get container status \"e4335e94975560c47e4da01c0e8e142b299cf426257328cfb82a22a9f7f75330\": rpc error: code = NotFound desc = could not find container \"e4335e94975560c47e4da01c0e8e142b299cf426257328cfb82a22a9f7f75330\": container with ID starting with e4335e94975560c47e4da01c0e8e142b299cf426257328cfb82a22a9f7f75330 not found: ID does not exist" Mar 20 08:48:27 crc kubenswrapper[4903]: I0320 08:48:27.282806 4903 scope.go:117] "RemoveContainer" containerID="08a894ad7cfd6c3a1c29cd812a76be8bb5b596f7d5bfd720ad60a0e168fd7a96" Mar 20 08:48:27 crc kubenswrapper[4903]: E0320 08:48:27.283265 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find 
container \"08a894ad7cfd6c3a1c29cd812a76be8bb5b596f7d5bfd720ad60a0e168fd7a96\": container with ID starting with 08a894ad7cfd6c3a1c29cd812a76be8bb5b596f7d5bfd720ad60a0e168fd7a96 not found: ID does not exist" containerID="08a894ad7cfd6c3a1c29cd812a76be8bb5b596f7d5bfd720ad60a0e168fd7a96" Mar 20 08:48:27 crc kubenswrapper[4903]: I0320 08:48:27.283322 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"08a894ad7cfd6c3a1c29cd812a76be8bb5b596f7d5bfd720ad60a0e168fd7a96"} err="failed to get container status \"08a894ad7cfd6c3a1c29cd812a76be8bb5b596f7d5bfd720ad60a0e168fd7a96\": rpc error: code = NotFound desc = could not find container \"08a894ad7cfd6c3a1c29cd812a76be8bb5b596f7d5bfd720ad60a0e168fd7a96\": container with ID starting with 08a894ad7cfd6c3a1c29cd812a76be8bb5b596f7d5bfd720ad60a0e168fd7a96 not found: ID does not exist" Mar 20 08:48:27 crc kubenswrapper[4903]: I0320 08:48:27.297380 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6e5503e6-7030-40fa-b8f5-88476868ba0d-logs\") pod \"6e5503e6-7030-40fa-b8f5-88476868ba0d\" (UID: \"6e5503e6-7030-40fa-b8f5-88476868ba0d\") " Mar 20 08:48:27 crc kubenswrapper[4903]: I0320 08:48:27.297432 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbklx\" (UniqueName: \"kubernetes.io/projected/6e5503e6-7030-40fa-b8f5-88476868ba0d-kube-api-access-dbklx\") pod \"6e5503e6-7030-40fa-b8f5-88476868ba0d\" (UID: \"6e5503e6-7030-40fa-b8f5-88476868ba0d\") " Mar 20 08:48:27 crc kubenswrapper[4903]: I0320 08:48:27.297475 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbeea61a-87b9-4a74-ab2b-78e4eceff3e4-combined-ca-bundle\") pod \"fbeea61a-87b9-4a74-ab2b-78e4eceff3e4\" (UID: \"fbeea61a-87b9-4a74-ab2b-78e4eceff3e4\") " Mar 20 08:48:27 crc kubenswrapper[4903]: I0320 08:48:27.297542 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e5503e6-7030-40fa-b8f5-88476868ba0d-combined-ca-bundle\") pod \"6e5503e6-7030-40fa-b8f5-88476868ba0d\" (UID: \"6e5503e6-7030-40fa-b8f5-88476868ba0d\") " Mar 20 08:48:27 crc kubenswrapper[4903]: I0320 08:48:27.297624 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fbeea61a-87b9-4a74-ab2b-78e4eceff3e4-config-data\") pod \"fbeea61a-87b9-4a74-ab2b-78e4eceff3e4\" (UID: \"fbeea61a-87b9-4a74-ab2b-78e4eceff3e4\") " Mar 20 08:48:27 crc kubenswrapper[4903]: I0320 08:48:27.297683 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9v5t4\" (UniqueName: \"kubernetes.io/projected/fbeea61a-87b9-4a74-ab2b-78e4eceff3e4-kube-api-access-9v5t4\") pod \"fbeea61a-87b9-4a74-ab2b-78e4eceff3e4\" (UID: \"fbeea61a-87b9-4a74-ab2b-78e4eceff3e4\") " Mar 20 08:48:27 crc kubenswrapper[4903]: I0320 08:48:27.297740 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e5503e6-7030-40fa-b8f5-88476868ba0d-config-data\") pod \"6e5503e6-7030-40fa-b8f5-88476868ba0d\" (UID: \"6e5503e6-7030-40fa-b8f5-88476868ba0d\") " Mar 20 08:48:27 crc kubenswrapper[4903]: I0320 08:48:27.297906 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/6e5503e6-7030-40fa-b8f5-88476868ba0d-logs" (OuterVolumeSpecName: "logs") pod "6e5503e6-7030-40fa-b8f5-88476868ba0d" (UID: "6e5503e6-7030-40fa-b8f5-88476868ba0d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 08:48:27 crc kubenswrapper[4903]: I0320 08:48:27.298157 4903 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6e5503e6-7030-40fa-b8f5-88476868ba0d-logs\") on node \"crc\" DevicePath \"\"" Mar 20 08:48:27 crc kubenswrapper[4903]: I0320 08:48:27.304125 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e5503e6-7030-40fa-b8f5-88476868ba0d-kube-api-access-dbklx" (OuterVolumeSpecName: "kube-api-access-dbklx") pod "6e5503e6-7030-40fa-b8f5-88476868ba0d" (UID: "6e5503e6-7030-40fa-b8f5-88476868ba0d"). InnerVolumeSpecName "kube-api-access-dbklx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:48:27 crc kubenswrapper[4903]: I0320 08:48:27.304225 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fbeea61a-87b9-4a74-ab2b-78e4eceff3e4-kube-api-access-9v5t4" (OuterVolumeSpecName: "kube-api-access-9v5t4") pod "fbeea61a-87b9-4a74-ab2b-78e4eceff3e4" (UID: "fbeea61a-87b9-4a74-ab2b-78e4eceff3e4"). InnerVolumeSpecName "kube-api-access-9v5t4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:48:27 crc kubenswrapper[4903]: I0320 08:48:27.325474 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fbeea61a-87b9-4a74-ab2b-78e4eceff3e4-config-data" (OuterVolumeSpecName: "config-data") pod "fbeea61a-87b9-4a74-ab2b-78e4eceff3e4" (UID: "fbeea61a-87b9-4a74-ab2b-78e4eceff3e4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:48:27 crc kubenswrapper[4903]: I0320 08:48:27.325959 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e5503e6-7030-40fa-b8f5-88476868ba0d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6e5503e6-7030-40fa-b8f5-88476868ba0d" (UID: "6e5503e6-7030-40fa-b8f5-88476868ba0d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:48:27 crc kubenswrapper[4903]: I0320 08:48:27.330559 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fbeea61a-87b9-4a74-ab2b-78e4eceff3e4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fbeea61a-87b9-4a74-ab2b-78e4eceff3e4" (UID: "fbeea61a-87b9-4a74-ab2b-78e4eceff3e4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:48:27 crc kubenswrapper[4903]: I0320 08:48:27.339192 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e5503e6-7030-40fa-b8f5-88476868ba0d-config-data" (OuterVolumeSpecName: "config-data") pod "6e5503e6-7030-40fa-b8f5-88476868ba0d" (UID: "6e5503e6-7030-40fa-b8f5-88476868ba0d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:48:27 crc kubenswrapper[4903]: I0320 08:48:27.400231 4903 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e5503e6-7030-40fa-b8f5-88476868ba0d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 08:48:27 crc kubenswrapper[4903]: I0320 08:48:27.400281 4903 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fbeea61a-87b9-4a74-ab2b-78e4eceff3e4-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 08:48:27 crc kubenswrapper[4903]: I0320 08:48:27.400294 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9v5t4\" (UniqueName: \"kubernetes.io/projected/fbeea61a-87b9-4a74-ab2b-78e4eceff3e4-kube-api-access-9v5t4\") on node \"crc\" DevicePath \"\"" Mar 20 08:48:27 crc kubenswrapper[4903]: I0320 08:48:27.400305 4903 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e5503e6-7030-40fa-b8f5-88476868ba0d-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 08:48:27 crc kubenswrapper[4903]: I0320 08:48:27.400316 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbklx\" (UniqueName: \"kubernetes.io/projected/6e5503e6-7030-40fa-b8f5-88476868ba0d-kube-api-access-dbklx\") on node \"crc\" DevicePath \"\"" Mar 20 08:48:27 crc kubenswrapper[4903]: I0320 08:48:27.400328 4903 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbeea61a-87b9-4a74-ab2b-78e4eceff3e4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 08:48:27 crc kubenswrapper[4903]: I0320 08:48:27.581351 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 20 08:48:27 crc kubenswrapper[4903]: I0320 08:48:27.620718 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 20 08:48:27 crc kubenswrapper[4903]: I0320 08:48:27.642529 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 08:48:27 crc kubenswrapper[4903]: I0320 08:48:27.650386 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 08:48:27 crc kubenswrapper[4903]: I0320 08:48:27.660498 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 20 08:48:27 crc kubenswrapper[4903]: E0320 08:48:27.660988 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1abf248d-f62b-4f0c-974c-8d724250e196" containerName="extract-utilities" Mar 20 08:48:27 crc kubenswrapper[4903]: I0320 08:48:27.661007 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="1abf248d-f62b-4f0c-974c-8d724250e196" containerName="extract-utilities" Mar 20 08:48:27 crc kubenswrapper[4903]: E0320 08:48:27.661023 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fbeea61a-87b9-4a74-ab2b-78e4eceff3e4" containerName="nova-cell1-novncproxy-novncproxy" Mar 20 08:48:27 crc kubenswrapper[4903]: I0320 08:48:27.661046 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="fbeea61a-87b9-4a74-ab2b-78e4eceff3e4" containerName="nova-cell1-novncproxy-novncproxy" Mar 20 08:48:27 crc kubenswrapper[4903]: E0320 08:48:27.661063 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1abf248d-f62b-4f0c-974c-8d724250e196" containerName="registry-server" Mar 20 08:48:27 crc kubenswrapper[4903]: I0320 
08:48:27.661069 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="1abf248d-f62b-4f0c-974c-8d724250e196" containerName="registry-server" Mar 20 08:48:27 crc kubenswrapper[4903]: E0320 08:48:27.661089 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e5503e6-7030-40fa-b8f5-88476868ba0d" containerName="nova-metadata-metadata" Mar 20 08:48:27 crc kubenswrapper[4903]: I0320 08:48:27.661095 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e5503e6-7030-40fa-b8f5-88476868ba0d" containerName="nova-metadata-metadata" Mar 20 08:48:27 crc kubenswrapper[4903]: E0320 08:48:27.661107 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e5503e6-7030-40fa-b8f5-88476868ba0d" containerName="nova-metadata-log" Mar 20 08:48:27 crc kubenswrapper[4903]: I0320 08:48:27.661124 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e5503e6-7030-40fa-b8f5-88476868ba0d" containerName="nova-metadata-log" Mar 20 08:48:27 crc kubenswrapper[4903]: E0320 08:48:27.661140 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1abf248d-f62b-4f0c-974c-8d724250e196" containerName="extract-content" Mar 20 08:48:27 crc kubenswrapper[4903]: I0320 08:48:27.661145 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="1abf248d-f62b-4f0c-974c-8d724250e196" containerName="extract-content" Mar 20 08:48:27 crc kubenswrapper[4903]: I0320 08:48:27.661320 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="fbeea61a-87b9-4a74-ab2b-78e4eceff3e4" containerName="nova-cell1-novncproxy-novncproxy" Mar 20 08:48:27 crc kubenswrapper[4903]: I0320 08:48:27.661335 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e5503e6-7030-40fa-b8f5-88476868ba0d" containerName="nova-metadata-metadata" Mar 20 08:48:27 crc kubenswrapper[4903]: I0320 08:48:27.661344 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e5503e6-7030-40fa-b8f5-88476868ba0d" containerName="nova-metadata-log" Mar 20 08:48:27 crc kubenswrapper[4903]: I0320 08:48:27.661354 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="1abf248d-f62b-4f0c-974c-8d724250e196" containerName="registry-server" Mar 20 08:48:27 crc kubenswrapper[4903]: I0320 08:48:27.662006 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 20 08:48:27 crc kubenswrapper[4903]: I0320 08:48:27.664042 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Mar 20 08:48:27 crc kubenswrapper[4903]: I0320 08:48:27.664334 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Mar 20 08:48:27 crc kubenswrapper[4903]: I0320 08:48:27.664358 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Mar 20 08:48:27 crc kubenswrapper[4903]: I0320 08:48:27.669808 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 20 08:48:27 crc kubenswrapper[4903]: I0320 08:48:27.671675 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 08:48:27 crc kubenswrapper[4903]: I0320 08:48:27.673377 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 20 08:48:27 crc kubenswrapper[4903]: I0320 08:48:27.673602 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 20 08:48:27 crc kubenswrapper[4903]: I0320 08:48:27.680108 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 20 08:48:27 crc kubenswrapper[4903]: I0320 08:48:27.687976 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 08:48:27 crc kubenswrapper[4903]: I0320 08:48:27.815276 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/1ab04638-79a1-46f3-9e67-ba51ee0d12f7-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"1ab04638-79a1-46f3-9e67-ba51ee0d12f7\") " pod="openstack/nova-metadata-0" Mar 20 08:48:27 crc kubenswrapper[4903]: I0320 08:48:27.815349 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ab04638-79a1-46f3-9e67-ba51ee0d12f7-config-data\") pod \"nova-metadata-0\" (UID: \"1ab04638-79a1-46f3-9e67-ba51ee0d12f7\") " pod="openstack/nova-metadata-0" Mar 20 08:48:27 crc kubenswrapper[4903]: I0320 08:48:27.815378 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3cdd4833-7200-46c0-9bb4-1b18c7828044-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"3cdd4833-7200-46c0-9bb4-1b18c7828044\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 08:48:27 crc kubenswrapper[4903]: I0320 08:48:27.815403 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3cdd4833-7200-46c0-9bb4-1b18c7828044-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"3cdd4833-7200-46c0-9bb4-1b18c7828044\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 08:48:27 crc kubenswrapper[4903]: I0320 08:48:27.815466 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ttzs8\" (UniqueName: \"kubernetes.io/projected/1ab04638-79a1-46f3-9e67-ba51ee0d12f7-kube-api-access-ttzs8\") pod \"nova-metadata-0\" (UID: \"1ab04638-79a1-46f3-9e67-ba51ee0d12f7\") " pod="openstack/nova-metadata-0" Mar 20 08:48:27 crc kubenswrapper[4903]: I0320 08:48:27.815504 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/3cdd4833-7200-46c0-9bb4-1b18c7828044-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"3cdd4833-7200-46c0-9bb4-1b18c7828044\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 08:48:27 crc kubenswrapper[4903]: I0320 08:48:27.815522 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1ab04638-79a1-46f3-9e67-ba51ee0d12f7-logs\") pod \"nova-metadata-0\" (UID: \"1ab04638-79a1-46f3-9e67-ba51ee0d12f7\") " pod="openstack/nova-metadata-0" Mar 20 08:48:27 crc kubenswrapper[4903]: I0320 08:48:27.815625 4903 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p2qzc\" (UniqueName: \"kubernetes.io/projected/3cdd4833-7200-46c0-9bb4-1b18c7828044-kube-api-access-p2qzc\") pod \"nova-cell1-novncproxy-0\" (UID: \"3cdd4833-7200-46c0-9bb4-1b18c7828044\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 08:48:27 crc kubenswrapper[4903]: I0320 08:48:27.815655 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ab04638-79a1-46f3-9e67-ba51ee0d12f7-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"1ab04638-79a1-46f3-9e67-ba51ee0d12f7\") " pod="openstack/nova-metadata-0" Mar 20 08:48:27 crc kubenswrapper[4903]: I0320 08:48:27.815677 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/3cdd4833-7200-46c0-9bb4-1b18c7828044-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"3cdd4833-7200-46c0-9bb4-1b18c7828044\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 08:48:27 crc kubenswrapper[4903]: I0320 08:48:27.917157 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3cdd4833-7200-46c0-9bb4-1b18c7828044-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"3cdd4833-7200-46c0-9bb4-1b18c7828044\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 08:48:27 crc kubenswrapper[4903]: I0320 08:48:27.917554 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3cdd4833-7200-46c0-9bb4-1b18c7828044-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"3cdd4833-7200-46c0-9bb4-1b18c7828044\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 08:48:27 crc kubenswrapper[4903]: I0320 08:48:27.917635 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ttzs8\" (UniqueName: \"kubernetes.io/projected/1ab04638-79a1-46f3-9e67-ba51ee0d12f7-kube-api-access-ttzs8\") pod \"nova-metadata-0\" (UID: \"1ab04638-79a1-46f3-9e67-ba51ee0d12f7\") " pod="openstack/nova-metadata-0" Mar 20 08:48:27 crc kubenswrapper[4903]: I0320 08:48:27.917670 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/3cdd4833-7200-46c0-9bb4-1b18c7828044-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"3cdd4833-7200-46c0-9bb4-1b18c7828044\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 08:48:27 crc kubenswrapper[4903]: I0320 08:48:27.917697 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1ab04638-79a1-46f3-9e67-ba51ee0d12f7-logs\") pod \"nova-metadata-0\" (UID: \"1ab04638-79a1-46f3-9e67-ba51ee0d12f7\") " pod="openstack/nova-metadata-0" Mar 20 08:48:27 crc kubenswrapper[4903]: I0320 08:48:27.917774 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p2qzc\" (UniqueName: \"kubernetes.io/projected/3cdd4833-7200-46c0-9bb4-1b18c7828044-kube-api-access-p2qzc\") pod \"nova-cell1-novncproxy-0\" (UID: \"3cdd4833-7200-46c0-9bb4-1b18c7828044\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 08:48:27 crc kubenswrapper[4903]: I0320 08:48:27.917830 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/1ab04638-79a1-46f3-9e67-ba51ee0d12f7-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"1ab04638-79a1-46f3-9e67-ba51ee0d12f7\") " pod="openstack/nova-metadata-0" Mar 20 08:48:27 crc kubenswrapper[4903]: I0320 08:48:27.917871 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/3cdd4833-7200-46c0-9bb4-1b18c7828044-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"3cdd4833-7200-46c0-9bb4-1b18c7828044\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 08:48:27 crc kubenswrapper[4903]: I0320 08:48:27.917913 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/1ab04638-79a1-46f3-9e67-ba51ee0d12f7-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"1ab04638-79a1-46f3-9e67-ba51ee0d12f7\") " pod="openstack/nova-metadata-0" Mar 20 08:48:27 crc kubenswrapper[4903]: I0320 08:48:27.917959 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ab04638-79a1-46f3-9e67-ba51ee0d12f7-config-data\") pod \"nova-metadata-0\" (UID: \"1ab04638-79a1-46f3-9e67-ba51ee0d12f7\") " pod="openstack/nova-metadata-0" Mar 20 08:48:27 crc kubenswrapper[4903]: I0320 08:48:27.919208 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1ab04638-79a1-46f3-9e67-ba51ee0d12f7-logs\") pod \"nova-metadata-0\" (UID: \"1ab04638-79a1-46f3-9e67-ba51ee0d12f7\") " pod="openstack/nova-metadata-0" Mar 20 08:48:27 crc kubenswrapper[4903]: I0320 08:48:27.922632 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/3cdd4833-7200-46c0-9bb4-1b18c7828044-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"3cdd4833-7200-46c0-9bb4-1b18c7828044\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 08:48:27 crc kubenswrapper[4903]: I0320 08:48:27.924288 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ab04638-79a1-46f3-9e67-ba51ee0d12f7-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"1ab04638-79a1-46f3-9e67-ba51ee0d12f7\") " pod="openstack/nova-metadata-0" Mar 20 08:48:27 crc kubenswrapper[4903]: I0320 08:48:27.924867 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ab04638-79a1-46f3-9e67-ba51ee0d12f7-config-data\") pod \"nova-metadata-0\" (UID: \"1ab04638-79a1-46f3-9e67-ba51ee0d12f7\") " pod="openstack/nova-metadata-0" Mar 20 08:48:27 crc kubenswrapper[4903]: I0320 08:48:27.925971 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/1ab04638-79a1-46f3-9e67-ba51ee0d12f7-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"1ab04638-79a1-46f3-9e67-ba51ee0d12f7\") " pod="openstack/nova-metadata-0" Mar 20 08:48:27 crc kubenswrapper[4903]: I0320 08:48:27.926054 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/3cdd4833-7200-46c0-9bb4-1b18c7828044-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"3cdd4833-7200-46c0-9bb4-1b18c7828044\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 08:48:27 crc kubenswrapper[4903]: I0320 08:48:27.926351 4903 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3cdd4833-7200-46c0-9bb4-1b18c7828044-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"3cdd4833-7200-46c0-9bb4-1b18c7828044\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 08:48:27 crc kubenswrapper[4903]: I0320 08:48:27.927666 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3cdd4833-7200-46c0-9bb4-1b18c7828044-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"3cdd4833-7200-46c0-9bb4-1b18c7828044\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 08:48:27 crc kubenswrapper[4903]: I0320 08:48:27.941393 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p2qzc\" (UniqueName: \"kubernetes.io/projected/3cdd4833-7200-46c0-9bb4-1b18c7828044-kube-api-access-p2qzc\") pod \"nova-cell1-novncproxy-0\" (UID: \"3cdd4833-7200-46c0-9bb4-1b18c7828044\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 08:48:27 crc kubenswrapper[4903]: I0320 08:48:27.942173 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ttzs8\" (UniqueName: \"kubernetes.io/projected/1ab04638-79a1-46f3-9e67-ba51ee0d12f7-kube-api-access-ttzs8\") pod \"nova-metadata-0\" (UID: \"1ab04638-79a1-46f3-9e67-ba51ee0d12f7\") " pod="openstack/nova-metadata-0" Mar 20 08:48:27 crc kubenswrapper[4903]: I0320 08:48:27.984974 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 20 08:48:27 crc kubenswrapper[4903]: I0320 08:48:27.993319 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 08:48:28 crc kubenswrapper[4903]: W0320 08:48:28.279815 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1ab04638_79a1_46f3_9e67_ba51ee0d12f7.slice/crio-da6942258e77637fc06615f76976803acb4a736bea98ee870e8a545e2e5cc449 WatchSource:0}: Error finding container da6942258e77637fc06615f76976803acb4a736bea98ee870e8a545e2e5cc449: Status 404 returned error can't find the container with id da6942258e77637fc06615f76976803acb4a736bea98ee870e8a545e2e5cc449 Mar 20 08:48:28 crc kubenswrapper[4903]: I0320 08:48:28.281946 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 08:48:28 crc kubenswrapper[4903]: I0320 08:48:28.538503 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 20 08:48:28 crc kubenswrapper[4903]: I0320 08:48:28.707855 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 20 08:48:28 crc kubenswrapper[4903]: I0320 08:48:28.707930 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 20 08:48:29 crc kubenswrapper[4903]: I0320 08:48:29.235646 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"3cdd4833-7200-46c0-9bb4-1b18c7828044","Type":"ContainerStarted","Data":"9e4b7aea89c42ee0018766efdc81e4728d5aebd15f02ee33fd42c1b099309077"} Mar 20 08:48:29 crc kubenswrapper[4903]: I0320 08:48:29.236000 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" 
event={"ID":"3cdd4833-7200-46c0-9bb4-1b18c7828044","Type":"ContainerStarted","Data":"a986c4bf7d083fcdad9e045170a88b4a1623b9e97068ea6e28387ae7d9354d6a"} Mar 20 08:48:29 crc kubenswrapper[4903]: I0320 08:48:29.239841 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"1ab04638-79a1-46f3-9e67-ba51ee0d12f7","Type":"ContainerStarted","Data":"14949e33b95ecdd2c2e91d7b0e4b10764e454f61d7ee8c8c552168f6a689d854"} Mar 20 08:48:29 crc kubenswrapper[4903]: I0320 08:48:29.239902 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"1ab04638-79a1-46f3-9e67-ba51ee0d12f7","Type":"ContainerStarted","Data":"893f4f037f00b8782f44d856f0e857221584b18084b413a1e6ec553fd201d280"} Mar 20 08:48:29 crc kubenswrapper[4903]: I0320 08:48:29.239921 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"1ab04638-79a1-46f3-9e67-ba51ee0d12f7","Type":"ContainerStarted","Data":"da6942258e77637fc06615f76976803acb4a736bea98ee870e8a545e2e5cc449"} Mar 20 08:48:29 crc kubenswrapper[4903]: I0320 08:48:29.274165 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.274127275 podStartE2EDuration="2.274127275s" podCreationTimestamp="2026-03-20 08:48:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:48:29.258546173 +0000 UTC m=+1534.475446508" watchObservedRunningTime="2026-03-20 08:48:29.274127275 +0000 UTC m=+1534.491027640" Mar 20 08:48:29 crc kubenswrapper[4903]: I0320 08:48:29.294491 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.294468174 podStartE2EDuration="2.294468174s" podCreationTimestamp="2026-03-20 08:48:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:48:29.287435251 +0000 UTC m=+1534.504335586" watchObservedRunningTime="2026-03-20 08:48:29.294468174 +0000 UTC m=+1534.511368499" Mar 20 08:48:29 crc kubenswrapper[4903]: I0320 08:48:29.504384 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6e5503e6-7030-40fa-b8f5-88476868ba0d" path="/var/lib/kubelet/pods/6e5503e6-7030-40fa-b8f5-88476868ba0d/volumes" Mar 20 08:48:29 crc kubenswrapper[4903]: I0320 08:48:29.505433 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fbeea61a-87b9-4a74-ab2b-78e4eceff3e4" path="/var/lib/kubelet/pods/fbeea61a-87b9-4a74-ab2b-78e4eceff3e4/volumes" Mar 20 08:48:30 crc kubenswrapper[4903]: I0320 08:48:30.717726 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 20 08:48:30 crc kubenswrapper[4903]: I0320 08:48:30.718431 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 20 08:48:30 crc kubenswrapper[4903]: I0320 08:48:30.723096 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 20 08:48:30 crc kubenswrapper[4903]: I0320 08:48:30.725857 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 20 08:48:30 crc kubenswrapper[4903]: I0320 08:48:30.969084 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-9bgcj"] Mar 20 08:48:30 crc kubenswrapper[4903]: I0320 
08:48:30.971204 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-89c5cd4d5-9bgcj" Mar 20 08:48:31 crc kubenswrapper[4903]: I0320 08:48:31.015226 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-9bgcj"] Mar 20 08:48:31 crc kubenswrapper[4903]: I0320 08:48:31.097070 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b286f9de-1973-4c7f-9350-4d3c31f9c1fb-ovsdbserver-sb\") pod \"dnsmasq-dns-89c5cd4d5-9bgcj\" (UID: \"b286f9de-1973-4c7f-9350-4d3c31f9c1fb\") " pod="openstack/dnsmasq-dns-89c5cd4d5-9bgcj" Mar 20 08:48:31 crc kubenswrapper[4903]: I0320 08:48:31.097129 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b286f9de-1973-4c7f-9350-4d3c31f9c1fb-dns-swift-storage-0\") pod \"dnsmasq-dns-89c5cd4d5-9bgcj\" (UID: \"b286f9de-1973-4c7f-9350-4d3c31f9c1fb\") " pod="openstack/dnsmasq-dns-89c5cd4d5-9bgcj" Mar 20 08:48:31 crc kubenswrapper[4903]: I0320 08:48:31.097159 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b286f9de-1973-4c7f-9350-4d3c31f9c1fb-ovsdbserver-nb\") pod \"dnsmasq-dns-89c5cd4d5-9bgcj\" (UID: \"b286f9de-1973-4c7f-9350-4d3c31f9c1fb\") " pod="openstack/dnsmasq-dns-89c5cd4d5-9bgcj" Mar 20 08:48:31 crc kubenswrapper[4903]: I0320 08:48:31.097254 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b286f9de-1973-4c7f-9350-4d3c31f9c1fb-dns-svc\") pod \"dnsmasq-dns-89c5cd4d5-9bgcj\" (UID: \"b286f9de-1973-4c7f-9350-4d3c31f9c1fb\") " pod="openstack/dnsmasq-dns-89c5cd4d5-9bgcj" Mar 20 08:48:31 crc kubenswrapper[4903]: I0320 08:48:31.097283 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b286f9de-1973-4c7f-9350-4d3c31f9c1fb-config\") pod \"dnsmasq-dns-89c5cd4d5-9bgcj\" (UID: \"b286f9de-1973-4c7f-9350-4d3c31f9c1fb\") " pod="openstack/dnsmasq-dns-89c5cd4d5-9bgcj" Mar 20 08:48:31 crc kubenswrapper[4903]: I0320 08:48:31.097316 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nj5pj\" (UniqueName: \"kubernetes.io/projected/b286f9de-1973-4c7f-9350-4d3c31f9c1fb-kube-api-access-nj5pj\") pod \"dnsmasq-dns-89c5cd4d5-9bgcj\" (UID: \"b286f9de-1973-4c7f-9350-4d3c31f9c1fb\") " pod="openstack/dnsmasq-dns-89c5cd4d5-9bgcj" Mar 20 08:48:31 crc kubenswrapper[4903]: I0320 08:48:31.198774 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b286f9de-1973-4c7f-9350-4d3c31f9c1fb-ovsdbserver-sb\") pod \"dnsmasq-dns-89c5cd4d5-9bgcj\" (UID: \"b286f9de-1973-4c7f-9350-4d3c31f9c1fb\") " pod="openstack/dnsmasq-dns-89c5cd4d5-9bgcj" Mar 20 08:48:31 crc kubenswrapper[4903]: I0320 08:48:31.198832 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b286f9de-1973-4c7f-9350-4d3c31f9c1fb-dns-swift-storage-0\") pod \"dnsmasq-dns-89c5cd4d5-9bgcj\" (UID: \"b286f9de-1973-4c7f-9350-4d3c31f9c1fb\") " pod="openstack/dnsmasq-dns-89c5cd4d5-9bgcj" Mar 20 08:48:31 crc kubenswrapper[4903]: 
I0320 08:48:31.198872 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b286f9de-1973-4c7f-9350-4d3c31f9c1fb-ovsdbserver-nb\") pod \"dnsmasq-dns-89c5cd4d5-9bgcj\" (UID: \"b286f9de-1973-4c7f-9350-4d3c31f9c1fb\") " pod="openstack/dnsmasq-dns-89c5cd4d5-9bgcj" Mar 20 08:48:31 crc kubenswrapper[4903]: I0320 08:48:31.198967 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b286f9de-1973-4c7f-9350-4d3c31f9c1fb-dns-svc\") pod \"dnsmasq-dns-89c5cd4d5-9bgcj\" (UID: \"b286f9de-1973-4c7f-9350-4d3c31f9c1fb\") " pod="openstack/dnsmasq-dns-89c5cd4d5-9bgcj" Mar 20 08:48:31 crc kubenswrapper[4903]: I0320 08:48:31.198992 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b286f9de-1973-4c7f-9350-4d3c31f9c1fb-config\") pod \"dnsmasq-dns-89c5cd4d5-9bgcj\" (UID: \"b286f9de-1973-4c7f-9350-4d3c31f9c1fb\") " pod="openstack/dnsmasq-dns-89c5cd4d5-9bgcj" Mar 20 08:48:31 crc kubenswrapper[4903]: I0320 08:48:31.199026 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nj5pj\" (UniqueName: \"kubernetes.io/projected/b286f9de-1973-4c7f-9350-4d3c31f9c1fb-kube-api-access-nj5pj\") pod \"dnsmasq-dns-89c5cd4d5-9bgcj\" (UID: \"b286f9de-1973-4c7f-9350-4d3c31f9c1fb\") " pod="openstack/dnsmasq-dns-89c5cd4d5-9bgcj" Mar 20 08:48:31 crc kubenswrapper[4903]: I0320 08:48:31.199723 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b286f9de-1973-4c7f-9350-4d3c31f9c1fb-ovsdbserver-sb\") pod \"dnsmasq-dns-89c5cd4d5-9bgcj\" (UID: \"b286f9de-1973-4c7f-9350-4d3c31f9c1fb\") " pod="openstack/dnsmasq-dns-89c5cd4d5-9bgcj" Mar 20 08:48:31 crc kubenswrapper[4903]: I0320 08:48:31.199933 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b286f9de-1973-4c7f-9350-4d3c31f9c1fb-dns-swift-storage-0\") pod \"dnsmasq-dns-89c5cd4d5-9bgcj\" (UID: \"b286f9de-1973-4c7f-9350-4d3c31f9c1fb\") " pod="openstack/dnsmasq-dns-89c5cd4d5-9bgcj" Mar 20 08:48:31 crc kubenswrapper[4903]: I0320 08:48:31.199989 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b286f9de-1973-4c7f-9350-4d3c31f9c1fb-ovsdbserver-nb\") pod \"dnsmasq-dns-89c5cd4d5-9bgcj\" (UID: \"b286f9de-1973-4c7f-9350-4d3c31f9c1fb\") " pod="openstack/dnsmasq-dns-89c5cd4d5-9bgcj" Mar 20 08:48:31 crc kubenswrapper[4903]: I0320 08:48:31.200393 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b286f9de-1973-4c7f-9350-4d3c31f9c1fb-dns-svc\") pod \"dnsmasq-dns-89c5cd4d5-9bgcj\" (UID: \"b286f9de-1973-4c7f-9350-4d3c31f9c1fb\") " pod="openstack/dnsmasq-dns-89c5cd4d5-9bgcj" Mar 20 08:48:31 crc kubenswrapper[4903]: I0320 08:48:31.200721 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b286f9de-1973-4c7f-9350-4d3c31f9c1fb-config\") pod \"dnsmasq-dns-89c5cd4d5-9bgcj\" (UID: \"b286f9de-1973-4c7f-9350-4d3c31f9c1fb\") " pod="openstack/dnsmasq-dns-89c5cd4d5-9bgcj" Mar 20 08:48:31 crc kubenswrapper[4903]: I0320 08:48:31.226739 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nj5pj\" (UniqueName: 
\"kubernetes.io/projected/b286f9de-1973-4c7f-9350-4d3c31f9c1fb-kube-api-access-nj5pj\") pod \"dnsmasq-dns-89c5cd4d5-9bgcj\" (UID: \"b286f9de-1973-4c7f-9350-4d3c31f9c1fb\") " pod="openstack/dnsmasq-dns-89c5cd4d5-9bgcj" Mar 20 08:48:31 crc kubenswrapper[4903]: I0320 08:48:31.316189 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-89c5cd4d5-9bgcj" Mar 20 08:48:31 crc kubenswrapper[4903]: I0320 08:48:31.780903 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-9bgcj"] Mar 20 08:48:32 crc kubenswrapper[4903]: I0320 08:48:32.301633 4903 generic.go:334] "Generic (PLEG): container finished" podID="b286f9de-1973-4c7f-9350-4d3c31f9c1fb" containerID="af0bf0783940c5c71ca13e53f3a973ac3692e87a8058eafa8c1eec0382459e15" exitCode=0 Mar 20 08:48:32 crc kubenswrapper[4903]: I0320 08:48:32.301722 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-9bgcj" event={"ID":"b286f9de-1973-4c7f-9350-4d3c31f9c1fb","Type":"ContainerDied","Data":"af0bf0783940c5c71ca13e53f3a973ac3692e87a8058eafa8c1eec0382459e15"} Mar 20 08:48:32 crc kubenswrapper[4903]: I0320 08:48:32.302289 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-9bgcj" event={"ID":"b286f9de-1973-4c7f-9350-4d3c31f9c1fb","Type":"ContainerStarted","Data":"61a78e5d734a556376d557849909a1b7509d6e1bd7fec610c76f6f64d0091c34"} Mar 20 08:48:32 crc kubenswrapper[4903]: I0320 08:48:32.985748 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Mar 20 08:48:33 crc kubenswrapper[4903]: I0320 08:48:33.148743 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 08:48:33 crc kubenswrapper[4903]: I0320 08:48:33.149471 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5328486f-b5ae-4da3-85f6-b70555303408" containerName="ceilometer-central-agent" containerID="cri-o://e4a638edf350fe1afb3397fdebaae0ebb3dfe9a1ee679101ef18e91611509275" gracePeriod=30 Mar 20 08:48:33 crc kubenswrapper[4903]: I0320 08:48:33.149519 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5328486f-b5ae-4da3-85f6-b70555303408" containerName="proxy-httpd" containerID="cri-o://fdc15e675640090ef1b1e1d826699d0ca27ed7e7f1c8b54cc3b5e18791c18d43" gracePeriod=30 Mar 20 08:48:33 crc kubenswrapper[4903]: I0320 08:48:33.149613 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5328486f-b5ae-4da3-85f6-b70555303408" containerName="ceilometer-notification-agent" containerID="cri-o://9370ac6e5ebdc5f0f3527fc8efc56386d4c6b5f76e0743a0ed27251b010b544e" gracePeriod=30 Mar 20 08:48:33 crc kubenswrapper[4903]: I0320 08:48:33.149509 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5328486f-b5ae-4da3-85f6-b70555303408" containerName="sg-core" containerID="cri-o://3689fc59ace57fbe9423679424b534f88df2ca91f53220cb1de5ed5e1229be65" gracePeriod=30 Mar 20 08:48:33 crc kubenswrapper[4903]: I0320 08:48:33.315763 4903 generic.go:334] "Generic (PLEG): container finished" podID="5328486f-b5ae-4da3-85f6-b70555303408" containerID="3689fc59ace57fbe9423679424b534f88df2ca91f53220cb1de5ed5e1229be65" exitCode=2 Mar 20 08:48:33 crc kubenswrapper[4903]: I0320 08:48:33.315847 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"5328486f-b5ae-4da3-85f6-b70555303408","Type":"ContainerDied","Data":"3689fc59ace57fbe9423679424b534f88df2ca91f53220cb1de5ed5e1229be65"} Mar 20 08:48:33 crc kubenswrapper[4903]: I0320 08:48:33.318558 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-9bgcj" event={"ID":"b286f9de-1973-4c7f-9350-4d3c31f9c1fb","Type":"ContainerStarted","Data":"1c1a4fb611dcbe8e6e5c2845b3c1260632a46a79e55826154d5b3126dc0b4a4e"} Mar 20 08:48:33 crc kubenswrapper[4903]: I0320 08:48:33.318776 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-89c5cd4d5-9bgcj" Mar 20 08:48:33 crc kubenswrapper[4903]: I0320 08:48:33.344811 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-89c5cd4d5-9bgcj" podStartSLOduration=3.344791758 podStartE2EDuration="3.344791758s" podCreationTimestamp="2026-03-20 08:48:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:48:33.336405143 +0000 UTC m=+1538.553305488" watchObservedRunningTime="2026-03-20 08:48:33.344791758 +0000 UTC m=+1538.561692083" Mar 20 08:48:34 crc kubenswrapper[4903]: I0320 08:48:34.137014 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 20 08:48:34 crc kubenswrapper[4903]: I0320 08:48:34.137607 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="a1539d60-53fa-4562-a356-83060d4f6bd7" containerName="nova-api-log" containerID="cri-o://3164db999ab3f287862cebc8b53b9d8e2b543bbbb115a647ca01e915360191da" gracePeriod=30 Mar 20 08:48:34 crc kubenswrapper[4903]: I0320 08:48:34.137701 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="a1539d60-53fa-4562-a356-83060d4f6bd7" containerName="nova-api-api" containerID="cri-o://4bd143470e9e9d2aafb3e8563bf4776f7153e511f05b78f431490e0e5a7f2037" gracePeriod=30 Mar 20 08:48:34 crc kubenswrapper[4903]: I0320 08:48:34.188283 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 08:48:34 crc kubenswrapper[4903]: I0320 08:48:34.280578 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/5328486f-b5ae-4da3-85f6-b70555303408-ceilometer-tls-certs\") pod \"5328486f-b5ae-4da3-85f6-b70555303408\" (UID: \"5328486f-b5ae-4da3-85f6-b70555303408\") " Mar 20 08:48:34 crc kubenswrapper[4903]: I0320 08:48:34.280655 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5328486f-b5ae-4da3-85f6-b70555303408-combined-ca-bundle\") pod \"5328486f-b5ae-4da3-85f6-b70555303408\" (UID: \"5328486f-b5ae-4da3-85f6-b70555303408\") " Mar 20 08:48:34 crc kubenswrapper[4903]: I0320 08:48:34.280721 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wwcdc\" (UniqueName: \"kubernetes.io/projected/5328486f-b5ae-4da3-85f6-b70555303408-kube-api-access-wwcdc\") pod \"5328486f-b5ae-4da3-85f6-b70555303408\" (UID: \"5328486f-b5ae-4da3-85f6-b70555303408\") " Mar 20 08:48:34 crc kubenswrapper[4903]: I0320 08:48:34.280766 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5328486f-b5ae-4da3-85f6-b70555303408-sg-core-conf-yaml\") pod \"5328486f-b5ae-4da3-85f6-b70555303408\" (UID: \"5328486f-b5ae-4da3-85f6-b70555303408\") " Mar 20 08:48:34 crc kubenswrapper[4903]: I0320 08:48:34.280796 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5328486f-b5ae-4da3-85f6-b70555303408-config-data\") pod \"5328486f-b5ae-4da3-85f6-b70555303408\" (UID: \"5328486f-b5ae-4da3-85f6-b70555303408\") " Mar 20 08:48:34 crc kubenswrapper[4903]: I0320 08:48:34.280829 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5328486f-b5ae-4da3-85f6-b70555303408-log-httpd\") pod \"5328486f-b5ae-4da3-85f6-b70555303408\" (UID: \"5328486f-b5ae-4da3-85f6-b70555303408\") " Mar 20 08:48:34 crc kubenswrapper[4903]: I0320 08:48:34.280886 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5328486f-b5ae-4da3-85f6-b70555303408-scripts\") pod \"5328486f-b5ae-4da3-85f6-b70555303408\" (UID: \"5328486f-b5ae-4da3-85f6-b70555303408\") " Mar 20 08:48:34 crc kubenswrapper[4903]: I0320 08:48:34.281048 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5328486f-b5ae-4da3-85f6-b70555303408-run-httpd\") pod \"5328486f-b5ae-4da3-85f6-b70555303408\" (UID: \"5328486f-b5ae-4da3-85f6-b70555303408\") " Mar 20 08:48:34 crc kubenswrapper[4903]: I0320 08:48:34.281747 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5328486f-b5ae-4da3-85f6-b70555303408-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "5328486f-b5ae-4da3-85f6-b70555303408" (UID: "5328486f-b5ae-4da3-85f6-b70555303408"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 08:48:34 crc kubenswrapper[4903]: I0320 08:48:34.290372 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5328486f-b5ae-4da3-85f6-b70555303408-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "5328486f-b5ae-4da3-85f6-b70555303408" (UID: "5328486f-b5ae-4da3-85f6-b70555303408"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 08:48:34 crc kubenswrapper[4903]: I0320 08:48:34.313237 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5328486f-b5ae-4da3-85f6-b70555303408-kube-api-access-wwcdc" (OuterVolumeSpecName: "kube-api-access-wwcdc") pod "5328486f-b5ae-4da3-85f6-b70555303408" (UID: "5328486f-b5ae-4da3-85f6-b70555303408"). InnerVolumeSpecName "kube-api-access-wwcdc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:48:34 crc kubenswrapper[4903]: I0320 08:48:34.345078 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5328486f-b5ae-4da3-85f6-b70555303408-scripts" (OuterVolumeSpecName: "scripts") pod "5328486f-b5ae-4da3-85f6-b70555303408" (UID: "5328486f-b5ae-4da3-85f6-b70555303408"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:48:34 crc kubenswrapper[4903]: I0320 08:48:34.358018 4903 generic.go:334] "Generic (PLEG): container finished" podID="a1539d60-53fa-4562-a356-83060d4f6bd7" containerID="3164db999ab3f287862cebc8b53b9d8e2b543bbbb115a647ca01e915360191da" exitCode=143 Mar 20 08:48:34 crc kubenswrapper[4903]: I0320 08:48:34.358090 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a1539d60-53fa-4562-a356-83060d4f6bd7","Type":"ContainerDied","Data":"3164db999ab3f287862cebc8b53b9d8e2b543bbbb115a647ca01e915360191da"} Mar 20 08:48:34 crc kubenswrapper[4903]: I0320 08:48:34.362442 4903 generic.go:334] "Generic (PLEG): container finished" podID="5328486f-b5ae-4da3-85f6-b70555303408" containerID="fdc15e675640090ef1b1e1d826699d0ca27ed7e7f1c8b54cc3b5e18791c18d43" exitCode=0 Mar 20 08:48:34 crc kubenswrapper[4903]: I0320 08:48:34.362493 4903 generic.go:334] "Generic (PLEG): container finished" podID="5328486f-b5ae-4da3-85f6-b70555303408" containerID="9370ac6e5ebdc5f0f3527fc8efc56386d4c6b5f76e0743a0ed27251b010b544e" exitCode=0 Mar 20 08:48:34 crc kubenswrapper[4903]: I0320 08:48:34.362508 4903 generic.go:334] "Generic (PLEG): container finished" podID="5328486f-b5ae-4da3-85f6-b70555303408" containerID="e4a638edf350fe1afb3397fdebaae0ebb3dfe9a1ee679101ef18e91611509275" exitCode=0 Mar 20 08:48:34 crc kubenswrapper[4903]: I0320 08:48:34.363645 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 08:48:34 crc kubenswrapper[4903]: I0320 08:48:34.363624 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5328486f-b5ae-4da3-85f6-b70555303408","Type":"ContainerDied","Data":"fdc15e675640090ef1b1e1d826699d0ca27ed7e7f1c8b54cc3b5e18791c18d43"} Mar 20 08:48:34 crc kubenswrapper[4903]: I0320 08:48:34.363868 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5328486f-b5ae-4da3-85f6-b70555303408","Type":"ContainerDied","Data":"9370ac6e5ebdc5f0f3527fc8efc56386d4c6b5f76e0743a0ed27251b010b544e"} Mar 20 08:48:34 crc kubenswrapper[4903]: I0320 08:48:34.363904 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5328486f-b5ae-4da3-85f6-b70555303408","Type":"ContainerDied","Data":"e4a638edf350fe1afb3397fdebaae0ebb3dfe9a1ee679101ef18e91611509275"} Mar 20 08:48:34 crc kubenswrapper[4903]: I0320 08:48:34.363915 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5328486f-b5ae-4da3-85f6-b70555303408","Type":"ContainerDied","Data":"b6f231603f05e477332c08bb45403cdf0cc95348973c80d198c9af0e061fd9cd"} Mar 20 08:48:34 crc kubenswrapper[4903]: I0320 08:48:34.363938 4903 scope.go:117] "RemoveContainer" containerID="fdc15e675640090ef1b1e1d826699d0ca27ed7e7f1c8b54cc3b5e18791c18d43" Mar 20 08:48:34 crc kubenswrapper[4903]: I0320 08:48:34.383212 4903 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5328486f-b5ae-4da3-85f6-b70555303408-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 08:48:34 crc kubenswrapper[4903]: I0320 08:48:34.383248 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wwcdc\" (UniqueName: \"kubernetes.io/projected/5328486f-b5ae-4da3-85f6-b70555303408-kube-api-access-wwcdc\") on node \"crc\" DevicePath \"\"" Mar 20 08:48:34 crc kubenswrapper[4903]: I0320 08:48:34.383257 4903 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5328486f-b5ae-4da3-85f6-b70555303408-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 08:48:34 crc kubenswrapper[4903]: I0320 08:48:34.383265 4903 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5328486f-b5ae-4da3-85f6-b70555303408-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 08:48:34 crc kubenswrapper[4903]: I0320 08:48:34.393946 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5328486f-b5ae-4da3-85f6-b70555303408-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "5328486f-b5ae-4da3-85f6-b70555303408" (UID: "5328486f-b5ae-4da3-85f6-b70555303408"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:48:34 crc kubenswrapper[4903]: I0320 08:48:34.417384 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5328486f-b5ae-4da3-85f6-b70555303408-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "5328486f-b5ae-4da3-85f6-b70555303408" (UID: "5328486f-b5ae-4da3-85f6-b70555303408"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:48:34 crc kubenswrapper[4903]: I0320 08:48:34.485941 4903 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5328486f-b5ae-4da3-85f6-b70555303408-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 20 08:48:34 crc kubenswrapper[4903]: I0320 08:48:34.485976 4903 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/5328486f-b5ae-4da3-85f6-b70555303408-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 08:48:34 crc kubenswrapper[4903]: I0320 08:48:34.484793 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5328486f-b5ae-4da3-85f6-b70555303408-config-data" (OuterVolumeSpecName: "config-data") pod "5328486f-b5ae-4da3-85f6-b70555303408" (UID: "5328486f-b5ae-4da3-85f6-b70555303408"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:48:34 crc kubenswrapper[4903]: I0320 08:48:34.502660 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5328486f-b5ae-4da3-85f6-b70555303408-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5328486f-b5ae-4da3-85f6-b70555303408" (UID: "5328486f-b5ae-4da3-85f6-b70555303408"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:48:34 crc kubenswrapper[4903]: I0320 08:48:34.526987 4903 scope.go:117] "RemoveContainer" containerID="3689fc59ace57fbe9423679424b534f88df2ca91f53220cb1de5ed5e1229be65" Mar 20 08:48:34 crc kubenswrapper[4903]: I0320 08:48:34.545798 4903 scope.go:117] "RemoveContainer" containerID="9370ac6e5ebdc5f0f3527fc8efc56386d4c6b5f76e0743a0ed27251b010b544e" Mar 20 08:48:34 crc kubenswrapper[4903]: I0320 08:48:34.569639 4903 scope.go:117] "RemoveContainer" containerID="e4a638edf350fe1afb3397fdebaae0ebb3dfe9a1ee679101ef18e91611509275" Mar 20 08:48:34 crc kubenswrapper[4903]: I0320 08:48:34.588062 4903 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5328486f-b5ae-4da3-85f6-b70555303408-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 08:48:34 crc kubenswrapper[4903]: I0320 08:48:34.588094 4903 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5328486f-b5ae-4da3-85f6-b70555303408-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 08:48:34 crc kubenswrapper[4903]: I0320 08:48:34.590097 4903 scope.go:117] "RemoveContainer" containerID="fdc15e675640090ef1b1e1d826699d0ca27ed7e7f1c8b54cc3b5e18791c18d43" Mar 20 08:48:34 crc kubenswrapper[4903]: E0320 08:48:34.590576 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fdc15e675640090ef1b1e1d826699d0ca27ed7e7f1c8b54cc3b5e18791c18d43\": container with ID starting with fdc15e675640090ef1b1e1d826699d0ca27ed7e7f1c8b54cc3b5e18791c18d43 not found: ID does not exist" containerID="fdc15e675640090ef1b1e1d826699d0ca27ed7e7f1c8b54cc3b5e18791c18d43" Mar 20 08:48:34 crc kubenswrapper[4903]: I0320 08:48:34.590615 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fdc15e675640090ef1b1e1d826699d0ca27ed7e7f1c8b54cc3b5e18791c18d43"} err="failed to get container status \"fdc15e675640090ef1b1e1d826699d0ca27ed7e7f1c8b54cc3b5e18791c18d43\": rpc error: code = NotFound 
desc = could not find container \"fdc15e675640090ef1b1e1d826699d0ca27ed7e7f1c8b54cc3b5e18791c18d43\": container with ID starting with fdc15e675640090ef1b1e1d826699d0ca27ed7e7f1c8b54cc3b5e18791c18d43 not found: ID does not exist" Mar 20 08:48:34 crc kubenswrapper[4903]: I0320 08:48:34.590636 4903 scope.go:117] "RemoveContainer" containerID="3689fc59ace57fbe9423679424b534f88df2ca91f53220cb1de5ed5e1229be65" Mar 20 08:48:34 crc kubenswrapper[4903]: E0320 08:48:34.590911 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3689fc59ace57fbe9423679424b534f88df2ca91f53220cb1de5ed5e1229be65\": container with ID starting with 3689fc59ace57fbe9423679424b534f88df2ca91f53220cb1de5ed5e1229be65 not found: ID does not exist" containerID="3689fc59ace57fbe9423679424b534f88df2ca91f53220cb1de5ed5e1229be65" Mar 20 08:48:34 crc kubenswrapper[4903]: I0320 08:48:34.590958 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3689fc59ace57fbe9423679424b534f88df2ca91f53220cb1de5ed5e1229be65"} err="failed to get container status \"3689fc59ace57fbe9423679424b534f88df2ca91f53220cb1de5ed5e1229be65\": rpc error: code = NotFound desc = could not find container \"3689fc59ace57fbe9423679424b534f88df2ca91f53220cb1de5ed5e1229be65\": container with ID starting with 3689fc59ace57fbe9423679424b534f88df2ca91f53220cb1de5ed5e1229be65 not found: ID does not exist" Mar 20 08:48:34 crc kubenswrapper[4903]: I0320 08:48:34.590982 4903 scope.go:117] "RemoveContainer" containerID="9370ac6e5ebdc5f0f3527fc8efc56386d4c6b5f76e0743a0ed27251b010b544e" Mar 20 08:48:34 crc kubenswrapper[4903]: E0320 08:48:34.592574 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9370ac6e5ebdc5f0f3527fc8efc56386d4c6b5f76e0743a0ed27251b010b544e\": container with ID starting with 9370ac6e5ebdc5f0f3527fc8efc56386d4c6b5f76e0743a0ed27251b010b544e not found: ID does not exist" containerID="9370ac6e5ebdc5f0f3527fc8efc56386d4c6b5f76e0743a0ed27251b010b544e" Mar 20 08:48:34 crc kubenswrapper[4903]: I0320 08:48:34.592621 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9370ac6e5ebdc5f0f3527fc8efc56386d4c6b5f76e0743a0ed27251b010b544e"} err="failed to get container status \"9370ac6e5ebdc5f0f3527fc8efc56386d4c6b5f76e0743a0ed27251b010b544e\": rpc error: code = NotFound desc = could not find container \"9370ac6e5ebdc5f0f3527fc8efc56386d4c6b5f76e0743a0ed27251b010b544e\": container with ID starting with 9370ac6e5ebdc5f0f3527fc8efc56386d4c6b5f76e0743a0ed27251b010b544e not found: ID does not exist" Mar 20 08:48:34 crc kubenswrapper[4903]: I0320 08:48:34.592658 4903 scope.go:117] "RemoveContainer" containerID="e4a638edf350fe1afb3397fdebaae0ebb3dfe9a1ee679101ef18e91611509275" Mar 20 08:48:34 crc kubenswrapper[4903]: E0320 08:48:34.592977 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e4a638edf350fe1afb3397fdebaae0ebb3dfe9a1ee679101ef18e91611509275\": container with ID starting with e4a638edf350fe1afb3397fdebaae0ebb3dfe9a1ee679101ef18e91611509275 not found: ID does not exist" containerID="e4a638edf350fe1afb3397fdebaae0ebb3dfe9a1ee679101ef18e91611509275" Mar 20 08:48:34 crc kubenswrapper[4903]: I0320 08:48:34.593007 4903 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"e4a638edf350fe1afb3397fdebaae0ebb3dfe9a1ee679101ef18e91611509275"} err="failed to get container status \"e4a638edf350fe1afb3397fdebaae0ebb3dfe9a1ee679101ef18e91611509275\": rpc error: code = NotFound desc = could not find container \"e4a638edf350fe1afb3397fdebaae0ebb3dfe9a1ee679101ef18e91611509275\": container with ID starting with e4a638edf350fe1afb3397fdebaae0ebb3dfe9a1ee679101ef18e91611509275 not found: ID does not exist" Mar 20 08:48:34 crc kubenswrapper[4903]: I0320 08:48:34.593040 4903 scope.go:117] "RemoveContainer" containerID="fdc15e675640090ef1b1e1d826699d0ca27ed7e7f1c8b54cc3b5e18791c18d43" Mar 20 08:48:34 crc kubenswrapper[4903]: I0320 08:48:34.593259 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fdc15e675640090ef1b1e1d826699d0ca27ed7e7f1c8b54cc3b5e18791c18d43"} err="failed to get container status \"fdc15e675640090ef1b1e1d826699d0ca27ed7e7f1c8b54cc3b5e18791c18d43\": rpc error: code = NotFound desc = could not find container \"fdc15e675640090ef1b1e1d826699d0ca27ed7e7f1c8b54cc3b5e18791c18d43\": container with ID starting with fdc15e675640090ef1b1e1d826699d0ca27ed7e7f1c8b54cc3b5e18791c18d43 not found: ID does not exist" Mar 20 08:48:34 crc kubenswrapper[4903]: I0320 08:48:34.593285 4903 scope.go:117] "RemoveContainer" containerID="3689fc59ace57fbe9423679424b534f88df2ca91f53220cb1de5ed5e1229be65" Mar 20 08:48:34 crc kubenswrapper[4903]: I0320 08:48:34.593464 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3689fc59ace57fbe9423679424b534f88df2ca91f53220cb1de5ed5e1229be65"} err="failed to get container status \"3689fc59ace57fbe9423679424b534f88df2ca91f53220cb1de5ed5e1229be65\": rpc error: code = NotFound desc = could not find container \"3689fc59ace57fbe9423679424b534f88df2ca91f53220cb1de5ed5e1229be65\": container with ID starting with 3689fc59ace57fbe9423679424b534f88df2ca91f53220cb1de5ed5e1229be65 not found: ID does not exist" Mar 20 08:48:34 crc kubenswrapper[4903]: I0320 08:48:34.593489 4903 scope.go:117] "RemoveContainer" containerID="9370ac6e5ebdc5f0f3527fc8efc56386d4c6b5f76e0743a0ed27251b010b544e" Mar 20 08:48:34 crc kubenswrapper[4903]: I0320 08:48:34.593779 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9370ac6e5ebdc5f0f3527fc8efc56386d4c6b5f76e0743a0ed27251b010b544e"} err="failed to get container status \"9370ac6e5ebdc5f0f3527fc8efc56386d4c6b5f76e0743a0ed27251b010b544e\": rpc error: code = NotFound desc = could not find container \"9370ac6e5ebdc5f0f3527fc8efc56386d4c6b5f76e0743a0ed27251b010b544e\": container with ID starting with 9370ac6e5ebdc5f0f3527fc8efc56386d4c6b5f76e0743a0ed27251b010b544e not found: ID does not exist" Mar 20 08:48:34 crc kubenswrapper[4903]: I0320 08:48:34.593835 4903 scope.go:117] "RemoveContainer" containerID="e4a638edf350fe1afb3397fdebaae0ebb3dfe9a1ee679101ef18e91611509275" Mar 20 08:48:34 crc kubenswrapper[4903]: I0320 08:48:34.594594 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e4a638edf350fe1afb3397fdebaae0ebb3dfe9a1ee679101ef18e91611509275"} err="failed to get container status \"e4a638edf350fe1afb3397fdebaae0ebb3dfe9a1ee679101ef18e91611509275\": rpc error: code = NotFound desc = could not find container \"e4a638edf350fe1afb3397fdebaae0ebb3dfe9a1ee679101ef18e91611509275\": container with ID starting with e4a638edf350fe1afb3397fdebaae0ebb3dfe9a1ee679101ef18e91611509275 not found: ID does not exist" Mar 
20 08:48:34 crc kubenswrapper[4903]: I0320 08:48:34.594620 4903 scope.go:117] "RemoveContainer" containerID="fdc15e675640090ef1b1e1d826699d0ca27ed7e7f1c8b54cc3b5e18791c18d43" Mar 20 08:48:34 crc kubenswrapper[4903]: I0320 08:48:34.594879 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fdc15e675640090ef1b1e1d826699d0ca27ed7e7f1c8b54cc3b5e18791c18d43"} err="failed to get container status \"fdc15e675640090ef1b1e1d826699d0ca27ed7e7f1c8b54cc3b5e18791c18d43\": rpc error: code = NotFound desc = could not find container \"fdc15e675640090ef1b1e1d826699d0ca27ed7e7f1c8b54cc3b5e18791c18d43\": container with ID starting with fdc15e675640090ef1b1e1d826699d0ca27ed7e7f1c8b54cc3b5e18791c18d43 not found: ID does not exist" Mar 20 08:48:34 crc kubenswrapper[4903]: I0320 08:48:34.594914 4903 scope.go:117] "RemoveContainer" containerID="3689fc59ace57fbe9423679424b534f88df2ca91f53220cb1de5ed5e1229be65" Mar 20 08:48:34 crc kubenswrapper[4903]: I0320 08:48:34.595289 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3689fc59ace57fbe9423679424b534f88df2ca91f53220cb1de5ed5e1229be65"} err="failed to get container status \"3689fc59ace57fbe9423679424b534f88df2ca91f53220cb1de5ed5e1229be65\": rpc error: code = NotFound desc = could not find container \"3689fc59ace57fbe9423679424b534f88df2ca91f53220cb1de5ed5e1229be65\": container with ID starting with 3689fc59ace57fbe9423679424b534f88df2ca91f53220cb1de5ed5e1229be65 not found: ID does not exist" Mar 20 08:48:34 crc kubenswrapper[4903]: I0320 08:48:34.595320 4903 scope.go:117] "RemoveContainer" containerID="9370ac6e5ebdc5f0f3527fc8efc56386d4c6b5f76e0743a0ed27251b010b544e" Mar 20 08:48:34 crc kubenswrapper[4903]: I0320 08:48:34.595568 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9370ac6e5ebdc5f0f3527fc8efc56386d4c6b5f76e0743a0ed27251b010b544e"} err="failed to get container status \"9370ac6e5ebdc5f0f3527fc8efc56386d4c6b5f76e0743a0ed27251b010b544e\": rpc error: code = NotFound desc = could not find container \"9370ac6e5ebdc5f0f3527fc8efc56386d4c6b5f76e0743a0ed27251b010b544e\": container with ID starting with 9370ac6e5ebdc5f0f3527fc8efc56386d4c6b5f76e0743a0ed27251b010b544e not found: ID does not exist" Mar 20 08:48:34 crc kubenswrapper[4903]: I0320 08:48:34.595591 4903 scope.go:117] "RemoveContainer" containerID="e4a638edf350fe1afb3397fdebaae0ebb3dfe9a1ee679101ef18e91611509275" Mar 20 08:48:34 crc kubenswrapper[4903]: I0320 08:48:34.595795 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e4a638edf350fe1afb3397fdebaae0ebb3dfe9a1ee679101ef18e91611509275"} err="failed to get container status \"e4a638edf350fe1afb3397fdebaae0ebb3dfe9a1ee679101ef18e91611509275\": rpc error: code = NotFound desc = could not find container \"e4a638edf350fe1afb3397fdebaae0ebb3dfe9a1ee679101ef18e91611509275\": container with ID starting with e4a638edf350fe1afb3397fdebaae0ebb3dfe9a1ee679101ef18e91611509275 not found: ID does not exist" Mar 20 08:48:34 crc kubenswrapper[4903]: I0320 08:48:34.701130 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 08:48:34 crc kubenswrapper[4903]: I0320 08:48:34.706717 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 20 08:48:34 crc kubenswrapper[4903]: I0320 08:48:34.734153 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 20 08:48:34 crc 
kubenswrapper[4903]: E0320 08:48:34.734570 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5328486f-b5ae-4da3-85f6-b70555303408" containerName="proxy-httpd" Mar 20 08:48:34 crc kubenswrapper[4903]: I0320 08:48:34.734588 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="5328486f-b5ae-4da3-85f6-b70555303408" containerName="proxy-httpd" Mar 20 08:48:34 crc kubenswrapper[4903]: E0320 08:48:34.734598 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5328486f-b5ae-4da3-85f6-b70555303408" containerName="ceilometer-central-agent" Mar 20 08:48:34 crc kubenswrapper[4903]: I0320 08:48:34.734606 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="5328486f-b5ae-4da3-85f6-b70555303408" containerName="ceilometer-central-agent" Mar 20 08:48:34 crc kubenswrapper[4903]: E0320 08:48:34.734626 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5328486f-b5ae-4da3-85f6-b70555303408" containerName="sg-core" Mar 20 08:48:34 crc kubenswrapper[4903]: I0320 08:48:34.734632 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="5328486f-b5ae-4da3-85f6-b70555303408" containerName="sg-core" Mar 20 08:48:34 crc kubenswrapper[4903]: E0320 08:48:34.734653 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5328486f-b5ae-4da3-85f6-b70555303408" containerName="ceilometer-notification-agent" Mar 20 08:48:34 crc kubenswrapper[4903]: I0320 08:48:34.734660 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="5328486f-b5ae-4da3-85f6-b70555303408" containerName="ceilometer-notification-agent" Mar 20 08:48:34 crc kubenswrapper[4903]: I0320 08:48:34.734843 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="5328486f-b5ae-4da3-85f6-b70555303408" containerName="ceilometer-notification-agent" Mar 20 08:48:34 crc kubenswrapper[4903]: I0320 08:48:34.734867 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="5328486f-b5ae-4da3-85f6-b70555303408" containerName="sg-core" Mar 20 08:48:34 crc kubenswrapper[4903]: I0320 08:48:34.734878 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="5328486f-b5ae-4da3-85f6-b70555303408" containerName="ceilometer-central-agent" Mar 20 08:48:34 crc kubenswrapper[4903]: I0320 08:48:34.734891 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="5328486f-b5ae-4da3-85f6-b70555303408" containerName="proxy-httpd" Mar 20 08:48:34 crc kubenswrapper[4903]: I0320 08:48:34.737756 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 08:48:34 crc kubenswrapper[4903]: I0320 08:48:34.743121 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 20 08:48:34 crc kubenswrapper[4903]: I0320 08:48:34.743418 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 20 08:48:34 crc kubenswrapper[4903]: I0320 08:48:34.744634 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 20 08:48:34 crc kubenswrapper[4903]: I0320 08:48:34.753798 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 08:48:34 crc kubenswrapper[4903]: I0320 08:48:34.896174 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5dc309ae-dfeb-4e11-8e1d-38092f5cd35c-run-httpd\") pod \"ceilometer-0\" (UID: \"5dc309ae-dfeb-4e11-8e1d-38092f5cd35c\") " pod="openstack/ceilometer-0" Mar 20 08:48:34 crc kubenswrapper[4903]: I0320 08:48:34.896299 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nl88g\" (UniqueName: \"kubernetes.io/projected/5dc309ae-dfeb-4e11-8e1d-38092f5cd35c-kube-api-access-nl88g\") pod \"ceilometer-0\" (UID: \"5dc309ae-dfeb-4e11-8e1d-38092f5cd35c\") " pod="openstack/ceilometer-0" Mar 20 08:48:34 crc kubenswrapper[4903]: I0320 08:48:34.896332 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/5dc309ae-dfeb-4e11-8e1d-38092f5cd35c-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"5dc309ae-dfeb-4e11-8e1d-38092f5cd35c\") " pod="openstack/ceilometer-0" Mar 20 08:48:34 crc kubenswrapper[4903]: I0320 08:48:34.896375 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5dc309ae-dfeb-4e11-8e1d-38092f5cd35c-config-data\") pod \"ceilometer-0\" (UID: \"5dc309ae-dfeb-4e11-8e1d-38092f5cd35c\") " pod="openstack/ceilometer-0" Mar 20 08:48:34 crc kubenswrapper[4903]: I0320 08:48:34.896392 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5dc309ae-dfeb-4e11-8e1d-38092f5cd35c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5dc309ae-dfeb-4e11-8e1d-38092f5cd35c\") " pod="openstack/ceilometer-0" Mar 20 08:48:34 crc kubenswrapper[4903]: I0320 08:48:34.896413 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5dc309ae-dfeb-4e11-8e1d-38092f5cd35c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5dc309ae-dfeb-4e11-8e1d-38092f5cd35c\") " pod="openstack/ceilometer-0" Mar 20 08:48:34 crc kubenswrapper[4903]: I0320 08:48:34.896540 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5dc309ae-dfeb-4e11-8e1d-38092f5cd35c-scripts\") pod \"ceilometer-0\" (UID: \"5dc309ae-dfeb-4e11-8e1d-38092f5cd35c\") " pod="openstack/ceilometer-0" Mar 20 08:48:34 crc kubenswrapper[4903]: I0320 08:48:34.896600 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/5dc309ae-dfeb-4e11-8e1d-38092f5cd35c-log-httpd\") pod \"ceilometer-0\" (UID: \"5dc309ae-dfeb-4e11-8e1d-38092f5cd35c\") " pod="openstack/ceilometer-0" Mar 20 08:48:34 crc kubenswrapper[4903]: I0320 08:48:34.998135 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nl88g\" (UniqueName: \"kubernetes.io/projected/5dc309ae-dfeb-4e11-8e1d-38092f5cd35c-kube-api-access-nl88g\") pod \"ceilometer-0\" (UID: \"5dc309ae-dfeb-4e11-8e1d-38092f5cd35c\") " pod="openstack/ceilometer-0" Mar 20 08:48:34 crc kubenswrapper[4903]: I0320 08:48:34.998209 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/5dc309ae-dfeb-4e11-8e1d-38092f5cd35c-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"5dc309ae-dfeb-4e11-8e1d-38092f5cd35c\") " pod="openstack/ceilometer-0" Mar 20 08:48:34 crc kubenswrapper[4903]: I0320 08:48:34.998237 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5dc309ae-dfeb-4e11-8e1d-38092f5cd35c-config-data\") pod \"ceilometer-0\" (UID: \"5dc309ae-dfeb-4e11-8e1d-38092f5cd35c\") " pod="openstack/ceilometer-0" Mar 20 08:48:34 crc kubenswrapper[4903]: I0320 08:48:34.998256 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5dc309ae-dfeb-4e11-8e1d-38092f5cd35c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5dc309ae-dfeb-4e11-8e1d-38092f5cd35c\") " pod="openstack/ceilometer-0" Mar 20 08:48:34 crc kubenswrapper[4903]: I0320 08:48:34.998278 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5dc309ae-dfeb-4e11-8e1d-38092f5cd35c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5dc309ae-dfeb-4e11-8e1d-38092f5cd35c\") " pod="openstack/ceilometer-0" Mar 20 08:48:34 crc kubenswrapper[4903]: I0320 08:48:34.998366 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5dc309ae-dfeb-4e11-8e1d-38092f5cd35c-scripts\") pod \"ceilometer-0\" (UID: \"5dc309ae-dfeb-4e11-8e1d-38092f5cd35c\") " pod="openstack/ceilometer-0" Mar 20 08:48:34 crc kubenswrapper[4903]: I0320 08:48:34.998408 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5dc309ae-dfeb-4e11-8e1d-38092f5cd35c-log-httpd\") pod \"ceilometer-0\" (UID: \"5dc309ae-dfeb-4e11-8e1d-38092f5cd35c\") " pod="openstack/ceilometer-0" Mar 20 08:48:34 crc kubenswrapper[4903]: I0320 08:48:34.998426 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5dc309ae-dfeb-4e11-8e1d-38092f5cd35c-run-httpd\") pod \"ceilometer-0\" (UID: \"5dc309ae-dfeb-4e11-8e1d-38092f5cd35c\") " pod="openstack/ceilometer-0" Mar 20 08:48:34 crc kubenswrapper[4903]: I0320 08:48:34.999015 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5dc309ae-dfeb-4e11-8e1d-38092f5cd35c-run-httpd\") pod \"ceilometer-0\" (UID: \"5dc309ae-dfeb-4e11-8e1d-38092f5cd35c\") " pod="openstack/ceilometer-0" Mar 20 08:48:34 crc kubenswrapper[4903]: I0320 08:48:34.999282 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/5dc309ae-dfeb-4e11-8e1d-38092f5cd35c-log-httpd\") pod \"ceilometer-0\" (UID: \"5dc309ae-dfeb-4e11-8e1d-38092f5cd35c\") " pod="openstack/ceilometer-0" Mar 20 08:48:35 crc kubenswrapper[4903]: I0320 08:48:35.002362 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/5dc309ae-dfeb-4e11-8e1d-38092f5cd35c-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"5dc309ae-dfeb-4e11-8e1d-38092f5cd35c\") " pod="openstack/ceilometer-0" Mar 20 08:48:35 crc kubenswrapper[4903]: I0320 08:48:35.002392 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5dc309ae-dfeb-4e11-8e1d-38092f5cd35c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5dc309ae-dfeb-4e11-8e1d-38092f5cd35c\") " pod="openstack/ceilometer-0" Mar 20 08:48:35 crc kubenswrapper[4903]: I0320 08:48:35.002677 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5dc309ae-dfeb-4e11-8e1d-38092f5cd35c-scripts\") pod \"ceilometer-0\" (UID: \"5dc309ae-dfeb-4e11-8e1d-38092f5cd35c\") " pod="openstack/ceilometer-0" Mar 20 08:48:35 crc kubenswrapper[4903]: I0320 08:48:35.003367 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5dc309ae-dfeb-4e11-8e1d-38092f5cd35c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5dc309ae-dfeb-4e11-8e1d-38092f5cd35c\") " pod="openstack/ceilometer-0" Mar 20 08:48:35 crc kubenswrapper[4903]: I0320 08:48:35.004080 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5dc309ae-dfeb-4e11-8e1d-38092f5cd35c-config-data\") pod \"ceilometer-0\" (UID: \"5dc309ae-dfeb-4e11-8e1d-38092f5cd35c\") " pod="openstack/ceilometer-0" Mar 20 08:48:35 crc kubenswrapper[4903]: I0320 08:48:35.017867 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nl88g\" (UniqueName: \"kubernetes.io/projected/5dc309ae-dfeb-4e11-8e1d-38092f5cd35c-kube-api-access-nl88g\") pod \"ceilometer-0\" (UID: \"5dc309ae-dfeb-4e11-8e1d-38092f5cd35c\") " pod="openstack/ceilometer-0" Mar 20 08:48:35 crc kubenswrapper[4903]: I0320 08:48:35.116695 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 08:48:35 crc kubenswrapper[4903]: I0320 08:48:35.509937 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5328486f-b5ae-4da3-85f6-b70555303408" path="/var/lib/kubelet/pods/5328486f-b5ae-4da3-85f6-b70555303408/volumes" Mar 20 08:48:35 crc kubenswrapper[4903]: I0320 08:48:35.622675 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 08:48:35 crc kubenswrapper[4903]: W0320 08:48:35.633650 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5dc309ae_dfeb_4e11_8e1d_38092f5cd35c.slice/crio-02a9344e9860a1ce34a64ddfe3c7e071338227bc413e35735d011e50b3a59398 WatchSource:0}: Error finding container 02a9344e9860a1ce34a64ddfe3c7e071338227bc413e35735d011e50b3a59398: Status 404 returned error can't find the container with id 02a9344e9860a1ce34a64ddfe3c7e071338227bc413e35735d011e50b3a59398 Mar 20 08:48:35 crc kubenswrapper[4903]: I0320 08:48:35.638578 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 08:48:36 crc kubenswrapper[4903]: I0320 08:48:36.384920 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5dc309ae-dfeb-4e11-8e1d-38092f5cd35c","Type":"ContainerStarted","Data":"02a9344e9860a1ce34a64ddfe3c7e071338227bc413e35735d011e50b3a59398"} Mar 20 08:48:37 crc kubenswrapper[4903]: I0320 08:48:37.404263 4903 generic.go:334] "Generic (PLEG): container finished" podID="a1539d60-53fa-4562-a356-83060d4f6bd7" containerID="4bd143470e9e9d2aafb3e8563bf4776f7153e511f05b78f431490e0e5a7f2037" exitCode=0 Mar 20 08:48:37 crc kubenswrapper[4903]: I0320 08:48:37.404353 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a1539d60-53fa-4562-a356-83060d4f6bd7","Type":"ContainerDied","Data":"4bd143470e9e9d2aafb3e8563bf4776f7153e511f05b78f431490e0e5a7f2037"} Mar 20 08:48:37 crc kubenswrapper[4903]: I0320 08:48:37.407614 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5dc309ae-dfeb-4e11-8e1d-38092f5cd35c","Type":"ContainerStarted","Data":"a3fe057c89e347539ddf6a16a72d532c998316a2be1ef484ab5352b9e6859368"} Mar 20 08:48:37 crc kubenswrapper[4903]: I0320 08:48:37.407663 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5dc309ae-dfeb-4e11-8e1d-38092f5cd35c","Type":"ContainerStarted","Data":"8ac4f76d6846d4293fc999559ae60157042bdc805dde913412c12c9f0a13e093"} Mar 20 08:48:37 crc kubenswrapper[4903]: I0320 08:48:37.837882 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 20 08:48:37 crc kubenswrapper[4903]: I0320 08:48:37.973814 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1539d60-53fa-4562-a356-83060d4f6bd7-combined-ca-bundle\") pod \"a1539d60-53fa-4562-a356-83060d4f6bd7\" (UID: \"a1539d60-53fa-4562-a356-83060d4f6bd7\") " Mar 20 08:48:37 crc kubenswrapper[4903]: I0320 08:48:37.974936 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a1539d60-53fa-4562-a356-83060d4f6bd7-config-data\") pod \"a1539d60-53fa-4562-a356-83060d4f6bd7\" (UID: \"a1539d60-53fa-4562-a356-83060d4f6bd7\") " Mar 20 08:48:37 crc kubenswrapper[4903]: I0320 08:48:37.975094 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a1539d60-53fa-4562-a356-83060d4f6bd7-logs\") pod \"a1539d60-53fa-4562-a356-83060d4f6bd7\" (UID: \"a1539d60-53fa-4562-a356-83060d4f6bd7\") " Mar 20 08:48:37 crc kubenswrapper[4903]: I0320 08:48:37.975279 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hr6p8\" (UniqueName: \"kubernetes.io/projected/a1539d60-53fa-4562-a356-83060d4f6bd7-kube-api-access-hr6p8\") pod \"a1539d60-53fa-4562-a356-83060d4f6bd7\" (UID: \"a1539d60-53fa-4562-a356-83060d4f6bd7\") " Mar 20 08:48:37 crc kubenswrapper[4903]: I0320 08:48:37.976301 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a1539d60-53fa-4562-a356-83060d4f6bd7-logs" (OuterVolumeSpecName: "logs") pod "a1539d60-53fa-4562-a356-83060d4f6bd7" (UID: "a1539d60-53fa-4562-a356-83060d4f6bd7"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 08:48:37 crc kubenswrapper[4903]: I0320 08:48:37.985802 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Mar 20 08:48:37 crc kubenswrapper[4903]: I0320 08:48:37.993701 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a1539d60-53fa-4562-a356-83060d4f6bd7-kube-api-access-hr6p8" (OuterVolumeSpecName: "kube-api-access-hr6p8") pod "a1539d60-53fa-4562-a356-83060d4f6bd7" (UID: "a1539d60-53fa-4562-a356-83060d4f6bd7"). InnerVolumeSpecName "kube-api-access-hr6p8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:48:37 crc kubenswrapper[4903]: I0320 08:48:37.993796 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 20 08:48:37 crc kubenswrapper[4903]: I0320 08:48:37.993828 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 20 08:48:38 crc kubenswrapper[4903]: I0320 08:48:38.023723 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1539d60-53fa-4562-a356-83060d4f6bd7-config-data" (OuterVolumeSpecName: "config-data") pod "a1539d60-53fa-4562-a356-83060d4f6bd7" (UID: "a1539d60-53fa-4562-a356-83060d4f6bd7"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:48:38 crc kubenswrapper[4903]: I0320 08:48:38.028651 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1539d60-53fa-4562-a356-83060d4f6bd7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a1539d60-53fa-4562-a356-83060d4f6bd7" (UID: "a1539d60-53fa-4562-a356-83060d4f6bd7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:48:38 crc kubenswrapper[4903]: I0320 08:48:38.031691 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Mar 20 08:48:38 crc kubenswrapper[4903]: I0320 08:48:38.077745 4903 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a1539d60-53fa-4562-a356-83060d4f6bd7-logs\") on node \"crc\" DevicePath \"\"" Mar 20 08:48:38 crc kubenswrapper[4903]: I0320 08:48:38.077784 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hr6p8\" (UniqueName: \"kubernetes.io/projected/a1539d60-53fa-4562-a356-83060d4f6bd7-kube-api-access-hr6p8\") on node \"crc\" DevicePath \"\"" Mar 20 08:48:38 crc kubenswrapper[4903]: I0320 08:48:38.077794 4903 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1539d60-53fa-4562-a356-83060d4f6bd7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 08:48:38 crc kubenswrapper[4903]: I0320 08:48:38.077803 4903 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a1539d60-53fa-4562-a356-83060d4f6bd7-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 08:48:38 crc kubenswrapper[4903]: I0320 08:48:38.421007 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5dc309ae-dfeb-4e11-8e1d-38092f5cd35c","Type":"ContainerStarted","Data":"296df4348043f5cb09ea97b3475928761a16392205a12c43adb7e0b1d1f1b1d9"} Mar 20 08:48:38 crc kubenswrapper[4903]: I0320 08:48:38.424240 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 20 08:48:38 crc kubenswrapper[4903]: I0320 08:48:38.424399 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a1539d60-53fa-4562-a356-83060d4f6bd7","Type":"ContainerDied","Data":"f9fec86750738444ea32f5616e67f20857935fd72709962a1ff734dff2430998"} Mar 20 08:48:38 crc kubenswrapper[4903]: I0320 08:48:38.424439 4903 scope.go:117] "RemoveContainer" containerID="4bd143470e9e9d2aafb3e8563bf4776f7153e511f05b78f431490e0e5a7f2037" Mar 20 08:48:38 crc kubenswrapper[4903]: I0320 08:48:38.449821 4903 scope.go:117] "RemoveContainer" containerID="3164db999ab3f287862cebc8b53b9d8e2b543bbbb115a647ca01e915360191da" Mar 20 08:48:38 crc kubenswrapper[4903]: I0320 08:48:38.464577 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Mar 20 08:48:38 crc kubenswrapper[4903]: I0320 08:48:38.476710 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 20 08:48:38 crc kubenswrapper[4903]: I0320 08:48:38.488644 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 20 08:48:38 crc kubenswrapper[4903]: I0320 08:48:38.503634 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 20 08:48:38 crc kubenswrapper[4903]: E0320 08:48:38.504192 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1539d60-53fa-4562-a356-83060d4f6bd7" containerName="nova-api-api" Mar 20 08:48:38 crc kubenswrapper[4903]: I0320 08:48:38.504207 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1539d60-53fa-4562-a356-83060d4f6bd7" containerName="nova-api-api" Mar 20 08:48:38 crc kubenswrapper[4903]: E0320 08:48:38.504227 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1539d60-53fa-4562-a356-83060d4f6bd7" containerName="nova-api-log" Mar 20 08:48:38 crc kubenswrapper[4903]: I0320 08:48:38.504233 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1539d60-53fa-4562-a356-83060d4f6bd7" containerName="nova-api-log" Mar 20 08:48:38 crc kubenswrapper[4903]: I0320 08:48:38.504510 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="a1539d60-53fa-4562-a356-83060d4f6bd7" containerName="nova-api-log" Mar 20 08:48:38 crc kubenswrapper[4903]: I0320 08:48:38.504524 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="a1539d60-53fa-4562-a356-83060d4f6bd7" containerName="nova-api-api" Mar 20 08:48:38 crc kubenswrapper[4903]: I0320 08:48:38.505677 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 20 08:48:38 crc kubenswrapper[4903]: I0320 08:48:38.533137 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 20 08:48:38 crc kubenswrapper[4903]: I0320 08:48:38.553210 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Mar 20 08:48:38 crc kubenswrapper[4903]: I0320 08:48:38.553888 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Mar 20 08:48:38 crc kubenswrapper[4903]: I0320 08:48:38.553974 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 20 08:48:38 crc kubenswrapper[4903]: I0320 08:48:38.586145 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dcf64dcf-b799-4f37-84e1-8c43e1737259-public-tls-certs\") pod \"nova-api-0\" (UID: \"dcf64dcf-b799-4f37-84e1-8c43e1737259\") " pod="openstack/nova-api-0" Mar 20 08:48:38 crc kubenswrapper[4903]: I0320 08:48:38.586278 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dcf64dcf-b799-4f37-84e1-8c43e1737259-logs\") pod \"nova-api-0\" (UID: \"dcf64dcf-b799-4f37-84e1-8c43e1737259\") " pod="openstack/nova-api-0" Mar 20 08:48:38 crc kubenswrapper[4903]: I0320 08:48:38.586338 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dcf64dcf-b799-4f37-84e1-8c43e1737259-config-data\") pod \"nova-api-0\" (UID: \"dcf64dcf-b799-4f37-84e1-8c43e1737259\") " pod="openstack/nova-api-0" Mar 20 08:48:38 crc kubenswrapper[4903]: I0320 08:48:38.586404 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-54ljh\" (UniqueName: \"kubernetes.io/projected/dcf64dcf-b799-4f37-84e1-8c43e1737259-kube-api-access-54ljh\") pod \"nova-api-0\" (UID: \"dcf64dcf-b799-4f37-84e1-8c43e1737259\") " pod="openstack/nova-api-0" Mar 20 08:48:38 crc kubenswrapper[4903]: I0320 08:48:38.586446 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dcf64dcf-b799-4f37-84e1-8c43e1737259-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"dcf64dcf-b799-4f37-84e1-8c43e1737259\") " pod="openstack/nova-api-0" Mar 20 08:48:38 crc kubenswrapper[4903]: I0320 08:48:38.586476 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/dcf64dcf-b799-4f37-84e1-8c43e1737259-internal-tls-certs\") pod \"nova-api-0\" (UID: \"dcf64dcf-b799-4f37-84e1-8c43e1737259\") " pod="openstack/nova-api-0" Mar 20 08:48:38 crc kubenswrapper[4903]: I0320 08:48:38.687807 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dcf64dcf-b799-4f37-84e1-8c43e1737259-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"dcf64dcf-b799-4f37-84e1-8c43e1737259\") " pod="openstack/nova-api-0" Mar 20 08:48:38 crc kubenswrapper[4903]: I0320 08:48:38.687860 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/dcf64dcf-b799-4f37-84e1-8c43e1737259-internal-tls-certs\") pod 
\"nova-api-0\" (UID: \"dcf64dcf-b799-4f37-84e1-8c43e1737259\") " pod="openstack/nova-api-0" Mar 20 08:48:38 crc kubenswrapper[4903]: I0320 08:48:38.687943 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dcf64dcf-b799-4f37-84e1-8c43e1737259-public-tls-certs\") pod \"nova-api-0\" (UID: \"dcf64dcf-b799-4f37-84e1-8c43e1737259\") " pod="openstack/nova-api-0" Mar 20 08:48:38 crc kubenswrapper[4903]: I0320 08:48:38.687994 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dcf64dcf-b799-4f37-84e1-8c43e1737259-logs\") pod \"nova-api-0\" (UID: \"dcf64dcf-b799-4f37-84e1-8c43e1737259\") " pod="openstack/nova-api-0" Mar 20 08:48:38 crc kubenswrapper[4903]: I0320 08:48:38.688063 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dcf64dcf-b799-4f37-84e1-8c43e1737259-config-data\") pod \"nova-api-0\" (UID: \"dcf64dcf-b799-4f37-84e1-8c43e1737259\") " pod="openstack/nova-api-0" Mar 20 08:48:38 crc kubenswrapper[4903]: I0320 08:48:38.688097 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-54ljh\" (UniqueName: \"kubernetes.io/projected/dcf64dcf-b799-4f37-84e1-8c43e1737259-kube-api-access-54ljh\") pod \"nova-api-0\" (UID: \"dcf64dcf-b799-4f37-84e1-8c43e1737259\") " pod="openstack/nova-api-0" Mar 20 08:48:38 crc kubenswrapper[4903]: I0320 08:48:38.689695 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dcf64dcf-b799-4f37-84e1-8c43e1737259-logs\") pod \"nova-api-0\" (UID: \"dcf64dcf-b799-4f37-84e1-8c43e1737259\") " pod="openstack/nova-api-0" Mar 20 08:48:38 crc kubenswrapper[4903]: I0320 08:48:38.695952 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dcf64dcf-b799-4f37-84e1-8c43e1737259-public-tls-certs\") pod \"nova-api-0\" (UID: \"dcf64dcf-b799-4f37-84e1-8c43e1737259\") " pod="openstack/nova-api-0" Mar 20 08:48:38 crc kubenswrapper[4903]: I0320 08:48:38.696484 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dcf64dcf-b799-4f37-84e1-8c43e1737259-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"dcf64dcf-b799-4f37-84e1-8c43e1737259\") " pod="openstack/nova-api-0" Mar 20 08:48:38 crc kubenswrapper[4903]: I0320 08:48:38.697741 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/dcf64dcf-b799-4f37-84e1-8c43e1737259-internal-tls-certs\") pod \"nova-api-0\" (UID: \"dcf64dcf-b799-4f37-84e1-8c43e1737259\") " pod="openstack/nova-api-0" Mar 20 08:48:38 crc kubenswrapper[4903]: I0320 08:48:38.708110 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dcf64dcf-b799-4f37-84e1-8c43e1737259-config-data\") pod \"nova-api-0\" (UID: \"dcf64dcf-b799-4f37-84e1-8c43e1737259\") " pod="openstack/nova-api-0" Mar 20 08:48:38 crc kubenswrapper[4903]: I0320 08:48:38.722077 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-54ljh\" (UniqueName: \"kubernetes.io/projected/dcf64dcf-b799-4f37-84e1-8c43e1737259-kube-api-access-54ljh\") pod \"nova-api-0\" (UID: \"dcf64dcf-b799-4f37-84e1-8c43e1737259\") " 
pod="openstack/nova-api-0" Mar 20 08:48:38 crc kubenswrapper[4903]: I0320 08:48:38.745747 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-bff6c"] Mar 20 08:48:38 crc kubenswrapper[4903]: I0320 08:48:38.758641 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-bff6c" Mar 20 08:48:38 crc kubenswrapper[4903]: I0320 08:48:38.759554 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-bff6c"] Mar 20 08:48:38 crc kubenswrapper[4903]: I0320 08:48:38.760995 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Mar 20 08:48:38 crc kubenswrapper[4903]: I0320 08:48:38.768324 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Mar 20 08:48:38 crc kubenswrapper[4903]: I0320 08:48:38.791426 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hgl7w\" (UniqueName: \"kubernetes.io/projected/7c102604-23bc-49f8-96ce-821603a4f4bf-kube-api-access-hgl7w\") pod \"nova-cell1-cell-mapping-bff6c\" (UID: \"7c102604-23bc-49f8-96ce-821603a4f4bf\") " pod="openstack/nova-cell1-cell-mapping-bff6c" Mar 20 08:48:38 crc kubenswrapper[4903]: I0320 08:48:38.791524 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c102604-23bc-49f8-96ce-821603a4f4bf-config-data\") pod \"nova-cell1-cell-mapping-bff6c\" (UID: \"7c102604-23bc-49f8-96ce-821603a4f4bf\") " pod="openstack/nova-cell1-cell-mapping-bff6c" Mar 20 08:48:38 crc kubenswrapper[4903]: I0320 08:48:38.791612 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7c102604-23bc-49f8-96ce-821603a4f4bf-scripts\") pod \"nova-cell1-cell-mapping-bff6c\" (UID: \"7c102604-23bc-49f8-96ce-821603a4f4bf\") " pod="openstack/nova-cell1-cell-mapping-bff6c" Mar 20 08:48:38 crc kubenswrapper[4903]: I0320 08:48:38.791930 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c102604-23bc-49f8-96ce-821603a4f4bf-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-bff6c\" (UID: \"7c102604-23bc-49f8-96ce-821603a4f4bf\") " pod="openstack/nova-cell1-cell-mapping-bff6c" Mar 20 08:48:38 crc kubenswrapper[4903]: I0320 08:48:38.894267 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hgl7w\" (UniqueName: \"kubernetes.io/projected/7c102604-23bc-49f8-96ce-821603a4f4bf-kube-api-access-hgl7w\") pod \"nova-cell1-cell-mapping-bff6c\" (UID: \"7c102604-23bc-49f8-96ce-821603a4f4bf\") " pod="openstack/nova-cell1-cell-mapping-bff6c" Mar 20 08:48:38 crc kubenswrapper[4903]: I0320 08:48:38.894353 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c102604-23bc-49f8-96ce-821603a4f4bf-config-data\") pod \"nova-cell1-cell-mapping-bff6c\" (UID: \"7c102604-23bc-49f8-96ce-821603a4f4bf\") " pod="openstack/nova-cell1-cell-mapping-bff6c" Mar 20 08:48:38 crc kubenswrapper[4903]: I0320 08:48:38.894417 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7c102604-23bc-49f8-96ce-821603a4f4bf-scripts\") pod 
\"nova-cell1-cell-mapping-bff6c\" (UID: \"7c102604-23bc-49f8-96ce-821603a4f4bf\") " pod="openstack/nova-cell1-cell-mapping-bff6c" Mar 20 08:48:38 crc kubenswrapper[4903]: I0320 08:48:38.894440 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c102604-23bc-49f8-96ce-821603a4f4bf-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-bff6c\" (UID: \"7c102604-23bc-49f8-96ce-821603a4f4bf\") " pod="openstack/nova-cell1-cell-mapping-bff6c" Mar 20 08:48:38 crc kubenswrapper[4903]: I0320 08:48:38.896426 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 20 08:48:38 crc kubenswrapper[4903]: I0320 08:48:38.898280 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c102604-23bc-49f8-96ce-821603a4f4bf-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-bff6c\" (UID: \"7c102604-23bc-49f8-96ce-821603a4f4bf\") " pod="openstack/nova-cell1-cell-mapping-bff6c" Mar 20 08:48:38 crc kubenswrapper[4903]: I0320 08:48:38.909788 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c102604-23bc-49f8-96ce-821603a4f4bf-config-data\") pod \"nova-cell1-cell-mapping-bff6c\" (UID: \"7c102604-23bc-49f8-96ce-821603a4f4bf\") " pod="openstack/nova-cell1-cell-mapping-bff6c" Mar 20 08:48:38 crc kubenswrapper[4903]: I0320 08:48:38.919426 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7c102604-23bc-49f8-96ce-821603a4f4bf-scripts\") pod \"nova-cell1-cell-mapping-bff6c\" (UID: \"7c102604-23bc-49f8-96ce-821603a4f4bf\") " pod="openstack/nova-cell1-cell-mapping-bff6c" Mar 20 08:48:38 crc kubenswrapper[4903]: I0320 08:48:38.923464 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hgl7w\" (UniqueName: \"kubernetes.io/projected/7c102604-23bc-49f8-96ce-821603a4f4bf-kube-api-access-hgl7w\") pod \"nova-cell1-cell-mapping-bff6c\" (UID: \"7c102604-23bc-49f8-96ce-821603a4f4bf\") " pod="openstack/nova-cell1-cell-mapping-bff6c" Mar 20 08:48:39 crc kubenswrapper[4903]: I0320 08:48:39.012291 4903 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="1ab04638-79a1-46f3-9e67-ba51ee0d12f7" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.206:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 20 08:48:39 crc kubenswrapper[4903]: I0320 08:48:39.012310 4903 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="1ab04638-79a1-46f3-9e67-ba51ee0d12f7" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.206:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 20 08:48:39 crc kubenswrapper[4903]: I0320 08:48:39.120122 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-bff6c" Mar 20 08:48:39 crc kubenswrapper[4903]: I0320 08:48:39.484835 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 20 08:48:39 crc kubenswrapper[4903]: I0320 08:48:39.503246 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a1539d60-53fa-4562-a356-83060d4f6bd7" path="/var/lib/kubelet/pods/a1539d60-53fa-4562-a356-83060d4f6bd7/volumes" Mar 20 08:48:39 crc kubenswrapper[4903]: I0320 08:48:39.548690 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-bff6c"] Mar 20 08:48:39 crc kubenswrapper[4903]: W0320 08:48:39.571079 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddcf64dcf_b799_4f37_84e1_8c43e1737259.slice/crio-4a20856b6fa5d9f3a5521b4a0695bc759e2b080d098f1425e7febe689fdc7aea WatchSource:0}: Error finding container 4a20856b6fa5d9f3a5521b4a0695bc759e2b080d098f1425e7febe689fdc7aea: Status 404 returned error can't find the container with id 4a20856b6fa5d9f3a5521b4a0695bc759e2b080d098f1425e7febe689fdc7aea Mar 20 08:48:39 crc kubenswrapper[4903]: W0320 08:48:39.586218 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7c102604_23bc_49f8_96ce_821603a4f4bf.slice/crio-56b7d7caa9dc3d25393038bf63e820841e38c5caf018ff24fc5d062541e7ad4c WatchSource:0}: Error finding container 56b7d7caa9dc3d25393038bf63e820841e38c5caf018ff24fc5d062541e7ad4c: Status 404 returned error can't find the container with id 56b7d7caa9dc3d25393038bf63e820841e38c5caf018ff24fc5d062541e7ad4c Mar 20 08:48:40 crc kubenswrapper[4903]: I0320 08:48:40.467958 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-bff6c" event={"ID":"7c102604-23bc-49f8-96ce-821603a4f4bf","Type":"ContainerStarted","Data":"59e15fc31b0ce96747c05d477541c7c6a3fe03d93487ef381bf0b660eb845438"} Mar 20 08:48:40 crc kubenswrapper[4903]: I0320 08:48:40.468269 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-bff6c" event={"ID":"7c102604-23bc-49f8-96ce-821603a4f4bf","Type":"ContainerStarted","Data":"56b7d7caa9dc3d25393038bf63e820841e38c5caf018ff24fc5d062541e7ad4c"} Mar 20 08:48:40 crc kubenswrapper[4903]: I0320 08:48:40.470449 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5dc309ae-dfeb-4e11-8e1d-38092f5cd35c","Type":"ContainerStarted","Data":"4a3c0d168af539f1ec3ad9ba0e7a7d0d885c1210cf98593d29c5fe99817b5ffa"} Mar 20 08:48:40 crc kubenswrapper[4903]: I0320 08:48:40.470490 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5dc309ae-dfeb-4e11-8e1d-38092f5cd35c" containerName="ceilometer-central-agent" containerID="cri-o://8ac4f76d6846d4293fc999559ae60157042bdc805dde913412c12c9f0a13e093" gracePeriod=30 Mar 20 08:48:40 crc kubenswrapper[4903]: I0320 08:48:40.470572 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5dc309ae-dfeb-4e11-8e1d-38092f5cd35c" containerName="ceilometer-notification-agent" containerID="cri-o://a3fe057c89e347539ddf6a16a72d532c998316a2be1ef484ab5352b9e6859368" gracePeriod=30 Mar 20 08:48:40 crc kubenswrapper[4903]: I0320 08:48:40.470565 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" 
podUID="5dc309ae-dfeb-4e11-8e1d-38092f5cd35c" containerName="sg-core" containerID="cri-o://296df4348043f5cb09ea97b3475928761a16392205a12c43adb7e0b1d1f1b1d9" gracePeriod=30 Mar 20 08:48:40 crc kubenswrapper[4903]: I0320 08:48:40.470599 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 20 08:48:40 crc kubenswrapper[4903]: I0320 08:48:40.470724 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5dc309ae-dfeb-4e11-8e1d-38092f5cd35c" containerName="proxy-httpd" containerID="cri-o://4a3c0d168af539f1ec3ad9ba0e7a7d0d885c1210cf98593d29c5fe99817b5ffa" gracePeriod=30 Mar 20 08:48:40 crc kubenswrapper[4903]: I0320 08:48:40.474459 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"dcf64dcf-b799-4f37-84e1-8c43e1737259","Type":"ContainerStarted","Data":"55bc30453b83133c12c9be27eb1e0f973d4aab338bcc76a4bf676a23e6ba8c11"} Mar 20 08:48:40 crc kubenswrapper[4903]: I0320 08:48:40.474494 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"dcf64dcf-b799-4f37-84e1-8c43e1737259","Type":"ContainerStarted","Data":"19c1f8d6b511d157c4a1e0834a7519d075b046b3695440c2e89a70e69d8aa869"} Mar 20 08:48:40 crc kubenswrapper[4903]: I0320 08:48:40.474503 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"dcf64dcf-b799-4f37-84e1-8c43e1737259","Type":"ContainerStarted","Data":"4a20856b6fa5d9f3a5521b4a0695bc759e2b080d098f1425e7febe689fdc7aea"} Mar 20 08:48:40 crc kubenswrapper[4903]: I0320 08:48:40.486480 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-bff6c" podStartSLOduration=2.486462907 podStartE2EDuration="2.486462907s" podCreationTimestamp="2026-03-20 08:48:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:48:40.483592237 +0000 UTC m=+1545.700492552" watchObservedRunningTime="2026-03-20 08:48:40.486462907 +0000 UTC m=+1545.703363222" Mar 20 08:48:40 crc kubenswrapper[4903]: I0320 08:48:40.511892 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.51187264 podStartE2EDuration="2.51187264s" podCreationTimestamp="2026-03-20 08:48:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:48:40.504748075 +0000 UTC m=+1545.721648390" watchObservedRunningTime="2026-03-20 08:48:40.51187264 +0000 UTC m=+1545.728772955" Mar 20 08:48:40 crc kubenswrapper[4903]: I0320 08:48:40.546888 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.438634194 podStartE2EDuration="6.546860227s" podCreationTimestamp="2026-03-20 08:48:34 +0000 UTC" firstStartedPulling="2026-03-20 08:48:35.638659343 +0000 UTC m=+1540.855559658" lastFinishedPulling="2026-03-20 08:48:39.746885376 +0000 UTC m=+1544.963785691" observedRunningTime="2026-03-20 08:48:40.536991346 +0000 UTC m=+1545.753891661" watchObservedRunningTime="2026-03-20 08:48:40.546860227 +0000 UTC m=+1545.763760562" Mar 20 08:48:41 crc kubenswrapper[4903]: I0320 08:48:41.318280 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-89c5cd4d5-9bgcj" Mar 20 08:48:41 crc kubenswrapper[4903]: I0320 08:48:41.409911 4903 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-lrmtk"] Mar 20 08:48:41 crc kubenswrapper[4903]: I0320 08:48:41.410284 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-757b4f8459-lrmtk" podUID="3610fd9b-fd0a-4b22-8a26-27a393bf92a6" containerName="dnsmasq-dns" containerID="cri-o://b6c4df4e12e3e3f6595273ba5c82b0cc28aa6465d6ecf7368ce865b9f1105f1c" gracePeriod=10 Mar 20 08:48:41 crc kubenswrapper[4903]: I0320 08:48:41.508607 4903 generic.go:334] "Generic (PLEG): container finished" podID="5dc309ae-dfeb-4e11-8e1d-38092f5cd35c" containerID="4a3c0d168af539f1ec3ad9ba0e7a7d0d885c1210cf98593d29c5fe99817b5ffa" exitCode=0 Mar 20 08:48:41 crc kubenswrapper[4903]: I0320 08:48:41.509506 4903 generic.go:334] "Generic (PLEG): container finished" podID="5dc309ae-dfeb-4e11-8e1d-38092f5cd35c" containerID="296df4348043f5cb09ea97b3475928761a16392205a12c43adb7e0b1d1f1b1d9" exitCode=2 Mar 20 08:48:41 crc kubenswrapper[4903]: I0320 08:48:41.509551 4903 generic.go:334] "Generic (PLEG): container finished" podID="5dc309ae-dfeb-4e11-8e1d-38092f5cd35c" containerID="a3fe057c89e347539ddf6a16a72d532c998316a2be1ef484ab5352b9e6859368" exitCode=0 Mar 20 08:48:41 crc kubenswrapper[4903]: I0320 08:48:41.521708 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5dc309ae-dfeb-4e11-8e1d-38092f5cd35c","Type":"ContainerDied","Data":"4a3c0d168af539f1ec3ad9ba0e7a7d0d885c1210cf98593d29c5fe99817b5ffa"} Mar 20 08:48:41 crc kubenswrapper[4903]: I0320 08:48:41.521754 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5dc309ae-dfeb-4e11-8e1d-38092f5cd35c","Type":"ContainerDied","Data":"296df4348043f5cb09ea97b3475928761a16392205a12c43adb7e0b1d1f1b1d9"} Mar 20 08:48:41 crc kubenswrapper[4903]: I0320 08:48:41.521767 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5dc309ae-dfeb-4e11-8e1d-38092f5cd35c","Type":"ContainerDied","Data":"a3fe057c89e347539ddf6a16a72d532c998316a2be1ef484ab5352b9e6859368"} Mar 20 08:48:41 crc kubenswrapper[4903]: E0320 08:48:41.568449 4903 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3610fd9b_fd0a_4b22_8a26_27a393bf92a6.slice/crio-b6c4df4e12e3e3f6595273ba5c82b0cc28aa6465d6ecf7368ce865b9f1105f1c.scope\": RecentStats: unable to find data in memory cache]" Mar 20 08:48:42 crc kubenswrapper[4903]: I0320 08:48:42.065286 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-757b4f8459-lrmtk" Mar 20 08:48:42 crc kubenswrapper[4903]: I0320 08:48:42.180330 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cn5gt\" (UniqueName: \"kubernetes.io/projected/3610fd9b-fd0a-4b22-8a26-27a393bf92a6-kube-api-access-cn5gt\") pod \"3610fd9b-fd0a-4b22-8a26-27a393bf92a6\" (UID: \"3610fd9b-fd0a-4b22-8a26-27a393bf92a6\") " Mar 20 08:48:42 crc kubenswrapper[4903]: I0320 08:48:42.180482 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3610fd9b-fd0a-4b22-8a26-27a393bf92a6-config\") pod \"3610fd9b-fd0a-4b22-8a26-27a393bf92a6\" (UID: \"3610fd9b-fd0a-4b22-8a26-27a393bf92a6\") " Mar 20 08:48:42 crc kubenswrapper[4903]: I0320 08:48:42.180521 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3610fd9b-fd0a-4b22-8a26-27a393bf92a6-dns-swift-storage-0\") pod \"3610fd9b-fd0a-4b22-8a26-27a393bf92a6\" (UID: \"3610fd9b-fd0a-4b22-8a26-27a393bf92a6\") " Mar 20 08:48:42 crc kubenswrapper[4903]: I0320 08:48:42.180561 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3610fd9b-fd0a-4b22-8a26-27a393bf92a6-dns-svc\") pod \"3610fd9b-fd0a-4b22-8a26-27a393bf92a6\" (UID: \"3610fd9b-fd0a-4b22-8a26-27a393bf92a6\") " Mar 20 08:48:42 crc kubenswrapper[4903]: I0320 08:48:42.180632 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3610fd9b-fd0a-4b22-8a26-27a393bf92a6-ovsdbserver-nb\") pod \"3610fd9b-fd0a-4b22-8a26-27a393bf92a6\" (UID: \"3610fd9b-fd0a-4b22-8a26-27a393bf92a6\") " Mar 20 08:48:42 crc kubenswrapper[4903]: I0320 08:48:42.180730 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3610fd9b-fd0a-4b22-8a26-27a393bf92a6-ovsdbserver-sb\") pod \"3610fd9b-fd0a-4b22-8a26-27a393bf92a6\" (UID: \"3610fd9b-fd0a-4b22-8a26-27a393bf92a6\") " Mar 20 08:48:42 crc kubenswrapper[4903]: I0320 08:48:42.192206 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3610fd9b-fd0a-4b22-8a26-27a393bf92a6-kube-api-access-cn5gt" (OuterVolumeSpecName: "kube-api-access-cn5gt") pod "3610fd9b-fd0a-4b22-8a26-27a393bf92a6" (UID: "3610fd9b-fd0a-4b22-8a26-27a393bf92a6"). InnerVolumeSpecName "kube-api-access-cn5gt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:48:42 crc kubenswrapper[4903]: I0320 08:48:42.250570 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3610fd9b-fd0a-4b22-8a26-27a393bf92a6-config" (OuterVolumeSpecName: "config") pod "3610fd9b-fd0a-4b22-8a26-27a393bf92a6" (UID: "3610fd9b-fd0a-4b22-8a26-27a393bf92a6"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:48:42 crc kubenswrapper[4903]: I0320 08:48:42.255524 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3610fd9b-fd0a-4b22-8a26-27a393bf92a6-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "3610fd9b-fd0a-4b22-8a26-27a393bf92a6" (UID: "3610fd9b-fd0a-4b22-8a26-27a393bf92a6"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:48:42 crc kubenswrapper[4903]: I0320 08:48:42.262264 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3610fd9b-fd0a-4b22-8a26-27a393bf92a6-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "3610fd9b-fd0a-4b22-8a26-27a393bf92a6" (UID: "3610fd9b-fd0a-4b22-8a26-27a393bf92a6"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:48:42 crc kubenswrapper[4903]: I0320 08:48:42.263776 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3610fd9b-fd0a-4b22-8a26-27a393bf92a6-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "3610fd9b-fd0a-4b22-8a26-27a393bf92a6" (UID: "3610fd9b-fd0a-4b22-8a26-27a393bf92a6"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:48:42 crc kubenswrapper[4903]: I0320 08:48:42.270196 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3610fd9b-fd0a-4b22-8a26-27a393bf92a6-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "3610fd9b-fd0a-4b22-8a26-27a393bf92a6" (UID: "3610fd9b-fd0a-4b22-8a26-27a393bf92a6"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:48:42 crc kubenswrapper[4903]: I0320 08:48:42.283243 4903 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3610fd9b-fd0a-4b22-8a26-27a393bf92a6-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 20 08:48:42 crc kubenswrapper[4903]: I0320 08:48:42.283284 4903 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3610fd9b-fd0a-4b22-8a26-27a393bf92a6-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 20 08:48:42 crc kubenswrapper[4903]: I0320 08:48:42.283295 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cn5gt\" (UniqueName: \"kubernetes.io/projected/3610fd9b-fd0a-4b22-8a26-27a393bf92a6-kube-api-access-cn5gt\") on node \"crc\" DevicePath \"\"" Mar 20 08:48:42 crc kubenswrapper[4903]: I0320 08:48:42.283308 4903 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3610fd9b-fd0a-4b22-8a26-27a393bf92a6-config\") on node \"crc\" DevicePath \"\"" Mar 20 08:48:42 crc kubenswrapper[4903]: I0320 08:48:42.283318 4903 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3610fd9b-fd0a-4b22-8a26-27a393bf92a6-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 20 08:48:42 crc kubenswrapper[4903]: I0320 08:48:42.283326 4903 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3610fd9b-fd0a-4b22-8a26-27a393bf92a6-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 08:48:42 crc kubenswrapper[4903]: I0320 08:48:42.521490 4903 generic.go:334] "Generic (PLEG): container finished" podID="3610fd9b-fd0a-4b22-8a26-27a393bf92a6" containerID="b6c4df4e12e3e3f6595273ba5c82b0cc28aa6465d6ecf7368ce865b9f1105f1c" exitCode=0 Mar 20 08:48:42 crc kubenswrapper[4903]: I0320 08:48:42.521531 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-lrmtk" event={"ID":"3610fd9b-fd0a-4b22-8a26-27a393bf92a6","Type":"ContainerDied","Data":"b6c4df4e12e3e3f6595273ba5c82b0cc28aa6465d6ecf7368ce865b9f1105f1c"} Mar 20 08:48:42 crc 
kubenswrapper[4903]: I0320 08:48:42.522008 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-lrmtk" event={"ID":"3610fd9b-fd0a-4b22-8a26-27a393bf92a6","Type":"ContainerDied","Data":"00784d5ee33314c79f8dbef53df595dea5427963b8de41b191ff04c375b7f027"} Mar 20 08:48:42 crc kubenswrapper[4903]: I0320 08:48:42.522038 4903 scope.go:117] "RemoveContainer" containerID="b6c4df4e12e3e3f6595273ba5c82b0cc28aa6465d6ecf7368ce865b9f1105f1c" Mar 20 08:48:42 crc kubenswrapper[4903]: I0320 08:48:42.521583 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-757b4f8459-lrmtk" Mar 20 08:48:42 crc kubenswrapper[4903]: I0320 08:48:42.600486 4903 scope.go:117] "RemoveContainer" containerID="f9119b30af39e74f4c4c5de9e7696181e19478c65c222d393b4dcce841741426" Mar 20 08:48:42 crc kubenswrapper[4903]: I0320 08:48:42.613954 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-lrmtk"] Mar 20 08:48:42 crc kubenswrapper[4903]: I0320 08:48:42.623619 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-lrmtk"] Mar 20 08:48:42 crc kubenswrapper[4903]: I0320 08:48:42.630279 4903 scope.go:117] "RemoveContainer" containerID="b6c4df4e12e3e3f6595273ba5c82b0cc28aa6465d6ecf7368ce865b9f1105f1c" Mar 20 08:48:42 crc kubenswrapper[4903]: E0320 08:48:42.640235 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b6c4df4e12e3e3f6595273ba5c82b0cc28aa6465d6ecf7368ce865b9f1105f1c\": container with ID starting with b6c4df4e12e3e3f6595273ba5c82b0cc28aa6465d6ecf7368ce865b9f1105f1c not found: ID does not exist" containerID="b6c4df4e12e3e3f6595273ba5c82b0cc28aa6465d6ecf7368ce865b9f1105f1c" Mar 20 08:48:42 crc kubenswrapper[4903]: I0320 08:48:42.640294 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6c4df4e12e3e3f6595273ba5c82b0cc28aa6465d6ecf7368ce865b9f1105f1c"} err="failed to get container status \"b6c4df4e12e3e3f6595273ba5c82b0cc28aa6465d6ecf7368ce865b9f1105f1c\": rpc error: code = NotFound desc = could not find container \"b6c4df4e12e3e3f6595273ba5c82b0cc28aa6465d6ecf7368ce865b9f1105f1c\": container with ID starting with b6c4df4e12e3e3f6595273ba5c82b0cc28aa6465d6ecf7368ce865b9f1105f1c not found: ID does not exist" Mar 20 08:48:42 crc kubenswrapper[4903]: I0320 08:48:42.640322 4903 scope.go:117] "RemoveContainer" containerID="f9119b30af39e74f4c4c5de9e7696181e19478c65c222d393b4dcce841741426" Mar 20 08:48:42 crc kubenswrapper[4903]: E0320 08:48:42.645697 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f9119b30af39e74f4c4c5de9e7696181e19478c65c222d393b4dcce841741426\": container with ID starting with f9119b30af39e74f4c4c5de9e7696181e19478c65c222d393b4dcce841741426 not found: ID does not exist" containerID="f9119b30af39e74f4c4c5de9e7696181e19478c65c222d393b4dcce841741426" Mar 20 08:48:42 crc kubenswrapper[4903]: I0320 08:48:42.645763 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f9119b30af39e74f4c4c5de9e7696181e19478c65c222d393b4dcce841741426"} err="failed to get container status \"f9119b30af39e74f4c4c5de9e7696181e19478c65c222d393b4dcce841741426\": rpc error: code = NotFound desc = could not find container \"f9119b30af39e74f4c4c5de9e7696181e19478c65c222d393b4dcce841741426\": container with ID starting with 
f9119b30af39e74f4c4c5de9e7696181e19478c65c222d393b4dcce841741426 not found: ID does not exist" Mar 20 08:48:43 crc kubenswrapper[4903]: I0320 08:48:43.505707 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3610fd9b-fd0a-4b22-8a26-27a393bf92a6" path="/var/lib/kubelet/pods/3610fd9b-fd0a-4b22-8a26-27a393bf92a6/volumes" Mar 20 08:48:44 crc kubenswrapper[4903]: I0320 08:48:44.256933 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 08:48:44 crc kubenswrapper[4903]: I0320 08:48:44.323702 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nl88g\" (UniqueName: \"kubernetes.io/projected/5dc309ae-dfeb-4e11-8e1d-38092f5cd35c-kube-api-access-nl88g\") pod \"5dc309ae-dfeb-4e11-8e1d-38092f5cd35c\" (UID: \"5dc309ae-dfeb-4e11-8e1d-38092f5cd35c\") " Mar 20 08:48:44 crc kubenswrapper[4903]: I0320 08:48:44.323761 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5dc309ae-dfeb-4e11-8e1d-38092f5cd35c-log-httpd\") pod \"5dc309ae-dfeb-4e11-8e1d-38092f5cd35c\" (UID: \"5dc309ae-dfeb-4e11-8e1d-38092f5cd35c\") " Mar 20 08:48:44 crc kubenswrapper[4903]: I0320 08:48:44.323828 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5dc309ae-dfeb-4e11-8e1d-38092f5cd35c-sg-core-conf-yaml\") pod \"5dc309ae-dfeb-4e11-8e1d-38092f5cd35c\" (UID: \"5dc309ae-dfeb-4e11-8e1d-38092f5cd35c\") " Mar 20 08:48:44 crc kubenswrapper[4903]: I0320 08:48:44.323850 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5dc309ae-dfeb-4e11-8e1d-38092f5cd35c-config-data\") pod \"5dc309ae-dfeb-4e11-8e1d-38092f5cd35c\" (UID: \"5dc309ae-dfeb-4e11-8e1d-38092f5cd35c\") " Mar 20 08:48:44 crc kubenswrapper[4903]: I0320 08:48:44.323911 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5dc309ae-dfeb-4e11-8e1d-38092f5cd35c-scripts\") pod \"5dc309ae-dfeb-4e11-8e1d-38092f5cd35c\" (UID: \"5dc309ae-dfeb-4e11-8e1d-38092f5cd35c\") " Mar 20 08:48:44 crc kubenswrapper[4903]: I0320 08:48:44.323946 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5dc309ae-dfeb-4e11-8e1d-38092f5cd35c-run-httpd\") pod \"5dc309ae-dfeb-4e11-8e1d-38092f5cd35c\" (UID: \"5dc309ae-dfeb-4e11-8e1d-38092f5cd35c\") " Mar 20 08:48:44 crc kubenswrapper[4903]: I0320 08:48:44.323989 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5dc309ae-dfeb-4e11-8e1d-38092f5cd35c-combined-ca-bundle\") pod \"5dc309ae-dfeb-4e11-8e1d-38092f5cd35c\" (UID: \"5dc309ae-dfeb-4e11-8e1d-38092f5cd35c\") " Mar 20 08:48:44 crc kubenswrapper[4903]: I0320 08:48:44.324104 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/5dc309ae-dfeb-4e11-8e1d-38092f5cd35c-ceilometer-tls-certs\") pod \"5dc309ae-dfeb-4e11-8e1d-38092f5cd35c\" (UID: \"5dc309ae-dfeb-4e11-8e1d-38092f5cd35c\") " Mar 20 08:48:44 crc kubenswrapper[4903]: I0320 08:48:44.325122 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5dc309ae-dfeb-4e11-8e1d-38092f5cd35c-run-httpd" 
(OuterVolumeSpecName: "run-httpd") pod "5dc309ae-dfeb-4e11-8e1d-38092f5cd35c" (UID: "5dc309ae-dfeb-4e11-8e1d-38092f5cd35c"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 08:48:44 crc kubenswrapper[4903]: I0320 08:48:44.325438 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5dc309ae-dfeb-4e11-8e1d-38092f5cd35c-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "5dc309ae-dfeb-4e11-8e1d-38092f5cd35c" (UID: "5dc309ae-dfeb-4e11-8e1d-38092f5cd35c"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 08:48:44 crc kubenswrapper[4903]: I0320 08:48:44.330832 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5dc309ae-dfeb-4e11-8e1d-38092f5cd35c-scripts" (OuterVolumeSpecName: "scripts") pod "5dc309ae-dfeb-4e11-8e1d-38092f5cd35c" (UID: "5dc309ae-dfeb-4e11-8e1d-38092f5cd35c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:48:44 crc kubenswrapper[4903]: I0320 08:48:44.350362 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5dc309ae-dfeb-4e11-8e1d-38092f5cd35c-kube-api-access-nl88g" (OuterVolumeSpecName: "kube-api-access-nl88g") pod "5dc309ae-dfeb-4e11-8e1d-38092f5cd35c" (UID: "5dc309ae-dfeb-4e11-8e1d-38092f5cd35c"). InnerVolumeSpecName "kube-api-access-nl88g". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:48:44 crc kubenswrapper[4903]: I0320 08:48:44.380744 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5dc309ae-dfeb-4e11-8e1d-38092f5cd35c-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "5dc309ae-dfeb-4e11-8e1d-38092f5cd35c" (UID: "5dc309ae-dfeb-4e11-8e1d-38092f5cd35c"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:48:44 crc kubenswrapper[4903]: I0320 08:48:44.402981 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5dc309ae-dfeb-4e11-8e1d-38092f5cd35c-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "5dc309ae-dfeb-4e11-8e1d-38092f5cd35c" (UID: "5dc309ae-dfeb-4e11-8e1d-38092f5cd35c"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:48:44 crc kubenswrapper[4903]: I0320 08:48:44.426210 4903 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5dc309ae-dfeb-4e11-8e1d-38092f5cd35c-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 20 08:48:44 crc kubenswrapper[4903]: I0320 08:48:44.426244 4903 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5dc309ae-dfeb-4e11-8e1d-38092f5cd35c-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 08:48:44 crc kubenswrapper[4903]: I0320 08:48:44.426253 4903 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5dc309ae-dfeb-4e11-8e1d-38092f5cd35c-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 08:48:44 crc kubenswrapper[4903]: I0320 08:48:44.426261 4903 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/5dc309ae-dfeb-4e11-8e1d-38092f5cd35c-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 08:48:44 crc kubenswrapper[4903]: I0320 08:48:44.426349 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nl88g\" (UniqueName: \"kubernetes.io/projected/5dc309ae-dfeb-4e11-8e1d-38092f5cd35c-kube-api-access-nl88g\") on node \"crc\" DevicePath \"\"" Mar 20 08:48:44 crc kubenswrapper[4903]: I0320 08:48:44.426358 4903 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5dc309ae-dfeb-4e11-8e1d-38092f5cd35c-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 08:48:44 crc kubenswrapper[4903]: I0320 08:48:44.437371 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5dc309ae-dfeb-4e11-8e1d-38092f5cd35c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5dc309ae-dfeb-4e11-8e1d-38092f5cd35c" (UID: "5dc309ae-dfeb-4e11-8e1d-38092f5cd35c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:48:44 crc kubenswrapper[4903]: I0320 08:48:44.451321 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5dc309ae-dfeb-4e11-8e1d-38092f5cd35c-config-data" (OuterVolumeSpecName: "config-data") pod "5dc309ae-dfeb-4e11-8e1d-38092f5cd35c" (UID: "5dc309ae-dfeb-4e11-8e1d-38092f5cd35c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:48:44 crc kubenswrapper[4903]: I0320 08:48:44.528336 4903 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5dc309ae-dfeb-4e11-8e1d-38092f5cd35c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 08:48:44 crc kubenswrapper[4903]: I0320 08:48:44.528366 4903 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5dc309ae-dfeb-4e11-8e1d-38092f5cd35c-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 08:48:44 crc kubenswrapper[4903]: I0320 08:48:44.542598 4903 generic.go:334] "Generic (PLEG): container finished" podID="5dc309ae-dfeb-4e11-8e1d-38092f5cd35c" containerID="8ac4f76d6846d4293fc999559ae60157042bdc805dde913412c12c9f0a13e093" exitCode=0 Mar 20 08:48:44 crc kubenswrapper[4903]: I0320 08:48:44.542648 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5dc309ae-dfeb-4e11-8e1d-38092f5cd35c","Type":"ContainerDied","Data":"8ac4f76d6846d4293fc999559ae60157042bdc805dde913412c12c9f0a13e093"} Mar 20 08:48:44 crc kubenswrapper[4903]: I0320 08:48:44.542676 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5dc309ae-dfeb-4e11-8e1d-38092f5cd35c","Type":"ContainerDied","Data":"02a9344e9860a1ce34a64ddfe3c7e071338227bc413e35735d011e50b3a59398"} Mar 20 08:48:44 crc kubenswrapper[4903]: I0320 08:48:44.542696 4903 scope.go:117] "RemoveContainer" containerID="4a3c0d168af539f1ec3ad9ba0e7a7d0d885c1210cf98593d29c5fe99817b5ffa" Mar 20 08:48:44 crc kubenswrapper[4903]: I0320 08:48:44.542826 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 08:48:44 crc kubenswrapper[4903]: I0320 08:48:44.579531 4903 scope.go:117] "RemoveContainer" containerID="296df4348043f5cb09ea97b3475928761a16392205a12c43adb7e0b1d1f1b1d9" Mar 20 08:48:44 crc kubenswrapper[4903]: I0320 08:48:44.585142 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 08:48:44 crc kubenswrapper[4903]: I0320 08:48:44.609828 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 20 08:48:44 crc kubenswrapper[4903]: I0320 08:48:44.618420 4903 scope.go:117] "RemoveContainer" containerID="a3fe057c89e347539ddf6a16a72d532c998316a2be1ef484ab5352b9e6859368" Mar 20 08:48:44 crc kubenswrapper[4903]: I0320 08:48:44.663430 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 20 08:48:44 crc kubenswrapper[4903]: E0320 08:48:44.663985 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5dc309ae-dfeb-4e11-8e1d-38092f5cd35c" containerName="proxy-httpd" Mar 20 08:48:44 crc kubenswrapper[4903]: I0320 08:48:44.663999 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="5dc309ae-dfeb-4e11-8e1d-38092f5cd35c" containerName="proxy-httpd" Mar 20 08:48:44 crc kubenswrapper[4903]: E0320 08:48:44.664013 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3610fd9b-fd0a-4b22-8a26-27a393bf92a6" containerName="init" Mar 20 08:48:44 crc kubenswrapper[4903]: I0320 08:48:44.664021 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="3610fd9b-fd0a-4b22-8a26-27a393bf92a6" containerName="init" Mar 20 08:48:44 crc kubenswrapper[4903]: E0320 08:48:44.664090 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5dc309ae-dfeb-4e11-8e1d-38092f5cd35c" 
containerName="ceilometer-central-agent" Mar 20 08:48:44 crc kubenswrapper[4903]: I0320 08:48:44.664100 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="5dc309ae-dfeb-4e11-8e1d-38092f5cd35c" containerName="ceilometer-central-agent" Mar 20 08:48:44 crc kubenswrapper[4903]: E0320 08:48:44.664107 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3610fd9b-fd0a-4b22-8a26-27a393bf92a6" containerName="dnsmasq-dns" Mar 20 08:48:44 crc kubenswrapper[4903]: I0320 08:48:44.664113 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="3610fd9b-fd0a-4b22-8a26-27a393bf92a6" containerName="dnsmasq-dns" Mar 20 08:48:44 crc kubenswrapper[4903]: E0320 08:48:44.664128 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5dc309ae-dfeb-4e11-8e1d-38092f5cd35c" containerName="ceilometer-notification-agent" Mar 20 08:48:44 crc kubenswrapper[4903]: I0320 08:48:44.664134 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="5dc309ae-dfeb-4e11-8e1d-38092f5cd35c" containerName="ceilometer-notification-agent" Mar 20 08:48:44 crc kubenswrapper[4903]: E0320 08:48:44.664157 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5dc309ae-dfeb-4e11-8e1d-38092f5cd35c" containerName="sg-core" Mar 20 08:48:44 crc kubenswrapper[4903]: I0320 08:48:44.664164 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="5dc309ae-dfeb-4e11-8e1d-38092f5cd35c" containerName="sg-core" Mar 20 08:48:44 crc kubenswrapper[4903]: I0320 08:48:44.664319 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="3610fd9b-fd0a-4b22-8a26-27a393bf92a6" containerName="dnsmasq-dns" Mar 20 08:48:44 crc kubenswrapper[4903]: I0320 08:48:44.664328 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="5dc309ae-dfeb-4e11-8e1d-38092f5cd35c" containerName="ceilometer-central-agent" Mar 20 08:48:44 crc kubenswrapper[4903]: I0320 08:48:44.664349 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="5dc309ae-dfeb-4e11-8e1d-38092f5cd35c" containerName="sg-core" Mar 20 08:48:44 crc kubenswrapper[4903]: I0320 08:48:44.664367 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="5dc309ae-dfeb-4e11-8e1d-38092f5cd35c" containerName="proxy-httpd" Mar 20 08:48:44 crc kubenswrapper[4903]: I0320 08:48:44.664373 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="5dc309ae-dfeb-4e11-8e1d-38092f5cd35c" containerName="ceilometer-notification-agent" Mar 20 08:48:44 crc kubenswrapper[4903]: I0320 08:48:44.666174 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 08:48:44 crc kubenswrapper[4903]: I0320 08:48:44.670842 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 20 08:48:44 crc kubenswrapper[4903]: I0320 08:48:44.671107 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 20 08:48:44 crc kubenswrapper[4903]: I0320 08:48:44.671278 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 20 08:48:44 crc kubenswrapper[4903]: I0320 08:48:44.678298 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 08:48:44 crc kubenswrapper[4903]: I0320 08:48:44.695419 4903 scope.go:117] "RemoveContainer" containerID="8ac4f76d6846d4293fc999559ae60157042bdc805dde913412c12c9f0a13e093" Mar 20 08:48:44 crc kubenswrapper[4903]: I0320 08:48:44.733786 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8005d467-6a20-4e68-b62f-65ad97a31812-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8005d467-6a20-4e68-b62f-65ad97a31812\") " pod="openstack/ceilometer-0" Mar 20 08:48:44 crc kubenswrapper[4903]: I0320 08:48:44.733839 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8005d467-6a20-4e68-b62f-65ad97a31812-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8005d467-6a20-4e68-b62f-65ad97a31812\") " pod="openstack/ceilometer-0" Mar 20 08:48:44 crc kubenswrapper[4903]: I0320 08:48:44.733923 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8005d467-6a20-4e68-b62f-65ad97a31812-config-data\") pod \"ceilometer-0\" (UID: \"8005d467-6a20-4e68-b62f-65ad97a31812\") " pod="openstack/ceilometer-0" Mar 20 08:48:44 crc kubenswrapper[4903]: I0320 08:48:44.733976 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hcvkq\" (UniqueName: \"kubernetes.io/projected/8005d467-6a20-4e68-b62f-65ad97a31812-kube-api-access-hcvkq\") pod \"ceilometer-0\" (UID: \"8005d467-6a20-4e68-b62f-65ad97a31812\") " pod="openstack/ceilometer-0" Mar 20 08:48:44 crc kubenswrapper[4903]: I0320 08:48:44.733994 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/8005d467-6a20-4e68-b62f-65ad97a31812-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"8005d467-6a20-4e68-b62f-65ad97a31812\") " pod="openstack/ceilometer-0" Mar 20 08:48:44 crc kubenswrapper[4903]: I0320 08:48:44.734048 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8005d467-6a20-4e68-b62f-65ad97a31812-scripts\") pod \"ceilometer-0\" (UID: \"8005d467-6a20-4e68-b62f-65ad97a31812\") " pod="openstack/ceilometer-0" Mar 20 08:48:44 crc kubenswrapper[4903]: I0320 08:48:44.734074 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8005d467-6a20-4e68-b62f-65ad97a31812-run-httpd\") pod \"ceilometer-0\" (UID: \"8005d467-6a20-4e68-b62f-65ad97a31812\") " pod="openstack/ceilometer-0" Mar 20 08:48:44 crc kubenswrapper[4903]: 
I0320 08:48:44.734109 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8005d467-6a20-4e68-b62f-65ad97a31812-log-httpd\") pod \"ceilometer-0\" (UID: \"8005d467-6a20-4e68-b62f-65ad97a31812\") " pod="openstack/ceilometer-0" Mar 20 08:48:44 crc kubenswrapper[4903]: I0320 08:48:44.753518 4903 scope.go:117] "RemoveContainer" containerID="4a3c0d168af539f1ec3ad9ba0e7a7d0d885c1210cf98593d29c5fe99817b5ffa" Mar 20 08:48:44 crc kubenswrapper[4903]: E0320 08:48:44.754084 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4a3c0d168af539f1ec3ad9ba0e7a7d0d885c1210cf98593d29c5fe99817b5ffa\": container with ID starting with 4a3c0d168af539f1ec3ad9ba0e7a7d0d885c1210cf98593d29c5fe99817b5ffa not found: ID does not exist" containerID="4a3c0d168af539f1ec3ad9ba0e7a7d0d885c1210cf98593d29c5fe99817b5ffa" Mar 20 08:48:44 crc kubenswrapper[4903]: I0320 08:48:44.754131 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a3c0d168af539f1ec3ad9ba0e7a7d0d885c1210cf98593d29c5fe99817b5ffa"} err="failed to get container status \"4a3c0d168af539f1ec3ad9ba0e7a7d0d885c1210cf98593d29c5fe99817b5ffa\": rpc error: code = NotFound desc = could not find container \"4a3c0d168af539f1ec3ad9ba0e7a7d0d885c1210cf98593d29c5fe99817b5ffa\": container with ID starting with 4a3c0d168af539f1ec3ad9ba0e7a7d0d885c1210cf98593d29c5fe99817b5ffa not found: ID does not exist" Mar 20 08:48:44 crc kubenswrapper[4903]: I0320 08:48:44.754159 4903 scope.go:117] "RemoveContainer" containerID="296df4348043f5cb09ea97b3475928761a16392205a12c43adb7e0b1d1f1b1d9" Mar 20 08:48:44 crc kubenswrapper[4903]: E0320 08:48:44.754632 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"296df4348043f5cb09ea97b3475928761a16392205a12c43adb7e0b1d1f1b1d9\": container with ID starting with 296df4348043f5cb09ea97b3475928761a16392205a12c43adb7e0b1d1f1b1d9 not found: ID does not exist" containerID="296df4348043f5cb09ea97b3475928761a16392205a12c43adb7e0b1d1f1b1d9" Mar 20 08:48:44 crc kubenswrapper[4903]: I0320 08:48:44.754683 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"296df4348043f5cb09ea97b3475928761a16392205a12c43adb7e0b1d1f1b1d9"} err="failed to get container status \"296df4348043f5cb09ea97b3475928761a16392205a12c43adb7e0b1d1f1b1d9\": rpc error: code = NotFound desc = could not find container \"296df4348043f5cb09ea97b3475928761a16392205a12c43adb7e0b1d1f1b1d9\": container with ID starting with 296df4348043f5cb09ea97b3475928761a16392205a12c43adb7e0b1d1f1b1d9 not found: ID does not exist" Mar 20 08:48:44 crc kubenswrapper[4903]: I0320 08:48:44.754713 4903 scope.go:117] "RemoveContainer" containerID="a3fe057c89e347539ddf6a16a72d532c998316a2be1ef484ab5352b9e6859368" Mar 20 08:48:44 crc kubenswrapper[4903]: E0320 08:48:44.754972 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a3fe057c89e347539ddf6a16a72d532c998316a2be1ef484ab5352b9e6859368\": container with ID starting with a3fe057c89e347539ddf6a16a72d532c998316a2be1ef484ab5352b9e6859368 not found: ID does not exist" containerID="a3fe057c89e347539ddf6a16a72d532c998316a2be1ef484ab5352b9e6859368" Mar 20 08:48:44 crc kubenswrapper[4903]: I0320 08:48:44.755008 4903 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a3fe057c89e347539ddf6a16a72d532c998316a2be1ef484ab5352b9e6859368"} err="failed to get container status \"a3fe057c89e347539ddf6a16a72d532c998316a2be1ef484ab5352b9e6859368\": rpc error: code = NotFound desc = could not find container \"a3fe057c89e347539ddf6a16a72d532c998316a2be1ef484ab5352b9e6859368\": container with ID starting with a3fe057c89e347539ddf6a16a72d532c998316a2be1ef484ab5352b9e6859368 not found: ID does not exist" Mar 20 08:48:44 crc kubenswrapper[4903]: I0320 08:48:44.755073 4903 scope.go:117] "RemoveContainer" containerID="8ac4f76d6846d4293fc999559ae60157042bdc805dde913412c12c9f0a13e093" Mar 20 08:48:44 crc kubenswrapper[4903]: E0320 08:48:44.755458 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8ac4f76d6846d4293fc999559ae60157042bdc805dde913412c12c9f0a13e093\": container with ID starting with 8ac4f76d6846d4293fc999559ae60157042bdc805dde913412c12c9f0a13e093 not found: ID does not exist" containerID="8ac4f76d6846d4293fc999559ae60157042bdc805dde913412c12c9f0a13e093" Mar 20 08:48:44 crc kubenswrapper[4903]: I0320 08:48:44.755502 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ac4f76d6846d4293fc999559ae60157042bdc805dde913412c12c9f0a13e093"} err="failed to get container status \"8ac4f76d6846d4293fc999559ae60157042bdc805dde913412c12c9f0a13e093\": rpc error: code = NotFound desc = could not find container \"8ac4f76d6846d4293fc999559ae60157042bdc805dde913412c12c9f0a13e093\": container with ID starting with 8ac4f76d6846d4293fc999559ae60157042bdc805dde913412c12c9f0a13e093 not found: ID does not exist" Mar 20 08:48:44 crc kubenswrapper[4903]: I0320 08:48:44.835008 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8005d467-6a20-4e68-b62f-65ad97a31812-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8005d467-6a20-4e68-b62f-65ad97a31812\") " pod="openstack/ceilometer-0" Mar 20 08:48:44 crc kubenswrapper[4903]: I0320 08:48:44.835144 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8005d467-6a20-4e68-b62f-65ad97a31812-config-data\") pod \"ceilometer-0\" (UID: \"8005d467-6a20-4e68-b62f-65ad97a31812\") " pod="openstack/ceilometer-0" Mar 20 08:48:44 crc kubenswrapper[4903]: I0320 08:48:44.835992 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hcvkq\" (UniqueName: \"kubernetes.io/projected/8005d467-6a20-4e68-b62f-65ad97a31812-kube-api-access-hcvkq\") pod \"ceilometer-0\" (UID: \"8005d467-6a20-4e68-b62f-65ad97a31812\") " pod="openstack/ceilometer-0" Mar 20 08:48:44 crc kubenswrapper[4903]: I0320 08:48:44.836052 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/8005d467-6a20-4e68-b62f-65ad97a31812-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"8005d467-6a20-4e68-b62f-65ad97a31812\") " pod="openstack/ceilometer-0" Mar 20 08:48:44 crc kubenswrapper[4903]: I0320 08:48:44.836084 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8005d467-6a20-4e68-b62f-65ad97a31812-scripts\") pod \"ceilometer-0\" (UID: \"8005d467-6a20-4e68-b62f-65ad97a31812\") " pod="openstack/ceilometer-0" Mar 20 08:48:44 crc 
kubenswrapper[4903]: I0320 08:48:44.836110 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8005d467-6a20-4e68-b62f-65ad97a31812-run-httpd\") pod \"ceilometer-0\" (UID: \"8005d467-6a20-4e68-b62f-65ad97a31812\") " pod="openstack/ceilometer-0" Mar 20 08:48:44 crc kubenswrapper[4903]: I0320 08:48:44.836418 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8005d467-6a20-4e68-b62f-65ad97a31812-log-httpd\") pod \"ceilometer-0\" (UID: \"8005d467-6a20-4e68-b62f-65ad97a31812\") " pod="openstack/ceilometer-0" Mar 20 08:48:44 crc kubenswrapper[4903]: I0320 08:48:44.836446 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8005d467-6a20-4e68-b62f-65ad97a31812-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8005d467-6a20-4e68-b62f-65ad97a31812\") " pod="openstack/ceilometer-0" Mar 20 08:48:44 crc kubenswrapper[4903]: I0320 08:48:44.836784 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8005d467-6a20-4e68-b62f-65ad97a31812-log-httpd\") pod \"ceilometer-0\" (UID: \"8005d467-6a20-4e68-b62f-65ad97a31812\") " pod="openstack/ceilometer-0" Mar 20 08:48:44 crc kubenswrapper[4903]: I0320 08:48:44.836921 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8005d467-6a20-4e68-b62f-65ad97a31812-run-httpd\") pod \"ceilometer-0\" (UID: \"8005d467-6a20-4e68-b62f-65ad97a31812\") " pod="openstack/ceilometer-0" Mar 20 08:48:44 crc kubenswrapper[4903]: I0320 08:48:44.839534 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8005d467-6a20-4e68-b62f-65ad97a31812-config-data\") pod \"ceilometer-0\" (UID: \"8005d467-6a20-4e68-b62f-65ad97a31812\") " pod="openstack/ceilometer-0" Mar 20 08:48:44 crc kubenswrapper[4903]: I0320 08:48:44.840008 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8005d467-6a20-4e68-b62f-65ad97a31812-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8005d467-6a20-4e68-b62f-65ad97a31812\") " pod="openstack/ceilometer-0" Mar 20 08:48:44 crc kubenswrapper[4903]: I0320 08:48:44.840229 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8005d467-6a20-4e68-b62f-65ad97a31812-scripts\") pod \"ceilometer-0\" (UID: \"8005d467-6a20-4e68-b62f-65ad97a31812\") " pod="openstack/ceilometer-0" Mar 20 08:48:44 crc kubenswrapper[4903]: I0320 08:48:44.843278 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8005d467-6a20-4e68-b62f-65ad97a31812-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8005d467-6a20-4e68-b62f-65ad97a31812\") " pod="openstack/ceilometer-0" Mar 20 08:48:44 crc kubenswrapper[4903]: I0320 08:48:44.844030 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/8005d467-6a20-4e68-b62f-65ad97a31812-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"8005d467-6a20-4e68-b62f-65ad97a31812\") " pod="openstack/ceilometer-0" Mar 20 08:48:44 crc kubenswrapper[4903]: I0320 08:48:44.854183 4903 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-hcvkq\" (UniqueName: \"kubernetes.io/projected/8005d467-6a20-4e68-b62f-65ad97a31812-kube-api-access-hcvkq\") pod \"ceilometer-0\" (UID: \"8005d467-6a20-4e68-b62f-65ad97a31812\") " pod="openstack/ceilometer-0" Mar 20 08:48:45 crc kubenswrapper[4903]: I0320 08:48:45.018373 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 08:48:45 crc kubenswrapper[4903]: I0320 08:48:45.514861 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5dc309ae-dfeb-4e11-8e1d-38092f5cd35c" path="/var/lib/kubelet/pods/5dc309ae-dfeb-4e11-8e1d-38092f5cd35c/volumes" Mar 20 08:48:45 crc kubenswrapper[4903]: I0320 08:48:45.559125 4903 generic.go:334] "Generic (PLEG): container finished" podID="7c102604-23bc-49f8-96ce-821603a4f4bf" containerID="59e15fc31b0ce96747c05d477541c7c6a3fe03d93487ef381bf0b660eb845438" exitCode=0 Mar 20 08:48:45 crc kubenswrapper[4903]: I0320 08:48:45.559238 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-bff6c" event={"ID":"7c102604-23bc-49f8-96ce-821603a4f4bf","Type":"ContainerDied","Data":"59e15fc31b0ce96747c05d477541c7c6a3fe03d93487ef381bf0b660eb845438"} Mar 20 08:48:45 crc kubenswrapper[4903]: I0320 08:48:45.563077 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 08:48:45 crc kubenswrapper[4903]: I0320 08:48:45.993949 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 20 08:48:45 crc kubenswrapper[4903]: I0320 08:48:45.994238 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 20 08:48:46 crc kubenswrapper[4903]: I0320 08:48:46.574248 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8005d467-6a20-4e68-b62f-65ad97a31812","Type":"ContainerStarted","Data":"a8e86b614ac7f5338fb12a670fd38985e9a2c2f7e4f4a111a46686a3317b5790"} Mar 20 08:48:46 crc kubenswrapper[4903]: I0320 08:48:46.574623 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8005d467-6a20-4e68-b62f-65ad97a31812","Type":"ContainerStarted","Data":"8fff4d5535d1d940f4f4c451c78082ae32c553a62eb876a001727b4ad53184ed"} Mar 20 08:48:46 crc kubenswrapper[4903]: I0320 08:48:46.956431 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-bff6c" Mar 20 08:48:46 crc kubenswrapper[4903]: I0320 08:48:46.994659 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c102604-23bc-49f8-96ce-821603a4f4bf-config-data\") pod \"7c102604-23bc-49f8-96ce-821603a4f4bf\" (UID: \"7c102604-23bc-49f8-96ce-821603a4f4bf\") " Mar 20 08:48:46 crc kubenswrapper[4903]: I0320 08:48:46.994756 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hgl7w\" (UniqueName: \"kubernetes.io/projected/7c102604-23bc-49f8-96ce-821603a4f4bf-kube-api-access-hgl7w\") pod \"7c102604-23bc-49f8-96ce-821603a4f4bf\" (UID: \"7c102604-23bc-49f8-96ce-821603a4f4bf\") " Mar 20 08:48:46 crc kubenswrapper[4903]: I0320 08:48:46.994853 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c102604-23bc-49f8-96ce-821603a4f4bf-combined-ca-bundle\") pod \"7c102604-23bc-49f8-96ce-821603a4f4bf\" (UID: \"7c102604-23bc-49f8-96ce-821603a4f4bf\") " Mar 20 08:48:46 crc kubenswrapper[4903]: I0320 08:48:46.995096 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7c102604-23bc-49f8-96ce-821603a4f4bf-scripts\") pod \"7c102604-23bc-49f8-96ce-821603a4f4bf\" (UID: \"7c102604-23bc-49f8-96ce-821603a4f4bf\") " Mar 20 08:48:46 crc kubenswrapper[4903]: I0320 08:48:46.999650 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c102604-23bc-49f8-96ce-821603a4f4bf-kube-api-access-hgl7w" (OuterVolumeSpecName: "kube-api-access-hgl7w") pod "7c102604-23bc-49f8-96ce-821603a4f4bf" (UID: "7c102604-23bc-49f8-96ce-821603a4f4bf"). InnerVolumeSpecName "kube-api-access-hgl7w". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:48:47 crc kubenswrapper[4903]: I0320 08:48:47.001378 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c102604-23bc-49f8-96ce-821603a4f4bf-scripts" (OuterVolumeSpecName: "scripts") pod "7c102604-23bc-49f8-96ce-821603a4f4bf" (UID: "7c102604-23bc-49f8-96ce-821603a4f4bf"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:48:47 crc kubenswrapper[4903]: I0320 08:48:47.048985 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c102604-23bc-49f8-96ce-821603a4f4bf-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7c102604-23bc-49f8-96ce-821603a4f4bf" (UID: "7c102604-23bc-49f8-96ce-821603a4f4bf"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:48:47 crc kubenswrapper[4903]: I0320 08:48:47.050229 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c102604-23bc-49f8-96ce-821603a4f4bf-config-data" (OuterVolumeSpecName: "config-data") pod "7c102604-23bc-49f8-96ce-821603a4f4bf" (UID: "7c102604-23bc-49f8-96ce-821603a4f4bf"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:48:47 crc kubenswrapper[4903]: I0320 08:48:47.097426 4903 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7c102604-23bc-49f8-96ce-821603a4f4bf-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 08:48:47 crc kubenswrapper[4903]: I0320 08:48:47.097464 4903 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c102604-23bc-49f8-96ce-821603a4f4bf-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 08:48:47 crc kubenswrapper[4903]: I0320 08:48:47.097479 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hgl7w\" (UniqueName: \"kubernetes.io/projected/7c102604-23bc-49f8-96ce-821603a4f4bf-kube-api-access-hgl7w\") on node \"crc\" DevicePath \"\"" Mar 20 08:48:47 crc kubenswrapper[4903]: I0320 08:48:47.097494 4903 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c102604-23bc-49f8-96ce-821603a4f4bf-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 08:48:47 crc kubenswrapper[4903]: I0320 08:48:47.590408 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8005d467-6a20-4e68-b62f-65ad97a31812","Type":"ContainerStarted","Data":"d273844594da563a28c607f77733e2141f72daa05b9f296500d2582410c90a0f"} Mar 20 08:48:47 crc kubenswrapper[4903]: I0320 08:48:47.611515 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-bff6c" event={"ID":"7c102604-23bc-49f8-96ce-821603a4f4bf","Type":"ContainerDied","Data":"56b7d7caa9dc3d25393038bf63e820841e38c5caf018ff24fc5d062541e7ad4c"} Mar 20 08:48:47 crc kubenswrapper[4903]: I0320 08:48:47.611559 4903 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="56b7d7caa9dc3d25393038bf63e820841e38c5caf018ff24fc5d062541e7ad4c" Mar 20 08:48:47 crc kubenswrapper[4903]: I0320 08:48:47.611618 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-bff6c" Mar 20 08:48:47 crc kubenswrapper[4903]: I0320 08:48:47.809114 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 20 08:48:47 crc kubenswrapper[4903]: I0320 08:48:47.809484 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="dcf64dcf-b799-4f37-84e1-8c43e1737259" containerName="nova-api-log" containerID="cri-o://19c1f8d6b511d157c4a1e0834a7519d075b046b3695440c2e89a70e69d8aa869" gracePeriod=30 Mar 20 08:48:47 crc kubenswrapper[4903]: I0320 08:48:47.810014 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="dcf64dcf-b799-4f37-84e1-8c43e1737259" containerName="nova-api-api" containerID="cri-o://55bc30453b83133c12c9be27eb1e0f973d4aab338bcc76a4bf676a23e6ba8c11" gracePeriod=30 Mar 20 08:48:47 crc kubenswrapper[4903]: I0320 08:48:47.834803 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 08:48:47 crc kubenswrapper[4903]: I0320 08:48:47.835019 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="700d5965-30b6-4fad-b808-f7a4ed433b9b" containerName="nova-scheduler-scheduler" containerID="cri-o://2f93dc3100cd795c4bd2b908b58ffbe7ef35750a55a2c66ca4362ceb9135ce69" gracePeriod=30 Mar 20 08:48:47 crc kubenswrapper[4903]: I0320 08:48:47.868943 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 08:48:47 crc kubenswrapper[4903]: I0320 08:48:47.869198 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="1ab04638-79a1-46f3-9e67-ba51ee0d12f7" containerName="nova-metadata-log" containerID="cri-o://893f4f037f00b8782f44d856f0e857221584b18084b413a1e6ec553fd201d280" gracePeriod=30 Mar 20 08:48:47 crc kubenswrapper[4903]: I0320 08:48:47.869597 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="1ab04638-79a1-46f3-9e67-ba51ee0d12f7" containerName="nova-metadata-metadata" containerID="cri-o://14949e33b95ecdd2c2e91d7b0e4b10764e454f61d7ee8c8c552168f6a689d854" gracePeriod=30 Mar 20 08:48:48 crc kubenswrapper[4903]: I0320 08:48:48.484009 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 20 08:48:48 crc kubenswrapper[4903]: I0320 08:48:48.530052 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/dcf64dcf-b799-4f37-84e1-8c43e1737259-internal-tls-certs\") pod \"dcf64dcf-b799-4f37-84e1-8c43e1737259\" (UID: \"dcf64dcf-b799-4f37-84e1-8c43e1737259\") " Mar 20 08:48:48 crc kubenswrapper[4903]: I0320 08:48:48.530121 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dcf64dcf-b799-4f37-84e1-8c43e1737259-logs\") pod \"dcf64dcf-b799-4f37-84e1-8c43e1737259\" (UID: \"dcf64dcf-b799-4f37-84e1-8c43e1737259\") " Mar 20 08:48:48 crc kubenswrapper[4903]: I0320 08:48:48.530158 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-54ljh\" (UniqueName: \"kubernetes.io/projected/dcf64dcf-b799-4f37-84e1-8c43e1737259-kube-api-access-54ljh\") pod \"dcf64dcf-b799-4f37-84e1-8c43e1737259\" (UID: \"dcf64dcf-b799-4f37-84e1-8c43e1737259\") " Mar 20 08:48:48 crc kubenswrapper[4903]: I0320 08:48:48.530244 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dcf64dcf-b799-4f37-84e1-8c43e1737259-combined-ca-bundle\") pod \"dcf64dcf-b799-4f37-84e1-8c43e1737259\" (UID: \"dcf64dcf-b799-4f37-84e1-8c43e1737259\") " Mar 20 08:48:48 crc kubenswrapper[4903]: I0320 08:48:48.530268 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dcf64dcf-b799-4f37-84e1-8c43e1737259-public-tls-certs\") pod \"dcf64dcf-b799-4f37-84e1-8c43e1737259\" (UID: \"dcf64dcf-b799-4f37-84e1-8c43e1737259\") " Mar 20 08:48:48 crc kubenswrapper[4903]: I0320 08:48:48.530300 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dcf64dcf-b799-4f37-84e1-8c43e1737259-config-data\") pod \"dcf64dcf-b799-4f37-84e1-8c43e1737259\" (UID: \"dcf64dcf-b799-4f37-84e1-8c43e1737259\") " Mar 20 08:48:48 crc kubenswrapper[4903]: I0320 08:48:48.530494 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dcf64dcf-b799-4f37-84e1-8c43e1737259-logs" (OuterVolumeSpecName: "logs") pod "dcf64dcf-b799-4f37-84e1-8c43e1737259" (UID: "dcf64dcf-b799-4f37-84e1-8c43e1737259"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 08:48:48 crc kubenswrapper[4903]: I0320 08:48:48.530572 4903 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dcf64dcf-b799-4f37-84e1-8c43e1737259-logs\") on node \"crc\" DevicePath \"\"" Mar 20 08:48:48 crc kubenswrapper[4903]: I0320 08:48:48.547413 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dcf64dcf-b799-4f37-84e1-8c43e1737259-kube-api-access-54ljh" (OuterVolumeSpecName: "kube-api-access-54ljh") pod "dcf64dcf-b799-4f37-84e1-8c43e1737259" (UID: "dcf64dcf-b799-4f37-84e1-8c43e1737259"). InnerVolumeSpecName "kube-api-access-54ljh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:48:48 crc kubenswrapper[4903]: I0320 08:48:48.572135 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dcf64dcf-b799-4f37-84e1-8c43e1737259-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dcf64dcf-b799-4f37-84e1-8c43e1737259" (UID: "dcf64dcf-b799-4f37-84e1-8c43e1737259"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:48:48 crc kubenswrapper[4903]: I0320 08:48:48.580691 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dcf64dcf-b799-4f37-84e1-8c43e1737259-config-data" (OuterVolumeSpecName: "config-data") pod "dcf64dcf-b799-4f37-84e1-8c43e1737259" (UID: "dcf64dcf-b799-4f37-84e1-8c43e1737259"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:48:48 crc kubenswrapper[4903]: I0320 08:48:48.629027 4903 generic.go:334] "Generic (PLEG): container finished" podID="dcf64dcf-b799-4f37-84e1-8c43e1737259" containerID="55bc30453b83133c12c9be27eb1e0f973d4aab338bcc76a4bf676a23e6ba8c11" exitCode=0 Mar 20 08:48:48 crc kubenswrapper[4903]: I0320 08:48:48.629081 4903 generic.go:334] "Generic (PLEG): container finished" podID="dcf64dcf-b799-4f37-84e1-8c43e1737259" containerID="19c1f8d6b511d157c4a1e0834a7519d075b046b3695440c2e89a70e69d8aa869" exitCode=143 Mar 20 08:48:48 crc kubenswrapper[4903]: I0320 08:48:48.629111 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 20 08:48:48 crc kubenswrapper[4903]: I0320 08:48:48.630246 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"dcf64dcf-b799-4f37-84e1-8c43e1737259","Type":"ContainerDied","Data":"55bc30453b83133c12c9be27eb1e0f973d4aab338bcc76a4bf676a23e6ba8c11"} Mar 20 08:48:48 crc kubenswrapper[4903]: I0320 08:48:48.630389 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"dcf64dcf-b799-4f37-84e1-8c43e1737259","Type":"ContainerDied","Data":"19c1f8d6b511d157c4a1e0834a7519d075b046b3695440c2e89a70e69d8aa869"} Mar 20 08:48:48 crc kubenswrapper[4903]: I0320 08:48:48.630483 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"dcf64dcf-b799-4f37-84e1-8c43e1737259","Type":"ContainerDied","Data":"4a20856b6fa5d9f3a5521b4a0695bc759e2b080d098f1425e7febe689fdc7aea"} Mar 20 08:48:48 crc kubenswrapper[4903]: I0320 08:48:48.630446 4903 scope.go:117] "RemoveContainer" containerID="55bc30453b83133c12c9be27eb1e0f973d4aab338bcc76a4bf676a23e6ba8c11" Mar 20 08:48:48 crc kubenswrapper[4903]: I0320 08:48:48.633604 4903 generic.go:334] "Generic (PLEG): container finished" podID="1ab04638-79a1-46f3-9e67-ba51ee0d12f7" containerID="893f4f037f00b8782f44d856f0e857221584b18084b413a1e6ec553fd201d280" exitCode=143 Mar 20 08:48:48 crc kubenswrapper[4903]: I0320 08:48:48.633678 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"1ab04638-79a1-46f3-9e67-ba51ee0d12f7","Type":"ContainerDied","Data":"893f4f037f00b8782f44d856f0e857221584b18084b413a1e6ec553fd201d280"} Mar 20 08:48:48 crc kubenswrapper[4903]: I0320 08:48:48.634593 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dcf64dcf-b799-4f37-84e1-8c43e1737259-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "dcf64dcf-b799-4f37-84e1-8c43e1737259" (UID: 
"dcf64dcf-b799-4f37-84e1-8c43e1737259"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:48:48 crc kubenswrapper[4903]: I0320 08:48:48.635778 4903 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dcf64dcf-b799-4f37-84e1-8c43e1737259-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 08:48:48 crc kubenswrapper[4903]: I0320 08:48:48.636069 4903 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/dcf64dcf-b799-4f37-84e1-8c43e1737259-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 08:48:48 crc kubenswrapper[4903]: I0320 08:48:48.636413 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-54ljh\" (UniqueName: \"kubernetes.io/projected/dcf64dcf-b799-4f37-84e1-8c43e1737259-kube-api-access-54ljh\") on node \"crc\" DevicePath \"\"" Mar 20 08:48:48 crc kubenswrapper[4903]: I0320 08:48:48.636492 4903 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dcf64dcf-b799-4f37-84e1-8c43e1737259-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 08:48:48 crc kubenswrapper[4903]: I0320 08:48:48.637625 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8005d467-6a20-4e68-b62f-65ad97a31812","Type":"ContainerStarted","Data":"3eb344be279e7505b3a1ee366d11602b2711ebf94ecb9c7d600238fd0ec65c58"} Mar 20 08:48:48 crc kubenswrapper[4903]: I0320 08:48:48.659506 4903 scope.go:117] "RemoveContainer" containerID="19c1f8d6b511d157c4a1e0834a7519d075b046b3695440c2e89a70e69d8aa869" Mar 20 08:48:48 crc kubenswrapper[4903]: I0320 08:48:48.661804 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dcf64dcf-b799-4f37-84e1-8c43e1737259-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "dcf64dcf-b799-4f37-84e1-8c43e1737259" (UID: "dcf64dcf-b799-4f37-84e1-8c43e1737259"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:48:48 crc kubenswrapper[4903]: I0320 08:48:48.682468 4903 scope.go:117] "RemoveContainer" containerID="55bc30453b83133c12c9be27eb1e0f973d4aab338bcc76a4bf676a23e6ba8c11" Mar 20 08:48:48 crc kubenswrapper[4903]: E0320 08:48:48.683002 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"55bc30453b83133c12c9be27eb1e0f973d4aab338bcc76a4bf676a23e6ba8c11\": container with ID starting with 55bc30453b83133c12c9be27eb1e0f973d4aab338bcc76a4bf676a23e6ba8c11 not found: ID does not exist" containerID="55bc30453b83133c12c9be27eb1e0f973d4aab338bcc76a4bf676a23e6ba8c11" Mar 20 08:48:48 crc kubenswrapper[4903]: I0320 08:48:48.683058 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55bc30453b83133c12c9be27eb1e0f973d4aab338bcc76a4bf676a23e6ba8c11"} err="failed to get container status \"55bc30453b83133c12c9be27eb1e0f973d4aab338bcc76a4bf676a23e6ba8c11\": rpc error: code = NotFound desc = could not find container \"55bc30453b83133c12c9be27eb1e0f973d4aab338bcc76a4bf676a23e6ba8c11\": container with ID starting with 55bc30453b83133c12c9be27eb1e0f973d4aab338bcc76a4bf676a23e6ba8c11 not found: ID does not exist" Mar 20 08:48:48 crc kubenswrapper[4903]: I0320 08:48:48.683085 4903 scope.go:117] "RemoveContainer" containerID="19c1f8d6b511d157c4a1e0834a7519d075b046b3695440c2e89a70e69d8aa869" Mar 20 08:48:48 crc kubenswrapper[4903]: E0320 08:48:48.684133 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"19c1f8d6b511d157c4a1e0834a7519d075b046b3695440c2e89a70e69d8aa869\": container with ID starting with 19c1f8d6b511d157c4a1e0834a7519d075b046b3695440c2e89a70e69d8aa869 not found: ID does not exist" containerID="19c1f8d6b511d157c4a1e0834a7519d075b046b3695440c2e89a70e69d8aa869" Mar 20 08:48:48 crc kubenswrapper[4903]: I0320 08:48:48.684166 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"19c1f8d6b511d157c4a1e0834a7519d075b046b3695440c2e89a70e69d8aa869"} err="failed to get container status \"19c1f8d6b511d157c4a1e0834a7519d075b046b3695440c2e89a70e69d8aa869\": rpc error: code = NotFound desc = could not find container \"19c1f8d6b511d157c4a1e0834a7519d075b046b3695440c2e89a70e69d8aa869\": container with ID starting with 19c1f8d6b511d157c4a1e0834a7519d075b046b3695440c2e89a70e69d8aa869 not found: ID does not exist" Mar 20 08:48:48 crc kubenswrapper[4903]: I0320 08:48:48.684181 4903 scope.go:117] "RemoveContainer" containerID="55bc30453b83133c12c9be27eb1e0f973d4aab338bcc76a4bf676a23e6ba8c11" Mar 20 08:48:48 crc kubenswrapper[4903]: I0320 08:48:48.685025 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55bc30453b83133c12c9be27eb1e0f973d4aab338bcc76a4bf676a23e6ba8c11"} err="failed to get container status \"55bc30453b83133c12c9be27eb1e0f973d4aab338bcc76a4bf676a23e6ba8c11\": rpc error: code = NotFound desc = could not find container \"55bc30453b83133c12c9be27eb1e0f973d4aab338bcc76a4bf676a23e6ba8c11\": container with ID starting with 55bc30453b83133c12c9be27eb1e0f973d4aab338bcc76a4bf676a23e6ba8c11 not found: ID does not exist" Mar 20 08:48:48 crc kubenswrapper[4903]: I0320 08:48:48.685064 4903 scope.go:117] "RemoveContainer" containerID="19c1f8d6b511d157c4a1e0834a7519d075b046b3695440c2e89a70e69d8aa869" Mar 20 08:48:48 crc kubenswrapper[4903]: I0320 08:48:48.685633 4903 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"19c1f8d6b511d157c4a1e0834a7519d075b046b3695440c2e89a70e69d8aa869"} err="failed to get container status \"19c1f8d6b511d157c4a1e0834a7519d075b046b3695440c2e89a70e69d8aa869\": rpc error: code = NotFound desc = could not find container \"19c1f8d6b511d157c4a1e0834a7519d075b046b3695440c2e89a70e69d8aa869\": container with ID starting with 19c1f8d6b511d157c4a1e0834a7519d075b046b3695440c2e89a70e69d8aa869 not found: ID does not exist" Mar 20 08:48:48 crc kubenswrapper[4903]: I0320 08:48:48.738337 4903 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dcf64dcf-b799-4f37-84e1-8c43e1737259-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 08:48:48 crc kubenswrapper[4903]: I0320 08:48:48.973194 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 20 08:48:48 crc kubenswrapper[4903]: I0320 08:48:48.989903 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 20 08:48:49 crc kubenswrapper[4903]: I0320 08:48:49.011473 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 20 08:48:49 crc kubenswrapper[4903]: E0320 08:48:49.011999 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dcf64dcf-b799-4f37-84e1-8c43e1737259" containerName="nova-api-log" Mar 20 08:48:49 crc kubenswrapper[4903]: I0320 08:48:49.012014 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="dcf64dcf-b799-4f37-84e1-8c43e1737259" containerName="nova-api-log" Mar 20 08:48:49 crc kubenswrapper[4903]: E0320 08:48:49.012022 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c102604-23bc-49f8-96ce-821603a4f4bf" containerName="nova-manage" Mar 20 08:48:49 crc kubenswrapper[4903]: I0320 08:48:49.012031 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c102604-23bc-49f8-96ce-821603a4f4bf" containerName="nova-manage" Mar 20 08:48:49 crc kubenswrapper[4903]: E0320 08:48:49.012061 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dcf64dcf-b799-4f37-84e1-8c43e1737259" containerName="nova-api-api" Mar 20 08:48:49 crc kubenswrapper[4903]: I0320 08:48:49.012069 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="dcf64dcf-b799-4f37-84e1-8c43e1737259" containerName="nova-api-api" Mar 20 08:48:49 crc kubenswrapper[4903]: I0320 08:48:49.012267 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c102604-23bc-49f8-96ce-821603a4f4bf" containerName="nova-manage" Mar 20 08:48:49 crc kubenswrapper[4903]: I0320 08:48:49.012276 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="dcf64dcf-b799-4f37-84e1-8c43e1737259" containerName="nova-api-log" Mar 20 08:48:49 crc kubenswrapper[4903]: I0320 08:48:49.012292 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="dcf64dcf-b799-4f37-84e1-8c43e1737259" containerName="nova-api-api" Mar 20 08:48:49 crc kubenswrapper[4903]: I0320 08:48:49.013436 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 20 08:48:49 crc kubenswrapper[4903]: I0320 08:48:49.018193 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Mar 20 08:48:49 crc kubenswrapper[4903]: I0320 08:48:49.019453 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Mar 20 08:48:49 crc kubenswrapper[4903]: I0320 08:48:49.026121 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 20 08:48:49 crc kubenswrapper[4903]: I0320 08:48:49.046083 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/877f943b-808c-435e-a5cf-bda8ea0a5d15-internal-tls-certs\") pod \"nova-api-0\" (UID: \"877f943b-808c-435e-a5cf-bda8ea0a5d15\") " pod="openstack/nova-api-0" Mar 20 08:48:49 crc kubenswrapper[4903]: I0320 08:48:49.046554 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-842f2\" (UniqueName: \"kubernetes.io/projected/877f943b-808c-435e-a5cf-bda8ea0a5d15-kube-api-access-842f2\") pod \"nova-api-0\" (UID: \"877f943b-808c-435e-a5cf-bda8ea0a5d15\") " pod="openstack/nova-api-0" Mar 20 08:48:49 crc kubenswrapper[4903]: I0320 08:48:49.046718 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/877f943b-808c-435e-a5cf-bda8ea0a5d15-logs\") pod \"nova-api-0\" (UID: \"877f943b-808c-435e-a5cf-bda8ea0a5d15\") " pod="openstack/nova-api-0" Mar 20 08:48:49 crc kubenswrapper[4903]: I0320 08:48:49.048546 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/877f943b-808c-435e-a5cf-bda8ea0a5d15-config-data\") pod \"nova-api-0\" (UID: \"877f943b-808c-435e-a5cf-bda8ea0a5d15\") " pod="openstack/nova-api-0" Mar 20 08:48:49 crc kubenswrapper[4903]: I0320 08:48:49.048688 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/877f943b-808c-435e-a5cf-bda8ea0a5d15-public-tls-certs\") pod \"nova-api-0\" (UID: \"877f943b-808c-435e-a5cf-bda8ea0a5d15\") " pod="openstack/nova-api-0" Mar 20 08:48:49 crc kubenswrapper[4903]: I0320 08:48:49.048915 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/877f943b-808c-435e-a5cf-bda8ea0a5d15-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"877f943b-808c-435e-a5cf-bda8ea0a5d15\") " pod="openstack/nova-api-0" Mar 20 08:48:49 crc kubenswrapper[4903]: I0320 08:48:49.076670 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 20 08:48:49 crc kubenswrapper[4903]: I0320 08:48:49.154181 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/877f943b-808c-435e-a5cf-bda8ea0a5d15-logs\") pod \"nova-api-0\" (UID: \"877f943b-808c-435e-a5cf-bda8ea0a5d15\") " pod="openstack/nova-api-0" Mar 20 08:48:49 crc kubenswrapper[4903]: I0320 08:48:49.154241 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/877f943b-808c-435e-a5cf-bda8ea0a5d15-config-data\") pod \"nova-api-0\" (UID: 
\"877f943b-808c-435e-a5cf-bda8ea0a5d15\") " pod="openstack/nova-api-0" Mar 20 08:48:49 crc kubenswrapper[4903]: I0320 08:48:49.154272 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/877f943b-808c-435e-a5cf-bda8ea0a5d15-public-tls-certs\") pod \"nova-api-0\" (UID: \"877f943b-808c-435e-a5cf-bda8ea0a5d15\") " pod="openstack/nova-api-0" Mar 20 08:48:49 crc kubenswrapper[4903]: I0320 08:48:49.154348 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/877f943b-808c-435e-a5cf-bda8ea0a5d15-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"877f943b-808c-435e-a5cf-bda8ea0a5d15\") " pod="openstack/nova-api-0" Mar 20 08:48:49 crc kubenswrapper[4903]: I0320 08:48:49.154383 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/877f943b-808c-435e-a5cf-bda8ea0a5d15-internal-tls-certs\") pod \"nova-api-0\" (UID: \"877f943b-808c-435e-a5cf-bda8ea0a5d15\") " pod="openstack/nova-api-0" Mar 20 08:48:49 crc kubenswrapper[4903]: I0320 08:48:49.154429 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-842f2\" (UniqueName: \"kubernetes.io/projected/877f943b-808c-435e-a5cf-bda8ea0a5d15-kube-api-access-842f2\") pod \"nova-api-0\" (UID: \"877f943b-808c-435e-a5cf-bda8ea0a5d15\") " pod="openstack/nova-api-0" Mar 20 08:48:49 crc kubenswrapper[4903]: I0320 08:48:49.154694 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/877f943b-808c-435e-a5cf-bda8ea0a5d15-logs\") pod \"nova-api-0\" (UID: \"877f943b-808c-435e-a5cf-bda8ea0a5d15\") " pod="openstack/nova-api-0" Mar 20 08:48:49 crc kubenswrapper[4903]: I0320 08:48:49.158812 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/877f943b-808c-435e-a5cf-bda8ea0a5d15-public-tls-certs\") pod \"nova-api-0\" (UID: \"877f943b-808c-435e-a5cf-bda8ea0a5d15\") " pod="openstack/nova-api-0" Mar 20 08:48:49 crc kubenswrapper[4903]: I0320 08:48:49.163267 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/877f943b-808c-435e-a5cf-bda8ea0a5d15-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"877f943b-808c-435e-a5cf-bda8ea0a5d15\") " pod="openstack/nova-api-0" Mar 20 08:48:49 crc kubenswrapper[4903]: I0320 08:48:49.163540 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/877f943b-808c-435e-a5cf-bda8ea0a5d15-internal-tls-certs\") pod \"nova-api-0\" (UID: \"877f943b-808c-435e-a5cf-bda8ea0a5d15\") " pod="openstack/nova-api-0" Mar 20 08:48:49 crc kubenswrapper[4903]: I0320 08:48:49.164595 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/877f943b-808c-435e-a5cf-bda8ea0a5d15-config-data\") pod \"nova-api-0\" (UID: \"877f943b-808c-435e-a5cf-bda8ea0a5d15\") " pod="openstack/nova-api-0" Mar 20 08:48:49 crc kubenswrapper[4903]: I0320 08:48:49.171067 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-842f2\" (UniqueName: \"kubernetes.io/projected/877f943b-808c-435e-a5cf-bda8ea0a5d15-kube-api-access-842f2\") pod \"nova-api-0\" (UID: \"877f943b-808c-435e-a5cf-bda8ea0a5d15\") " 
pod="openstack/nova-api-0" Mar 20 08:48:49 crc kubenswrapper[4903]: I0320 08:48:49.333548 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 20 08:48:49 crc kubenswrapper[4903]: E0320 08:48:49.371777 4903 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="2f93dc3100cd795c4bd2b908b58ffbe7ef35750a55a2c66ca4362ceb9135ce69" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 20 08:48:49 crc kubenswrapper[4903]: E0320 08:48:49.373076 4903 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="2f93dc3100cd795c4bd2b908b58ffbe7ef35750a55a2c66ca4362ceb9135ce69" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 20 08:48:49 crc kubenswrapper[4903]: E0320 08:48:49.374865 4903 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="2f93dc3100cd795c4bd2b908b58ffbe7ef35750a55a2c66ca4362ceb9135ce69" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 20 08:48:49 crc kubenswrapper[4903]: E0320 08:48:49.374911 4903 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="700d5965-30b6-4fad-b808-f7a4ed433b9b" containerName="nova-scheduler-scheduler" Mar 20 08:48:49 crc kubenswrapper[4903]: I0320 08:48:49.500535 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dcf64dcf-b799-4f37-84e1-8c43e1737259" path="/var/lib/kubelet/pods/dcf64dcf-b799-4f37-84e1-8c43e1737259/volumes" Mar 20 08:48:49 crc kubenswrapper[4903]: I0320 08:48:49.879305 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 20 08:48:50 crc kubenswrapper[4903]: I0320 08:48:50.662407 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"877f943b-808c-435e-a5cf-bda8ea0a5d15","Type":"ContainerStarted","Data":"fe164d29a102e1f393cadf6ed7a51abeefbf7ba5309bfd408ab8c2520f8f2829"} Mar 20 08:48:50 crc kubenswrapper[4903]: I0320 08:48:50.662800 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"877f943b-808c-435e-a5cf-bda8ea0a5d15","Type":"ContainerStarted","Data":"a2260ceafa26704cedbc21056c132c79bd5a3239f4a5a39d01b35eaa39143f86"} Mar 20 08:48:50 crc kubenswrapper[4903]: I0320 08:48:50.662843 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"877f943b-808c-435e-a5cf-bda8ea0a5d15","Type":"ContainerStarted","Data":"7df4dbd4badba2ed9e91ed3d3e81edb212a9e3001550c6c9a49714877d7a7017"} Mar 20 08:48:50 crc kubenswrapper[4903]: I0320 08:48:50.666482 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8005d467-6a20-4e68-b62f-65ad97a31812","Type":"ContainerStarted","Data":"fe129b49b118ce4ae8579680e5a1ff682ff67dcbd9c4f54a13ce05fb9eced8cd"} Mar 20 08:48:50 crc kubenswrapper[4903]: I0320 08:48:50.666707 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 20 08:48:50 crc kubenswrapper[4903]: I0320 
08:48:50.710921 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.710890661 podStartE2EDuration="2.710890661s" podCreationTimestamp="2026-03-20 08:48:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:48:50.695969845 +0000 UTC m=+1555.912870180" watchObservedRunningTime="2026-03-20 08:48:50.710890661 +0000 UTC m=+1555.927790976" Mar 20 08:48:50 crc kubenswrapper[4903]: I0320 08:48:50.833809 4903 patch_prober.go:28] interesting pod/machine-config-daemon-2ndsj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 08:48:50 crc kubenswrapper[4903]: I0320 08:48:50.833895 4903 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2ndsj" podUID="0e67af70-4211-4077-8f2b-0a00b8069e5a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 08:48:51 crc kubenswrapper[4903]: I0320 08:48:51.538964 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 08:48:51 crc kubenswrapper[4903]: I0320 08:48:51.565244 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.518763515 podStartE2EDuration="7.560818967s" podCreationTimestamp="2026-03-20 08:48:44 +0000 UTC" firstStartedPulling="2026-03-20 08:48:45.565631134 +0000 UTC m=+1550.782531459" lastFinishedPulling="2026-03-20 08:48:49.607686596 +0000 UTC m=+1554.824586911" observedRunningTime="2026-03-20 08:48:50.731764073 +0000 UTC m=+1555.948664388" watchObservedRunningTime="2026-03-20 08:48:51.560818967 +0000 UTC m=+1556.777719362" Mar 20 08:48:51 crc kubenswrapper[4903]: I0320 08:48:51.624924 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ttzs8\" (UniqueName: \"kubernetes.io/projected/1ab04638-79a1-46f3-9e67-ba51ee0d12f7-kube-api-access-ttzs8\") pod \"1ab04638-79a1-46f3-9e67-ba51ee0d12f7\" (UID: \"1ab04638-79a1-46f3-9e67-ba51ee0d12f7\") " Mar 20 08:48:51 crc kubenswrapper[4903]: I0320 08:48:51.624987 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1ab04638-79a1-46f3-9e67-ba51ee0d12f7-logs\") pod \"1ab04638-79a1-46f3-9e67-ba51ee0d12f7\" (UID: \"1ab04638-79a1-46f3-9e67-ba51ee0d12f7\") " Mar 20 08:48:51 crc kubenswrapper[4903]: I0320 08:48:51.625098 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ab04638-79a1-46f3-9e67-ba51ee0d12f7-combined-ca-bundle\") pod \"1ab04638-79a1-46f3-9e67-ba51ee0d12f7\" (UID: \"1ab04638-79a1-46f3-9e67-ba51ee0d12f7\") " Mar 20 08:48:51 crc kubenswrapper[4903]: I0320 08:48:51.625137 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ab04638-79a1-46f3-9e67-ba51ee0d12f7-config-data\") pod \"1ab04638-79a1-46f3-9e67-ba51ee0d12f7\" (UID: \"1ab04638-79a1-46f3-9e67-ba51ee0d12f7\") " Mar 20 08:48:51 crc kubenswrapper[4903]: I0320 08:48:51.625241 4903 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/1ab04638-79a1-46f3-9e67-ba51ee0d12f7-nova-metadata-tls-certs\") pod \"1ab04638-79a1-46f3-9e67-ba51ee0d12f7\" (UID: \"1ab04638-79a1-46f3-9e67-ba51ee0d12f7\") " Mar 20 08:48:51 crc kubenswrapper[4903]: I0320 08:48:51.626676 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1ab04638-79a1-46f3-9e67-ba51ee0d12f7-logs" (OuterVolumeSpecName: "logs") pod "1ab04638-79a1-46f3-9e67-ba51ee0d12f7" (UID: "1ab04638-79a1-46f3-9e67-ba51ee0d12f7"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 08:48:51 crc kubenswrapper[4903]: I0320 08:48:51.636739 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ab04638-79a1-46f3-9e67-ba51ee0d12f7-kube-api-access-ttzs8" (OuterVolumeSpecName: "kube-api-access-ttzs8") pod "1ab04638-79a1-46f3-9e67-ba51ee0d12f7" (UID: "1ab04638-79a1-46f3-9e67-ba51ee0d12f7"). InnerVolumeSpecName "kube-api-access-ttzs8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:48:51 crc kubenswrapper[4903]: I0320 08:48:51.674716 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ab04638-79a1-46f3-9e67-ba51ee0d12f7-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "1ab04638-79a1-46f3-9e67-ba51ee0d12f7" (UID: "1ab04638-79a1-46f3-9e67-ba51ee0d12f7"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:48:51 crc kubenswrapper[4903]: I0320 08:48:51.678218 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ab04638-79a1-46f3-9e67-ba51ee0d12f7-config-data" (OuterVolumeSpecName: "config-data") pod "1ab04638-79a1-46f3-9e67-ba51ee0d12f7" (UID: "1ab04638-79a1-46f3-9e67-ba51ee0d12f7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:48:51 crc kubenswrapper[4903]: I0320 08:48:51.682230 4903 generic.go:334] "Generic (PLEG): container finished" podID="1ab04638-79a1-46f3-9e67-ba51ee0d12f7" containerID="14949e33b95ecdd2c2e91d7b0e4b10764e454f61d7ee8c8c552168f6a689d854" exitCode=0 Mar 20 08:48:51 crc kubenswrapper[4903]: I0320 08:48:51.682331 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 08:48:51 crc kubenswrapper[4903]: I0320 08:48:51.682393 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"1ab04638-79a1-46f3-9e67-ba51ee0d12f7","Type":"ContainerDied","Data":"14949e33b95ecdd2c2e91d7b0e4b10764e454f61d7ee8c8c552168f6a689d854"} Mar 20 08:48:51 crc kubenswrapper[4903]: I0320 08:48:51.682459 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"1ab04638-79a1-46f3-9e67-ba51ee0d12f7","Type":"ContainerDied","Data":"da6942258e77637fc06615f76976803acb4a736bea98ee870e8a545e2e5cc449"} Mar 20 08:48:51 crc kubenswrapper[4903]: I0320 08:48:51.682491 4903 scope.go:117] "RemoveContainer" containerID="14949e33b95ecdd2c2e91d7b0e4b10764e454f61d7ee8c8c552168f6a689d854" Mar 20 08:48:51 crc kubenswrapper[4903]: I0320 08:48:51.695314 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ab04638-79a1-46f3-9e67-ba51ee0d12f7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1ab04638-79a1-46f3-9e67-ba51ee0d12f7" (UID: "1ab04638-79a1-46f3-9e67-ba51ee0d12f7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:48:51 crc kubenswrapper[4903]: I0320 08:48:51.727407 4903 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/1ab04638-79a1-46f3-9e67-ba51ee0d12f7-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 08:48:51 crc kubenswrapper[4903]: I0320 08:48:51.727445 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ttzs8\" (UniqueName: \"kubernetes.io/projected/1ab04638-79a1-46f3-9e67-ba51ee0d12f7-kube-api-access-ttzs8\") on node \"crc\" DevicePath \"\"" Mar 20 08:48:51 crc kubenswrapper[4903]: I0320 08:48:51.727466 4903 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1ab04638-79a1-46f3-9e67-ba51ee0d12f7-logs\") on node \"crc\" DevicePath \"\"" Mar 20 08:48:51 crc kubenswrapper[4903]: I0320 08:48:51.727484 4903 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ab04638-79a1-46f3-9e67-ba51ee0d12f7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 08:48:51 crc kubenswrapper[4903]: I0320 08:48:51.727500 4903 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ab04638-79a1-46f3-9e67-ba51ee0d12f7-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 08:48:51 crc kubenswrapper[4903]: I0320 08:48:51.754690 4903 scope.go:117] "RemoveContainer" containerID="893f4f037f00b8782f44d856f0e857221584b18084b413a1e6ec553fd201d280" Mar 20 08:48:51 crc kubenswrapper[4903]: I0320 08:48:51.772657 4903 scope.go:117] "RemoveContainer" containerID="14949e33b95ecdd2c2e91d7b0e4b10764e454f61d7ee8c8c552168f6a689d854" Mar 20 08:48:51 crc kubenswrapper[4903]: E0320 08:48:51.773077 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"14949e33b95ecdd2c2e91d7b0e4b10764e454f61d7ee8c8c552168f6a689d854\": container with ID starting with 14949e33b95ecdd2c2e91d7b0e4b10764e454f61d7ee8c8c552168f6a689d854 not found: ID does not exist" containerID="14949e33b95ecdd2c2e91d7b0e4b10764e454f61d7ee8c8c552168f6a689d854" Mar 20 08:48:51 crc kubenswrapper[4903]: I0320 08:48:51.773186 4903 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14949e33b95ecdd2c2e91d7b0e4b10764e454f61d7ee8c8c552168f6a689d854"} err="failed to get container status \"14949e33b95ecdd2c2e91d7b0e4b10764e454f61d7ee8c8c552168f6a689d854\": rpc error: code = NotFound desc = could not find container \"14949e33b95ecdd2c2e91d7b0e4b10764e454f61d7ee8c8c552168f6a689d854\": container with ID starting with 14949e33b95ecdd2c2e91d7b0e4b10764e454f61d7ee8c8c552168f6a689d854 not found: ID does not exist" Mar 20 08:48:51 crc kubenswrapper[4903]: I0320 08:48:51.773264 4903 scope.go:117] "RemoveContainer" containerID="893f4f037f00b8782f44d856f0e857221584b18084b413a1e6ec553fd201d280" Mar 20 08:48:51 crc kubenswrapper[4903]: E0320 08:48:51.773632 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"893f4f037f00b8782f44d856f0e857221584b18084b413a1e6ec553fd201d280\": container with ID starting with 893f4f037f00b8782f44d856f0e857221584b18084b413a1e6ec553fd201d280 not found: ID does not exist" containerID="893f4f037f00b8782f44d856f0e857221584b18084b413a1e6ec553fd201d280" Mar 20 08:48:51 crc kubenswrapper[4903]: I0320 08:48:51.773673 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"893f4f037f00b8782f44d856f0e857221584b18084b413a1e6ec553fd201d280"} err="failed to get container status \"893f4f037f00b8782f44d856f0e857221584b18084b413a1e6ec553fd201d280\": rpc error: code = NotFound desc = could not find container \"893f4f037f00b8782f44d856f0e857221584b18084b413a1e6ec553fd201d280\": container with ID starting with 893f4f037f00b8782f44d856f0e857221584b18084b413a1e6ec553fd201d280 not found: ID does not exist" Mar 20 08:48:52 crc kubenswrapper[4903]: I0320 08:48:52.053140 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 08:48:52 crc kubenswrapper[4903]: I0320 08:48:52.067433 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 08:48:52 crc kubenswrapper[4903]: I0320 08:48:52.089795 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 20 08:48:52 crc kubenswrapper[4903]: E0320 08:48:52.091254 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ab04638-79a1-46f3-9e67-ba51ee0d12f7" containerName="nova-metadata-metadata" Mar 20 08:48:52 crc kubenswrapper[4903]: I0320 08:48:52.091290 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ab04638-79a1-46f3-9e67-ba51ee0d12f7" containerName="nova-metadata-metadata" Mar 20 08:48:52 crc kubenswrapper[4903]: E0320 08:48:52.091343 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ab04638-79a1-46f3-9e67-ba51ee0d12f7" containerName="nova-metadata-log" Mar 20 08:48:52 crc kubenswrapper[4903]: I0320 08:48:52.091351 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ab04638-79a1-46f3-9e67-ba51ee0d12f7" containerName="nova-metadata-log" Mar 20 08:48:52 crc kubenswrapper[4903]: I0320 08:48:52.093427 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ab04638-79a1-46f3-9e67-ba51ee0d12f7" containerName="nova-metadata-metadata" Mar 20 08:48:52 crc kubenswrapper[4903]: I0320 08:48:52.093613 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ab04638-79a1-46f3-9e67-ba51ee0d12f7" containerName="nova-metadata-log" Mar 20 08:48:52 crc kubenswrapper[4903]: I0320 08:48:52.096074 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 08:48:52 crc kubenswrapper[4903]: I0320 08:48:52.102274 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 20 08:48:52 crc kubenswrapper[4903]: I0320 08:48:52.102837 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 20 08:48:52 crc kubenswrapper[4903]: I0320 08:48:52.118655 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 08:48:52 crc kubenswrapper[4903]: I0320 08:48:52.245083 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6x7wx\" (UniqueName: \"kubernetes.io/projected/7f5f160c-29e2-43d0-bb55-6969904b3a4e-kube-api-access-6x7wx\") pod \"nova-metadata-0\" (UID: \"7f5f160c-29e2-43d0-bb55-6969904b3a4e\") " pod="openstack/nova-metadata-0" Mar 20 08:48:52 crc kubenswrapper[4903]: I0320 08:48:52.245140 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f5f160c-29e2-43d0-bb55-6969904b3a4e-config-data\") pod \"nova-metadata-0\" (UID: \"7f5f160c-29e2-43d0-bb55-6969904b3a4e\") " pod="openstack/nova-metadata-0" Mar 20 08:48:52 crc kubenswrapper[4903]: I0320 08:48:52.245169 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/7f5f160c-29e2-43d0-bb55-6969904b3a4e-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"7f5f160c-29e2-43d0-bb55-6969904b3a4e\") " pod="openstack/nova-metadata-0" Mar 20 08:48:52 crc kubenswrapper[4903]: I0320 08:48:52.245198 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7f5f160c-29e2-43d0-bb55-6969904b3a4e-logs\") pod \"nova-metadata-0\" (UID: \"7f5f160c-29e2-43d0-bb55-6969904b3a4e\") " pod="openstack/nova-metadata-0" Mar 20 08:48:52 crc kubenswrapper[4903]: I0320 08:48:52.245291 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f5f160c-29e2-43d0-bb55-6969904b3a4e-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"7f5f160c-29e2-43d0-bb55-6969904b3a4e\") " pod="openstack/nova-metadata-0" Mar 20 08:48:52 crc kubenswrapper[4903]: I0320 08:48:52.347624 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/7f5f160c-29e2-43d0-bb55-6969904b3a4e-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"7f5f160c-29e2-43d0-bb55-6969904b3a4e\") " pod="openstack/nova-metadata-0" Mar 20 08:48:52 crc kubenswrapper[4903]: I0320 08:48:52.347990 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7f5f160c-29e2-43d0-bb55-6969904b3a4e-logs\") pod \"nova-metadata-0\" (UID: \"7f5f160c-29e2-43d0-bb55-6969904b3a4e\") " pod="openstack/nova-metadata-0" Mar 20 08:48:52 crc kubenswrapper[4903]: I0320 08:48:52.348055 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f5f160c-29e2-43d0-bb55-6969904b3a4e-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"7f5f160c-29e2-43d0-bb55-6969904b3a4e\") " 
pod="openstack/nova-metadata-0" Mar 20 08:48:52 crc kubenswrapper[4903]: I0320 08:48:52.348597 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7f5f160c-29e2-43d0-bb55-6969904b3a4e-logs\") pod \"nova-metadata-0\" (UID: \"7f5f160c-29e2-43d0-bb55-6969904b3a4e\") " pod="openstack/nova-metadata-0" Mar 20 08:48:52 crc kubenswrapper[4903]: I0320 08:48:52.348867 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6x7wx\" (UniqueName: \"kubernetes.io/projected/7f5f160c-29e2-43d0-bb55-6969904b3a4e-kube-api-access-6x7wx\") pod \"nova-metadata-0\" (UID: \"7f5f160c-29e2-43d0-bb55-6969904b3a4e\") " pod="openstack/nova-metadata-0" Mar 20 08:48:52 crc kubenswrapper[4903]: I0320 08:48:52.348971 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f5f160c-29e2-43d0-bb55-6969904b3a4e-config-data\") pod \"nova-metadata-0\" (UID: \"7f5f160c-29e2-43d0-bb55-6969904b3a4e\") " pod="openstack/nova-metadata-0" Mar 20 08:48:52 crc kubenswrapper[4903]: I0320 08:48:52.352875 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f5f160c-29e2-43d0-bb55-6969904b3a4e-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"7f5f160c-29e2-43d0-bb55-6969904b3a4e\") " pod="openstack/nova-metadata-0" Mar 20 08:48:52 crc kubenswrapper[4903]: I0320 08:48:52.355591 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f5f160c-29e2-43d0-bb55-6969904b3a4e-config-data\") pod \"nova-metadata-0\" (UID: \"7f5f160c-29e2-43d0-bb55-6969904b3a4e\") " pod="openstack/nova-metadata-0" Mar 20 08:48:52 crc kubenswrapper[4903]: I0320 08:48:52.374448 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/7f5f160c-29e2-43d0-bb55-6969904b3a4e-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"7f5f160c-29e2-43d0-bb55-6969904b3a4e\") " pod="openstack/nova-metadata-0" Mar 20 08:48:52 crc kubenswrapper[4903]: I0320 08:48:52.376008 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6x7wx\" (UniqueName: \"kubernetes.io/projected/7f5f160c-29e2-43d0-bb55-6969904b3a4e-kube-api-access-6x7wx\") pod \"nova-metadata-0\" (UID: \"7f5f160c-29e2-43d0-bb55-6969904b3a4e\") " pod="openstack/nova-metadata-0" Mar 20 08:48:52 crc kubenswrapper[4903]: I0320 08:48:52.472928 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 08:48:52 crc kubenswrapper[4903]: I0320 08:48:52.991507 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 08:48:52 crc kubenswrapper[4903]: W0320 08:48:52.991736 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7f5f160c_29e2_43d0_bb55_6969904b3a4e.slice/crio-01038cff4a0bccaa415590843a6df02825138f5efbcb14fc952063ab9e9f4fe6 WatchSource:0}: Error finding container 01038cff4a0bccaa415590843a6df02825138f5efbcb14fc952063ab9e9f4fe6: Status 404 returned error can't find the container with id 01038cff4a0bccaa415590843a6df02825138f5efbcb14fc952063ab9e9f4fe6 Mar 20 08:48:53 crc kubenswrapper[4903]: I0320 08:48:53.495008 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 20 08:48:53 crc kubenswrapper[4903]: I0320 08:48:53.506336 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1ab04638-79a1-46f3-9e67-ba51ee0d12f7" path="/var/lib/kubelet/pods/1ab04638-79a1-46f3-9e67-ba51ee0d12f7/volumes" Mar 20 08:48:53 crc kubenswrapper[4903]: I0320 08:48:53.580578 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/700d5965-30b6-4fad-b808-f7a4ed433b9b-combined-ca-bundle\") pod \"700d5965-30b6-4fad-b808-f7a4ed433b9b\" (UID: \"700d5965-30b6-4fad-b808-f7a4ed433b9b\") " Mar 20 08:48:53 crc kubenswrapper[4903]: I0320 08:48:53.580910 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/700d5965-30b6-4fad-b808-f7a4ed433b9b-config-data\") pod \"700d5965-30b6-4fad-b808-f7a4ed433b9b\" (UID: \"700d5965-30b6-4fad-b808-f7a4ed433b9b\") " Mar 20 08:48:53 crc kubenswrapper[4903]: I0320 08:48:53.581088 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fnl4s\" (UniqueName: \"kubernetes.io/projected/700d5965-30b6-4fad-b808-f7a4ed433b9b-kube-api-access-fnl4s\") pod \"700d5965-30b6-4fad-b808-f7a4ed433b9b\" (UID: \"700d5965-30b6-4fad-b808-f7a4ed433b9b\") " Mar 20 08:48:53 crc kubenswrapper[4903]: I0320 08:48:53.586663 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/700d5965-30b6-4fad-b808-f7a4ed433b9b-kube-api-access-fnl4s" (OuterVolumeSpecName: "kube-api-access-fnl4s") pod "700d5965-30b6-4fad-b808-f7a4ed433b9b" (UID: "700d5965-30b6-4fad-b808-f7a4ed433b9b"). InnerVolumeSpecName "kube-api-access-fnl4s". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:48:53 crc kubenswrapper[4903]: I0320 08:48:53.614734 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/700d5965-30b6-4fad-b808-f7a4ed433b9b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "700d5965-30b6-4fad-b808-f7a4ed433b9b" (UID: "700d5965-30b6-4fad-b808-f7a4ed433b9b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:48:53 crc kubenswrapper[4903]: I0320 08:48:53.616890 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/700d5965-30b6-4fad-b808-f7a4ed433b9b-config-data" (OuterVolumeSpecName: "config-data") pod "700d5965-30b6-4fad-b808-f7a4ed433b9b" (UID: "700d5965-30b6-4fad-b808-f7a4ed433b9b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:48:53 crc kubenswrapper[4903]: I0320 08:48:53.683449 4903 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/700d5965-30b6-4fad-b808-f7a4ed433b9b-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 08:48:53 crc kubenswrapper[4903]: I0320 08:48:53.683486 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fnl4s\" (UniqueName: \"kubernetes.io/projected/700d5965-30b6-4fad-b808-f7a4ed433b9b-kube-api-access-fnl4s\") on node \"crc\" DevicePath \"\"" Mar 20 08:48:53 crc kubenswrapper[4903]: I0320 08:48:53.683500 4903 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/700d5965-30b6-4fad-b808-f7a4ed433b9b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 08:48:53 crc kubenswrapper[4903]: I0320 08:48:53.727735 4903 generic.go:334] "Generic (PLEG): container finished" podID="700d5965-30b6-4fad-b808-f7a4ed433b9b" containerID="2f93dc3100cd795c4bd2b908b58ffbe7ef35750a55a2c66ca4362ceb9135ce69" exitCode=0 Mar 20 08:48:53 crc kubenswrapper[4903]: I0320 08:48:53.727840 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 20 08:48:53 crc kubenswrapper[4903]: I0320 08:48:53.727832 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"700d5965-30b6-4fad-b808-f7a4ed433b9b","Type":"ContainerDied","Data":"2f93dc3100cd795c4bd2b908b58ffbe7ef35750a55a2c66ca4362ceb9135ce69"} Mar 20 08:48:53 crc kubenswrapper[4903]: I0320 08:48:53.729078 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"700d5965-30b6-4fad-b808-f7a4ed433b9b","Type":"ContainerDied","Data":"7ffee58bae19f421994d8e8ea67b213e6fc2ddc1324a4d6fc04ad49a8ff92238"} Mar 20 08:48:53 crc kubenswrapper[4903]: I0320 08:48:53.729109 4903 scope.go:117] "RemoveContainer" containerID="2f93dc3100cd795c4bd2b908b58ffbe7ef35750a55a2c66ca4362ceb9135ce69" Mar 20 08:48:53 crc kubenswrapper[4903]: I0320 08:48:53.732522 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7f5f160c-29e2-43d0-bb55-6969904b3a4e","Type":"ContainerStarted","Data":"625122aba923f3f4672abe1d898433060719b38b2ba824aec7bed77ddaa609eb"} Mar 20 08:48:53 crc kubenswrapper[4903]: I0320 08:48:53.732620 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7f5f160c-29e2-43d0-bb55-6969904b3a4e","Type":"ContainerStarted","Data":"90c27df17f81d979977526ee2d95fcef8b351eceda695edad6f1dfd00a676827"} Mar 20 08:48:53 crc kubenswrapper[4903]: I0320 08:48:53.732657 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7f5f160c-29e2-43d0-bb55-6969904b3a4e","Type":"ContainerStarted","Data":"01038cff4a0bccaa415590843a6df02825138f5efbcb14fc952063ab9e9f4fe6"} Mar 20 08:48:53 crc kubenswrapper[4903]: I0320 08:48:53.777625 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=1.777590862 podStartE2EDuration="1.777590862s" podCreationTimestamp="2026-03-20 08:48:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:48:53.757625983 +0000 UTC m=+1558.974526298" watchObservedRunningTime="2026-03-20 08:48:53.777590862 +0000 UTC m=+1558.994491177" Mar 20 
08:48:53 crc kubenswrapper[4903]: I0320 08:48:53.778108 4903 scope.go:117] "RemoveContainer" containerID="2f93dc3100cd795c4bd2b908b58ffbe7ef35750a55a2c66ca4362ceb9135ce69" Mar 20 08:48:53 crc kubenswrapper[4903]: E0320 08:48:53.778610 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2f93dc3100cd795c4bd2b908b58ffbe7ef35750a55a2c66ca4362ceb9135ce69\": container with ID starting with 2f93dc3100cd795c4bd2b908b58ffbe7ef35750a55a2c66ca4362ceb9135ce69 not found: ID does not exist" containerID="2f93dc3100cd795c4bd2b908b58ffbe7ef35750a55a2c66ca4362ceb9135ce69" Mar 20 08:48:53 crc kubenswrapper[4903]: I0320 08:48:53.778676 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f93dc3100cd795c4bd2b908b58ffbe7ef35750a55a2c66ca4362ceb9135ce69"} err="failed to get container status \"2f93dc3100cd795c4bd2b908b58ffbe7ef35750a55a2c66ca4362ceb9135ce69\": rpc error: code = NotFound desc = could not find container \"2f93dc3100cd795c4bd2b908b58ffbe7ef35750a55a2c66ca4362ceb9135ce69\": container with ID starting with 2f93dc3100cd795c4bd2b908b58ffbe7ef35750a55a2c66ca4362ceb9135ce69 not found: ID does not exist" Mar 20 08:48:53 crc kubenswrapper[4903]: I0320 08:48:53.798723 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 08:48:53 crc kubenswrapper[4903]: I0320 08:48:53.810866 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 08:48:53 crc kubenswrapper[4903]: I0320 08:48:53.823285 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 08:48:53 crc kubenswrapper[4903]: E0320 08:48:53.823905 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="700d5965-30b6-4fad-b808-f7a4ed433b9b" containerName="nova-scheduler-scheduler" Mar 20 08:48:53 crc kubenswrapper[4903]: I0320 08:48:53.823932 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="700d5965-30b6-4fad-b808-f7a4ed433b9b" containerName="nova-scheduler-scheduler" Mar 20 08:48:53 crc kubenswrapper[4903]: I0320 08:48:53.824281 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="700d5965-30b6-4fad-b808-f7a4ed433b9b" containerName="nova-scheduler-scheduler" Mar 20 08:48:53 crc kubenswrapper[4903]: I0320 08:48:53.825180 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 20 08:48:53 crc kubenswrapper[4903]: I0320 08:48:53.836556 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 08:48:53 crc kubenswrapper[4903]: I0320 08:48:53.838619 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 20 08:48:53 crc kubenswrapper[4903]: I0320 08:48:53.887645 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc127483-5a42-4eea-8b8c-8a1382dced05-config-data\") pod \"nova-scheduler-0\" (UID: \"dc127483-5a42-4eea-8b8c-8a1382dced05\") " pod="openstack/nova-scheduler-0" Mar 20 08:48:53 crc kubenswrapper[4903]: I0320 08:48:53.887809 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc127483-5a42-4eea-8b8c-8a1382dced05-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"dc127483-5a42-4eea-8b8c-8a1382dced05\") " pod="openstack/nova-scheduler-0" Mar 20 08:48:53 crc kubenswrapper[4903]: I0320 08:48:53.887872 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2fdft\" (UniqueName: \"kubernetes.io/projected/dc127483-5a42-4eea-8b8c-8a1382dced05-kube-api-access-2fdft\") pod \"nova-scheduler-0\" (UID: \"dc127483-5a42-4eea-8b8c-8a1382dced05\") " pod="openstack/nova-scheduler-0" Mar 20 08:48:53 crc kubenswrapper[4903]: I0320 08:48:53.990024 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2fdft\" (UniqueName: \"kubernetes.io/projected/dc127483-5a42-4eea-8b8c-8a1382dced05-kube-api-access-2fdft\") pod \"nova-scheduler-0\" (UID: \"dc127483-5a42-4eea-8b8c-8a1382dced05\") " pod="openstack/nova-scheduler-0" Mar 20 08:48:53 crc kubenswrapper[4903]: I0320 08:48:53.990875 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc127483-5a42-4eea-8b8c-8a1382dced05-config-data\") pod \"nova-scheduler-0\" (UID: \"dc127483-5a42-4eea-8b8c-8a1382dced05\") " pod="openstack/nova-scheduler-0" Mar 20 08:48:53 crc kubenswrapper[4903]: I0320 08:48:53.991745 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc127483-5a42-4eea-8b8c-8a1382dced05-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"dc127483-5a42-4eea-8b8c-8a1382dced05\") " pod="openstack/nova-scheduler-0" Mar 20 08:48:53 crc kubenswrapper[4903]: I0320 08:48:53.994951 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc127483-5a42-4eea-8b8c-8a1382dced05-config-data\") pod \"nova-scheduler-0\" (UID: \"dc127483-5a42-4eea-8b8c-8a1382dced05\") " pod="openstack/nova-scheduler-0" Mar 20 08:48:53 crc kubenswrapper[4903]: I0320 08:48:53.995983 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc127483-5a42-4eea-8b8c-8a1382dced05-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"dc127483-5a42-4eea-8b8c-8a1382dced05\") " pod="openstack/nova-scheduler-0" Mar 20 08:48:54 crc kubenswrapper[4903]: I0320 08:48:54.006482 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2fdft\" (UniqueName: 
\"kubernetes.io/projected/dc127483-5a42-4eea-8b8c-8a1382dced05-kube-api-access-2fdft\") pod \"nova-scheduler-0\" (UID: \"dc127483-5a42-4eea-8b8c-8a1382dced05\") " pod="openstack/nova-scheduler-0" Mar 20 08:48:54 crc kubenswrapper[4903]: I0320 08:48:54.158466 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 20 08:48:54 crc kubenswrapper[4903]: I0320 08:48:54.683226 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 08:48:54 crc kubenswrapper[4903]: I0320 08:48:54.759427 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"dc127483-5a42-4eea-8b8c-8a1382dced05","Type":"ContainerStarted","Data":"f20195f10697104fa9e427b643b441c1dac8cbabe31b792bac4d224fd500ede0"} Mar 20 08:48:55 crc kubenswrapper[4903]: I0320 08:48:55.510160 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="700d5965-30b6-4fad-b808-f7a4ed433b9b" path="/var/lib/kubelet/pods/700d5965-30b6-4fad-b808-f7a4ed433b9b/volumes" Mar 20 08:48:55 crc kubenswrapper[4903]: I0320 08:48:55.772320 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"dc127483-5a42-4eea-8b8c-8a1382dced05","Type":"ContainerStarted","Data":"086baa0efbede9405a9d07461836e115d7cb27a7069334c37e8698d90f636ed7"} Mar 20 08:48:55 crc kubenswrapper[4903]: I0320 08:48:55.800727 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.800705829 podStartE2EDuration="2.800705829s" podCreationTimestamp="2026-03-20 08:48:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 08:48:55.793747408 +0000 UTC m=+1561.010647723" watchObservedRunningTime="2026-03-20 08:48:55.800705829 +0000 UTC m=+1561.017606144" Mar 20 08:48:58 crc kubenswrapper[4903]: I0320 08:48:58.534958 4903 scope.go:117] "RemoveContainer" containerID="d9556cda66aa8de78ade2a6e0fddee8f9a0c9432b0aad91daf1068df0400aaf0" Mar 20 08:48:59 crc kubenswrapper[4903]: I0320 08:48:59.159498 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 20 08:48:59 crc kubenswrapper[4903]: I0320 08:48:59.334670 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 20 08:48:59 crc kubenswrapper[4903]: I0320 08:48:59.334754 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 20 08:49:00 crc kubenswrapper[4903]: I0320 08:49:00.354363 4903 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="877f943b-808c-435e-a5cf-bda8ea0a5d15" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.212:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 20 08:49:00 crc kubenswrapper[4903]: I0320 08:49:00.354383 4903 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="877f943b-808c-435e-a5cf-bda8ea0a5d15" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.212:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 20 08:49:02 crc kubenswrapper[4903]: I0320 08:49:02.474310 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 20 08:49:02 crc kubenswrapper[4903]: I0320 
08:49:02.475652 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 20 08:49:03 crc kubenswrapper[4903]: I0320 08:49:03.493228 4903 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="7f5f160c-29e2-43d0-bb55-6969904b3a4e" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.213:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 20 08:49:03 crc kubenswrapper[4903]: I0320 08:49:03.493216 4903 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="7f5f160c-29e2-43d0-bb55-6969904b3a4e" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.213:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 20 08:49:04 crc kubenswrapper[4903]: I0320 08:49:04.159076 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 20 08:49:04 crc kubenswrapper[4903]: I0320 08:49:04.202175 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 20 08:49:04 crc kubenswrapper[4903]: I0320 08:49:04.933585 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 20 08:49:07 crc kubenswrapper[4903]: I0320 08:49:07.334379 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 20 08:49:07 crc kubenswrapper[4903]: I0320 08:49:07.334775 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 20 08:49:09 crc kubenswrapper[4903]: I0320 08:49:09.347056 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 20 08:49:09 crc kubenswrapper[4903]: I0320 08:49:09.349523 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 20 08:49:09 crc kubenswrapper[4903]: I0320 08:49:09.355448 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 20 08:49:09 crc kubenswrapper[4903]: I0320 08:49:09.967734 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 20 08:49:10 crc kubenswrapper[4903]: I0320 08:49:10.473954 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 20 08:49:10 crc kubenswrapper[4903]: I0320 08:49:10.474024 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 20 08:49:12 crc kubenswrapper[4903]: I0320 08:49:12.481280 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 20 08:49:12 crc kubenswrapper[4903]: I0320 08:49:12.482559 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 20 08:49:12 crc kubenswrapper[4903]: I0320 08:49:12.487430 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 20 08:49:13 crc kubenswrapper[4903]: I0320 08:49:13.006519 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 20 08:49:15 crc kubenswrapper[4903]: I0320 08:49:15.038814 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Mar 20 08:49:20 crc 
kubenswrapper[4903]: I0320 08:49:20.833746 4903 patch_prober.go:28] interesting pod/machine-config-daemon-2ndsj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 08:49:20 crc kubenswrapper[4903]: I0320 08:49:20.834740 4903 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2ndsj" podUID="0e67af70-4211-4077-8f2b-0a00b8069e5a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 08:49:20 crc kubenswrapper[4903]: I0320 08:49:20.834835 4903 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2ndsj" Mar 20 08:49:20 crc kubenswrapper[4903]: I0320 08:49:20.836199 4903 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6f3554d39d020685d0868b6a191ed76faf2b526f6e3fae809ae7af6a7a7f9269"} pod="openshift-machine-config-operator/machine-config-daemon-2ndsj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 08:49:20 crc kubenswrapper[4903]: I0320 08:49:20.836989 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2ndsj" podUID="0e67af70-4211-4077-8f2b-0a00b8069e5a" containerName="machine-config-daemon" containerID="cri-o://6f3554d39d020685d0868b6a191ed76faf2b526f6e3fae809ae7af6a7a7f9269" gracePeriod=600 Mar 20 08:49:21 crc kubenswrapper[4903]: I0320 08:49:21.094891 4903 generic.go:334] "Generic (PLEG): container finished" podID="0e67af70-4211-4077-8f2b-0a00b8069e5a" containerID="6f3554d39d020685d0868b6a191ed76faf2b526f6e3fae809ae7af6a7a7f9269" exitCode=0 Mar 20 08:49:21 crc kubenswrapper[4903]: I0320 08:49:21.094954 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2ndsj" event={"ID":"0e67af70-4211-4077-8f2b-0a00b8069e5a","Type":"ContainerDied","Data":"6f3554d39d020685d0868b6a191ed76faf2b526f6e3fae809ae7af6a7a7f9269"} Mar 20 08:49:21 crc kubenswrapper[4903]: I0320 08:49:21.095176 4903 scope.go:117] "RemoveContainer" containerID="7094e5f77c270dc626be780c469f09df2b6b5e5f309bca7fa5e8149bdd6f3199" Mar 20 08:49:22 crc kubenswrapper[4903]: I0320 08:49:22.112693 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2ndsj" event={"ID":"0e67af70-4211-4077-8f2b-0a00b8069e5a","Type":"ContainerStarted","Data":"574e7fb2455d2017063433881e6f792136a3cc4bdc3d53ada47e32dd67839ac9"} Mar 20 08:49:37 crc kubenswrapper[4903]: I0320 08:49:37.341249 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-25f9-account-create-update-6rrdk"] Mar 20 08:49:37 crc kubenswrapper[4903]: I0320 08:49:37.374167 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-25f9-account-create-update-6rrdk"] Mar 20 08:49:37 crc kubenswrapper[4903]: I0320 08:49:37.406608 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Mar 20 08:49:37 crc kubenswrapper[4903]: I0320 08:49:37.406894 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstackclient" 
podUID="4819c8dc-535a-4fb2-93ed-16eccdf8cd6c" containerName="openstackclient" containerID="cri-o://476fc647c47bf21b8cc8bf445e61751a911dafeb475bbe3668beb1384e865c9f" gracePeriod=2 Mar 20 08:49:37 crc kubenswrapper[4903]: I0320 08:49:37.436101 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Mar 20 08:49:37 crc kubenswrapper[4903]: I0320 08:49:37.507887 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="16277119-789c-4d6e-8965-1ab0080f0871" path="/var/lib/kubelet/pods/16277119-789c-4d6e-8965-1ab0080f0871/volumes" Mar 20 08:49:37 crc kubenswrapper[4903]: I0320 08:49:37.536157 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-25f9-account-create-update-9n49h"] Mar 20 08:49:37 crc kubenswrapper[4903]: E0320 08:49:37.536638 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4819c8dc-535a-4fb2-93ed-16eccdf8cd6c" containerName="openstackclient" Mar 20 08:49:37 crc kubenswrapper[4903]: I0320 08:49:37.536657 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="4819c8dc-535a-4fb2-93ed-16eccdf8cd6c" containerName="openstackclient" Mar 20 08:49:37 crc kubenswrapper[4903]: I0320 08:49:37.536854 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="4819c8dc-535a-4fb2-93ed-16eccdf8cd6c" containerName="openstackclient" Mar 20 08:49:37 crc kubenswrapper[4903]: I0320 08:49:37.537523 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-25f9-account-create-update-9n49h" Mar 20 08:49:37 crc kubenswrapper[4903]: I0320 08:49:37.540207 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Mar 20 08:49:37 crc kubenswrapper[4903]: I0320 08:49:37.566109 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-0a04-account-create-update-br8cg"] Mar 20 08:49:37 crc kubenswrapper[4903]: I0320 08:49:37.585112 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-0a04-account-create-update-br8cg"] Mar 20 08:49:37 crc kubenswrapper[4903]: I0320 08:49:37.597724 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-25f9-account-create-update-9n49h"] Mar 20 08:49:37 crc kubenswrapper[4903]: E0320 08:49:37.615416 4903 projected.go:263] Couldn't get secret openstack/swift-conf: secret "swift-conf" not found Mar 20 08:49:37 crc kubenswrapper[4903]: E0320 08:49:37.615451 4903 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: secret "swift-conf" not found Mar 20 08:49:37 crc kubenswrapper[4903]: E0320 08:49:37.615494 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ccedd84e-d0d0-40b8-812c-3a57b41aee98-etc-swift podName:ccedd84e-d0d0-40b8-812c-3a57b41aee98 nodeName:}" failed. No retries permitted until 2026-03-20 08:49:38.115475836 +0000 UTC m=+1603.332376151 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/ccedd84e-d0d0-40b8-812c-3a57b41aee98-etc-swift") pod "swift-storage-0" (UID: "ccedd84e-d0d0-40b8-812c-3a57b41aee98") : secret "swift-conf" not found Mar 20 08:49:37 crc kubenswrapper[4903]: I0320 08:49:37.665130 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 20 08:49:37 crc kubenswrapper[4903]: I0320 08:49:37.666207 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-nb-0" podUID="548096cf-0b33-4f2f-b8be-7d1ac859cf7c" containerName="openstack-network-exporter" containerID="cri-o://a4feda7340d51a69a0ade15a8a02097cea235438a9973df73914b9990363b816" gracePeriod=300 Mar 20 08:49:37 crc kubenswrapper[4903]: I0320 08:49:37.683803 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-87nxt"] Mar 20 08:49:37 crc kubenswrapper[4903]: I0320 08:49:37.685048 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-87nxt" Mar 20 08:49:37 crc kubenswrapper[4903]: I0320 08:49:37.701992 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-87nxt"] Mar 20 08:49:37 crc kubenswrapper[4903]: I0320 08:49:37.719427 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Mar 20 08:49:37 crc kubenswrapper[4903]: I0320 08:49:37.721385 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2dc6a13a-9844-4e28-93e6-45025f1385a9-operator-scripts\") pod \"nova-api-25f9-account-create-update-9n49h\" (UID: \"2dc6a13a-9844-4e28-93e6-45025f1385a9\") " pod="openstack/nova-api-25f9-account-create-update-9n49h" Mar 20 08:49:37 crc kubenswrapper[4903]: I0320 08:49:37.721843 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f8c76743-cd0d-48d8-940e-a5e750bd1fcc-operator-scripts\") pod \"root-account-create-update-87nxt\" (UID: \"f8c76743-cd0d-48d8-940e-a5e750bd1fcc\") " pod="openstack/root-account-create-update-87nxt" Mar 20 08:49:37 crc kubenswrapper[4903]: I0320 08:49:37.722117 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-75hxd\" (UniqueName: \"kubernetes.io/projected/2dc6a13a-9844-4e28-93e6-45025f1385a9-kube-api-access-75hxd\") pod \"nova-api-25f9-account-create-update-9n49h\" (UID: \"2dc6a13a-9844-4e28-93e6-45025f1385a9\") " pod="openstack/nova-api-25f9-account-create-update-9n49h" Mar 20 08:49:37 crc kubenswrapper[4903]: I0320 08:49:37.722217 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6rxm7\" (UniqueName: \"kubernetes.io/projected/f8c76743-cd0d-48d8-940e-a5e750bd1fcc-kube-api-access-6rxm7\") pod \"root-account-create-update-87nxt\" (UID: \"f8c76743-cd0d-48d8-940e-a5e750bd1fcc\") " pod="openstack/root-account-create-update-87nxt" Mar 20 08:49:37 crc kubenswrapper[4903]: I0320 08:49:37.751131 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-e26e-account-create-update-clwzq"] Mar 20 08:49:37 crc kubenswrapper[4903]: I0320 08:49:37.794886 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-e26e-account-create-update-clwzq"] 
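Note on the MountVolume retries recorded above: the kubelet keeps retrying the failed "etc-swift" projected-volume setup while the "swift-conf" secret is missing, and the logged durationBeforeRetry grows from 500ms here to 1s and then 2s for the same volume further down in this log, i.e. a doubling backoff. The following is a minimal Go sketch of that doubling-with-cap pattern only; it is an illustration under stated assumptions, not kubelet's actual nestedpendingoperations code, and the function name and the 2-minute cap are invented for the example.

package main

import (
	"fmt"
	"time"
)

// nextBackoff returns the initial delay on the first failure and then
// doubles the previous delay on each subsequent failure, up to a cap.
func nextBackoff(prev, initial, max time.Duration) time.Duration {
	if prev <= 0 {
		return initial
	}
	next := prev * 2
	if next > max {
		return max
	}
	return next
}

func main() {
	var delay time.Duration
	// Reproduces the progression visible in the log: 500ms, 1s, 2s, ...
	for attempt := 1; attempt <= 4; attempt++ {
		delay = nextBackoff(delay, 500*time.Millisecond, 2*time.Minute)
		fmt.Printf("retry %d scheduled after %v\n", attempt, delay)
	}
}

Running the sketch prints 500ms, 1s, 2s, 4s, matching the spacing of the "No retries permitted until ..." entries for the etc-swift volume in this log.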
Mar 20 08:49:37 crc kubenswrapper[4903]: I0320 08:49:37.824673 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-75hxd\" (UniqueName: \"kubernetes.io/projected/2dc6a13a-9844-4e28-93e6-45025f1385a9-kube-api-access-75hxd\") pod \"nova-api-25f9-account-create-update-9n49h\" (UID: \"2dc6a13a-9844-4e28-93e6-45025f1385a9\") " pod="openstack/nova-api-25f9-account-create-update-9n49h" Mar 20 08:49:37 crc kubenswrapper[4903]: I0320 08:49:37.824724 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6rxm7\" (UniqueName: \"kubernetes.io/projected/f8c76743-cd0d-48d8-940e-a5e750bd1fcc-kube-api-access-6rxm7\") pod \"root-account-create-update-87nxt\" (UID: \"f8c76743-cd0d-48d8-940e-a5e750bd1fcc\") " pod="openstack/root-account-create-update-87nxt" Mar 20 08:49:37 crc kubenswrapper[4903]: I0320 08:49:37.824766 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2dc6a13a-9844-4e28-93e6-45025f1385a9-operator-scripts\") pod \"nova-api-25f9-account-create-update-9n49h\" (UID: \"2dc6a13a-9844-4e28-93e6-45025f1385a9\") " pod="openstack/nova-api-25f9-account-create-update-9n49h" Mar 20 08:49:37 crc kubenswrapper[4903]: I0320 08:49:37.824815 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f8c76743-cd0d-48d8-940e-a5e750bd1fcc-operator-scripts\") pod \"root-account-create-update-87nxt\" (UID: \"f8c76743-cd0d-48d8-940e-a5e750bd1fcc\") " pod="openstack/root-account-create-update-87nxt" Mar 20 08:49:37 crc kubenswrapper[4903]: I0320 08:49:37.835867 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2dc6a13a-9844-4e28-93e6-45025f1385a9-operator-scripts\") pod \"nova-api-25f9-account-create-update-9n49h\" (UID: \"2dc6a13a-9844-4e28-93e6-45025f1385a9\") " pod="openstack/nova-api-25f9-account-create-update-9n49h" Mar 20 08:49:37 crc kubenswrapper[4903]: I0320 08:49:37.840987 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f8c76743-cd0d-48d8-940e-a5e750bd1fcc-operator-scripts\") pod \"root-account-create-update-87nxt\" (UID: \"f8c76743-cd0d-48d8-940e-a5e750bd1fcc\") " pod="openstack/root-account-create-update-87nxt" Mar 20 08:49:37 crc kubenswrapper[4903]: I0320 08:49:37.887689 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 20 08:49:37 crc kubenswrapper[4903]: I0320 08:49:37.944410 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6rxm7\" (UniqueName: \"kubernetes.io/projected/f8c76743-cd0d-48d8-940e-a5e750bd1fcc-kube-api-access-6rxm7\") pod \"root-account-create-update-87nxt\" (UID: \"f8c76743-cd0d-48d8-940e-a5e750bd1fcc\") " pod="openstack/root-account-create-update-87nxt" Mar 20 08:49:37 crc kubenswrapper[4903]: I0320 08:49:37.945134 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-75hxd\" (UniqueName: \"kubernetes.io/projected/2dc6a13a-9844-4e28-93e6-45025f1385a9-kube-api-access-75hxd\") pod \"nova-api-25f9-account-create-update-9n49h\" (UID: \"2dc6a13a-9844-4e28-93e6-45025f1385a9\") " pod="openstack/nova-api-25f9-account-create-update-9n49h" Mar 20 08:49:38 crc kubenswrapper[4903]: I0320 08:49:38.062093 4903 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openstack/ovsdbserver-nb-0" podUID="548096cf-0b33-4f2f-b8be-7d1ac859cf7c" containerName="ovsdbserver-nb" containerID="cri-o://f3da15fbb37c4bc9e79ae4139cd2ae68204d101f6f4da2f8431701cc743f6de0" gracePeriod=300 Mar 20 08:49:38 crc kubenswrapper[4903]: E0320 08:49:38.203017 4903 projected.go:263] Couldn't get secret openstack/swift-conf: secret "swift-conf" not found Mar 20 08:49:38 crc kubenswrapper[4903]: E0320 08:49:38.203409 4903 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: secret "swift-conf" not found Mar 20 08:49:38 crc kubenswrapper[4903]: E0320 08:49:38.203465 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ccedd84e-d0d0-40b8-812c-3a57b41aee98-etc-swift podName:ccedd84e-d0d0-40b8-812c-3a57b41aee98 nodeName:}" failed. No retries permitted until 2026-03-20 08:49:39.20344266 +0000 UTC m=+1604.420342975 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/ccedd84e-d0d0-40b8-812c-3a57b41aee98-etc-swift") pod "swift-storage-0" (UID: "ccedd84e-d0d0-40b8-812c-3a57b41aee98") : secret "swift-conf" not found Mar 20 08:49:38 crc kubenswrapper[4903]: I0320 08:49:38.204231 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-25f9-account-create-update-9n49h" Mar 20 08:49:38 crc kubenswrapper[4903]: I0320 08:49:38.215474 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-87nxt" Mar 20 08:49:38 crc kubenswrapper[4903]: I0320 08:49:38.219008 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-fb5f-account-create-update-dztfr"] Mar 20 08:49:38 crc kubenswrapper[4903]: I0320 08:49:38.250706 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-ngcnd"] Mar 20 08:49:38 crc kubenswrapper[4903]: I0320 08:49:38.295680 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-ngcnd"] Mar 20 08:49:38 crc kubenswrapper[4903]: E0320 08:49:38.318251 4903 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Mar 20 08:49:38 crc kubenswrapper[4903]: E0320 08:49:38.318318 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/df937948-08c4-447c-9450-07221ce76552-config-data podName:df937948-08c4-447c-9450-07221ce76552 nodeName:}" failed. No retries permitted until 2026-03-20 08:49:38.818296625 +0000 UTC m=+1604.035196940 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/df937948-08c4-447c-9450-07221ce76552-config-data") pod "rabbitmq-cell1-server-0" (UID: "df937948-08c4-447c-9450-07221ce76552") : configmap "rabbitmq-cell1-config-data" not found Mar 20 08:49:38 crc kubenswrapper[4903]: I0320 08:49:38.322791 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-fb5f-account-create-update-dztfr"] Mar 20 08:49:38 crc kubenswrapper[4903]: I0320 08:49:38.354397 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-f4rg2"] Mar 20 08:49:38 crc kubenswrapper[4903]: I0320 08:49:38.396945 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-f4rg2"] Mar 20 08:49:38 crc kubenswrapper[4903]: I0320 08:49:38.408905 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-d2f7-account-create-update-9w2np"] Mar 20 08:49:38 crc kubenswrapper[4903]: I0320 08:49:38.420580 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-northd-0"] Mar 20 08:49:38 crc kubenswrapper[4903]: I0320 08:49:38.421137 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-northd-0" podUID="06e43a58-7c5f-49ac-a1f4-f2eddfd28c6b" containerName="ovn-northd" containerID="cri-o://5704e6c6ea3db74005a8e1d1aeb869f6812338abc7af8b7e741fc45d5338477c" gracePeriod=30 Mar 20 08:49:38 crc kubenswrapper[4903]: I0320 08:49:38.421593 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-northd-0" podUID="06e43a58-7c5f-49ac-a1f4-f2eddfd28c6b" containerName="openstack-network-exporter" containerID="cri-o://7125bc754a8c0e626fcd2fe281119d9040b278b420956ad513958b106967fd43" gracePeriod=30 Mar 20 08:49:38 crc kubenswrapper[4903]: I0320 08:49:38.421669 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_548096cf-0b33-4f2f-b8be-7d1ac859cf7c/ovsdbserver-nb/0.log" Mar 20 08:49:38 crc kubenswrapper[4903]: I0320 08:49:38.421711 4903 generic.go:334] "Generic (PLEG): container finished" podID="548096cf-0b33-4f2f-b8be-7d1ac859cf7c" containerID="a4feda7340d51a69a0ade15a8a02097cea235438a9973df73914b9990363b816" exitCode=2 Mar 20 08:49:38 crc kubenswrapper[4903]: I0320 08:49:38.421731 4903 generic.go:334] "Generic (PLEG): container finished" podID="548096cf-0b33-4f2f-b8be-7d1ac859cf7c" containerID="f3da15fbb37c4bc9e79ae4139cd2ae68204d101f6f4da2f8431701cc743f6de0" exitCode=143 Mar 20 08:49:38 crc kubenswrapper[4903]: I0320 08:49:38.421751 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"548096cf-0b33-4f2f-b8be-7d1ac859cf7c","Type":"ContainerDied","Data":"a4feda7340d51a69a0ade15a8a02097cea235438a9973df73914b9990363b816"} Mar 20 08:49:38 crc kubenswrapper[4903]: I0320 08:49:38.421776 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"548096cf-0b33-4f2f-b8be-7d1ac859cf7c","Type":"ContainerDied","Data":"f3da15fbb37c4bc9e79ae4139cd2ae68204d101f6f4da2f8431701cc743f6de0"} Mar 20 08:49:38 crc kubenswrapper[4903]: I0320 08:49:38.452372 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-d2f7-account-create-update-9w2np"] Mar 20 08:49:38 crc kubenswrapper[4903]: I0320 08:49:38.487883 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-08ee-account-create-update-s27bz"] Mar 20 08:49:38 crc kubenswrapper[4903]: I0320 08:49:38.511321 4903 kubelet.go:2431] 
"SyncLoop REMOVE" source="api" pods=["openstack/placement-08ee-account-create-update-s27bz"] Mar 20 08:49:38 crc kubenswrapper[4903]: I0320 08:49:38.544518 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-72j48"] Mar 20 08:49:38 crc kubenswrapper[4903]: I0320 08:49:38.569181 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-bff6c"] Mar 20 08:49:38 crc kubenswrapper[4903]: I0320 08:49:38.584377 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-72j48"] Mar 20 08:49:38 crc kubenswrapper[4903]: I0320 08:49:38.611598 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-4e4e-account-create-update-5n652"] Mar 20 08:49:38 crc kubenswrapper[4903]: I0320 08:49:38.624249 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-bff6c"] Mar 20 08:49:38 crc kubenswrapper[4903]: I0320 08:49:38.672993 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-4e4e-account-create-update-5n652"] Mar 20 08:49:38 crc kubenswrapper[4903]: E0320 08:49:38.690326 4903 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f3da15fbb37c4bc9e79ae4139cd2ae68204d101f6f4da2f8431701cc743f6de0 is running failed: container process not found" containerID="f3da15fbb37c4bc9e79ae4139cd2ae68204d101f6f4da2f8431701cc743f6de0" cmd=["/usr/bin/pidof","ovsdb-server"] Mar 20 08:49:38 crc kubenswrapper[4903]: E0320 08:49:38.692178 4903 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f3da15fbb37c4bc9e79ae4139cd2ae68204d101f6f4da2f8431701cc743f6de0 is running failed: container process not found" containerID="f3da15fbb37c4bc9e79ae4139cd2ae68204d101f6f4da2f8431701cc743f6de0" cmd=["/usr/bin/pidof","ovsdb-server"] Mar 20 08:49:38 crc kubenswrapper[4903]: E0320 08:49:38.695964 4903 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f3da15fbb37c4bc9e79ae4139cd2ae68204d101f6f4da2f8431701cc743f6de0 is running failed: container process not found" containerID="f3da15fbb37c4bc9e79ae4139cd2ae68204d101f6f4da2f8431701cc743f6de0" cmd=["/usr/bin/pidof","ovsdb-server"] Mar 20 08:49:38 crc kubenswrapper[4903]: E0320 08:49:38.696006 4903 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f3da15fbb37c4bc9e79ae4139cd2ae68204d101f6f4da2f8431701cc743f6de0 is running failed: container process not found" probeType="Readiness" pod="openstack/ovsdbserver-nb-0" podUID="548096cf-0b33-4f2f-b8be-7d1ac859cf7c" containerName="ovsdbserver-nb" Mar 20 08:49:38 crc kubenswrapper[4903]: I0320 08:49:38.709565 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-xws82"] Mar 20 08:49:38 crc kubenswrapper[4903]: I0320 08:49:38.765469 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-xws82"] Mar 20 08:49:38 crc kubenswrapper[4903]: I0320 08:49:38.788149 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-z5s8l"] Mar 20 08:49:38 crc kubenswrapper[4903]: I0320 08:49:38.801384 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-z5s8l"] Mar 20 08:49:38 crc kubenswrapper[4903]: 
I0320 08:49:38.809084 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-ovs-chrhv"] Mar 20 08:49:38 crc kubenswrapper[4903]: I0320 08:49:38.849882 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-wdtrn"] Mar 20 08:49:38 crc kubenswrapper[4903]: E0320 08:49:38.852699 4903 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Mar 20 08:49:38 crc kubenswrapper[4903]: E0320 08:49:38.852767 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/df937948-08c4-447c-9450-07221ce76552-config-data podName:df937948-08c4-447c-9450-07221ce76552 nodeName:}" failed. No retries permitted until 2026-03-20 08:49:39.852751337 +0000 UTC m=+1605.069651652 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/df937948-08c4-447c-9450-07221ce76552-config-data") pod "rabbitmq-cell1-server-0" (UID: "df937948-08c4-447c-9450-07221ce76552") : configmap "rabbitmq-cell1-config-data" not found Mar 20 08:49:38 crc kubenswrapper[4903]: I0320 08:49:38.872274 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-metrics-ch9dc"] Mar 20 08:49:38 crc kubenswrapper[4903]: I0320 08:49:38.872583 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-metrics-ch9dc" podUID="f390d60f-9967-4869-b09f-3cea4570186e" containerName="openstack-network-exporter" containerID="cri-o://16ea7c452104939a759a6fd64f9392015afdf41df82ecdb14f0a860eb6f425c4" gracePeriod=30 Mar 20 08:49:38 crc kubenswrapper[4903]: I0320 08:49:38.890106 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-ntkfl"] Mar 20 08:49:38 crc kubenswrapper[4903]: I0320 08:49:38.898927 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-ntkfl"] Mar 20 08:49:38 crc kubenswrapper[4903]: I0320 08:49:38.905156 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-d8wd9"] Mar 20 08:49:38 crc kubenswrapper[4903]: I0320 08:49:38.912444 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-d8wd9"] Mar 20 08:49:38 crc kubenswrapper[4903]: I0320 08:49:38.925669 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 20 08:49:38 crc kubenswrapper[4903]: I0320 08:49:38.925887 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="9c2cbf77-9511-47a0-9bc7-dd5eba7d8d8a" containerName="cinder-scheduler" containerID="cri-o://cb7bb133830dc4bdac2cbea0a778366365297bc4c7bc623ae8334776f5496711" gracePeriod=30 Mar 20 08:49:38 crc kubenswrapper[4903]: I0320 08:49:38.926447 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="9c2cbf77-9511-47a0-9bc7-dd5eba7d8d8a" containerName="probe" containerID="cri-o://4cb95c6e5180c8a94f8474533f104c8ca5edf99bbf830ed9f71c73b944a44ab1" gracePeriod=30 Mar 20 08:49:38 crc kubenswrapper[4903]: I0320 08:49:38.929153 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-9bgcj"] Mar 20 08:49:38 crc kubenswrapper[4903]: I0320 08:49:38.929330 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-89c5cd4d5-9bgcj" podUID="b286f9de-1973-4c7f-9350-4d3c31f9c1fb" 
containerName="dnsmasq-dns" containerID="cri-o://1c1a4fb611dcbe8e6e5c2845b3c1260632a46a79e55826154d5b3126dc0b4a4e" gracePeriod=10 Mar 20 08:49:38 crc kubenswrapper[4903]: I0320 08:49:38.960675 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 20 08:49:38 crc kubenswrapper[4903]: I0320 08:49:38.961655 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-sb-0" podUID="8639b665-721c-4dda-afe9-6e84f6f8a574" containerName="openstack-network-exporter" containerID="cri-o://2c5aaed35251d97c58eb1ac9b7cc60388845eb3364a0b5ed8ae63369e67595d8" gracePeriod=300 Mar 20 08:49:39 crc kubenswrapper[4903]: I0320 08:49:39.005318 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 20 08:49:39 crc kubenswrapper[4903]: I0320 08:49:39.005695 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="50b5adcb-aed8-4cff-b3ec-02721df3937d" containerName="cinder-api-log" containerID="cri-o://bbc28588129fed5e832d9cf2c208bd4c746332410777ee79ad509494e640c235" gracePeriod=30 Mar 20 08:49:39 crc kubenswrapper[4903]: I0320 08:49:39.006387 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="50b5adcb-aed8-4cff-b3ec-02721df3937d" containerName="cinder-api" containerID="cri-o://dc39193e0b3efc7d58828ef8c691abe141cb7d978ac576bd15dc6776a511b5fe" gracePeriod=30 Mar 20 08:49:39 crc kubenswrapper[4903]: I0320 08:49:39.020205 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 20 08:49:39 crc kubenswrapper[4903]: I0320 08:49:39.050017 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 20 08:49:39 crc kubenswrapper[4903]: I0320 08:49:39.050775 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="877f943b-808c-435e-a5cf-bda8ea0a5d15" containerName="nova-api-log" containerID="cri-o://a2260ceafa26704cedbc21056c132c79bd5a3239f4a5a39d01b35eaa39143f86" gracePeriod=30 Mar 20 08:49:39 crc kubenswrapper[4903]: I0320 08:49:39.051279 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="877f943b-808c-435e-a5cf-bda8ea0a5d15" containerName="nova-api-api" containerID="cri-o://fe164d29a102e1f393cadf6ed7a51abeefbf7ba5309bfd408ab8c2520f8f2829" gracePeriod=30 Mar 20 08:49:39 crc kubenswrapper[4903]: I0320 08:49:39.095762 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 20 08:49:39 crc kubenswrapper[4903]: E0320 08:49:39.211471 4903 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Mar 20 08:49:39 crc kubenswrapper[4903]: E0320 08:49:39.211563 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/888a3fd9-01f8-47b3-b1bb-f2b8b6b96509-config-data podName:888a3fd9-01f8-47b3-b1bb-f2b8b6b96509 nodeName:}" failed. No retries permitted until 2026-03-20 08:49:39.711540674 +0000 UTC m=+1604.928440989 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/888a3fd9-01f8-47b3-b1bb-f2b8b6b96509-config-data") pod "rabbitmq-server-0" (UID: "888a3fd9-01f8-47b3-b1bb-f2b8b6b96509") : configmap "rabbitmq-config-data" not found Mar 20 08:49:39 crc kubenswrapper[4903]: E0320 08:49:39.211855 4903 projected.go:288] Couldn't get configMap openstack/swift-storage-config-data: configmap "swift-storage-config-data" not found Mar 20 08:49:39 crc kubenswrapper[4903]: E0320 08:49:39.211870 4903 projected.go:263] Couldn't get secret openstack/swift-conf: secret "swift-conf" not found Mar 20 08:49:39 crc kubenswrapper[4903]: E0320 08:49:39.211888 4903 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 20 08:49:39 crc kubenswrapper[4903]: E0320 08:49:39.211901 4903 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: [configmap "swift-storage-config-data" not found, secret "swift-conf" not found, configmap "swift-ring-files" not found] Mar 20 08:49:39 crc kubenswrapper[4903]: E0320 08:49:39.211933 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ccedd84e-d0d0-40b8-812c-3a57b41aee98-etc-swift podName:ccedd84e-d0d0-40b8-812c-3a57b41aee98 nodeName:}" failed. No retries permitted until 2026-03-20 08:49:41.211924703 +0000 UTC m=+1606.428825018 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/ccedd84e-d0d0-40b8-812c-3a57b41aee98-etc-swift") pod "swift-storage-0" (UID: "ccedd84e-d0d0-40b8-812c-3a57b41aee98") : [configmap "swift-storage-config-data" not found, secret "swift-conf" not found, configmap "swift-ring-files" not found] Mar 20 08:49:39 crc kubenswrapper[4903]: I0320 08:49:39.242789 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 08:49:39 crc kubenswrapper[4903]: I0320 08:49:39.243266 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="2c33b2cd-e705-41cd-9e59-3dcbb0a55829" containerName="glance-log" containerID="cri-o://e7046c98c3546d6698b9ba6c5237b460ed8efe52b7e38c2e607164140f59d3d5" gracePeriod=30 Mar 20 08:49:39 crc kubenswrapper[4903]: I0320 08:49:39.244283 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="2c33b2cd-e705-41cd-9e59-3dcbb0a55829" containerName="glance-httpd" containerID="cri-o://036996cd6559515dd00dacbdb33836ee1417dee1983310445b9597aed681a5db" gracePeriod=30 Mar 20 08:49:39 crc kubenswrapper[4903]: I0320 08:49:39.271383 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 08:49:39 crc kubenswrapper[4903]: I0320 08:49:39.271665 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="7f5f160c-29e2-43d0-bb55-6969904b3a4e" containerName="nova-metadata-log" containerID="cri-o://90c27df17f81d979977526ee2d95fcef8b351eceda695edad6f1dfd00a676827" gracePeriod=30 Mar 20 08:49:39 crc kubenswrapper[4903]: I0320 08:49:39.272137 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="7f5f160c-29e2-43d0-bb55-6969904b3a4e" containerName="nova-metadata-metadata" containerID="cri-o://625122aba923f3f4672abe1d898433060719b38b2ba824aec7bed77ddaa609eb" gracePeriod=30 Mar 20 08:49:39 crc 
kubenswrapper[4903]: I0320 08:49:39.302540 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-sb-0" podUID="8639b665-721c-4dda-afe9-6e84f6f8a574" containerName="ovsdbserver-sb" containerID="cri-o://81809a955a2f71afa7a77cb91e2146e2f4c354fdfdadbb9e1e6afb608b8a1b6a" gracePeriod=300 Mar 20 08:49:39 crc kubenswrapper[4903]: I0320 08:49:39.319187 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-5rbfs"] Mar 20 08:49:39 crc kubenswrapper[4903]: I0320 08:49:39.538265 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ddb7780-fcec-42d1-811e-5cc8a4169917" path="/var/lib/kubelet/pods/0ddb7780-fcec-42d1-811e-5cc8a4169917/volumes" Mar 20 08:49:39 crc kubenswrapper[4903]: I0320 08:49:39.539136 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="292fda17-d4a4-4bee-ba75-d3221d870f63" path="/var/lib/kubelet/pods/292fda17-d4a4-4bee-ba75-d3221d870f63/volumes" Mar 20 08:49:39 crc kubenswrapper[4903]: I0320 08:49:39.539684 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="68afaaf3-f790-4d52-9f29-49870d1950a5" path="/var/lib/kubelet/pods/68afaaf3-f790-4d52-9f29-49870d1950a5/volumes" Mar 20 08:49:39 crc kubenswrapper[4903]: I0320 08:49:39.540214 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6e210f8c-e29d-442c-a5eb-ec6b639b0275" path="/var/lib/kubelet/pods/6e210f8c-e29d-442c-a5eb-ec6b639b0275/volumes" Mar 20 08:49:39 crc kubenswrapper[4903]: I0320 08:49:39.542185 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ab050e7-f7f7-4a4a-ab49-b2601b269b48" path="/var/lib/kubelet/pods/7ab050e7-f7f7-4a4a-ab49-b2601b269b48/volumes" Mar 20 08:49:39 crc kubenswrapper[4903]: I0320 08:49:39.542688 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7c102604-23bc-49f8-96ce-821603a4f4bf" path="/var/lib/kubelet/pods/7c102604-23bc-49f8-96ce-821603a4f4bf/volumes" Mar 20 08:49:39 crc kubenswrapper[4903]: I0320 08:49:39.543198 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="83d25fdc-23fe-48a2-855e-5a907ad53d68" path="/var/lib/kubelet/pods/83d25fdc-23fe-48a2-855e-5a907ad53d68/volumes" Mar 20 08:49:39 crc kubenswrapper[4903]: I0320 08:49:39.544663 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8abfc827-bdd5-43e9-877c-c3d611fc463e" path="/var/lib/kubelet/pods/8abfc827-bdd5-43e9-877c-c3d611fc463e/volumes" Mar 20 08:49:39 crc kubenswrapper[4903]: I0320 08:49:39.545192 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="975f9731-683e-4d5e-be2a-e1f824c38513" path="/var/lib/kubelet/pods/975f9731-683e-4d5e-be2a-e1f824c38513/volumes" Mar 20 08:49:39 crc kubenswrapper[4903]: I0320 08:49:39.545691 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="be07fcc9-d6b5-4551-8846-94aa14b6af5d" path="/var/lib/kubelet/pods/be07fcc9-d6b5-4551-8846-94aa14b6af5d/volumes" Mar 20 08:49:39 crc kubenswrapper[4903]: I0320 08:49:39.546956 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cddb5fee-92f5-463f-a746-8a58e0a05e4b" path="/var/lib/kubelet/pods/cddb5fee-92f5-463f-a746-8a58e0a05e4b/volumes" Mar 20 08:49:39 crc kubenswrapper[4903]: I0320 08:49:39.547494 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf56b91b-fa18-4fe3-bdb5-9c6c3fb530bd" path="/var/lib/kubelet/pods/cf56b91b-fa18-4fe3-bdb5-9c6c3fb530bd/volumes" Mar 20 08:49:39 crc kubenswrapper[4903]: I0320 
08:49:39.547994 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e03f215f-5dde-4e62-82c4-9ab840f6223a" path="/var/lib/kubelet/pods/e03f215f-5dde-4e62-82c4-9ab840f6223a/volumes" Mar 20 08:49:39 crc kubenswrapper[4903]: I0320 08:49:39.548685 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e64f54bd-2813-41dd-86e6-9836da200d1c" path="/var/lib/kubelet/pods/e64f54bd-2813-41dd-86e6-9836da200d1c/volumes" Mar 20 08:49:39 crc kubenswrapper[4903]: I0320 08:49:39.549858 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-5rbfs"] Mar 20 08:49:39 crc kubenswrapper[4903]: E0320 08:49:39.557840 4903 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 08:49:39 crc kubenswrapper[4903]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb:current-podified,Command:[/bin/sh -c #!/bin/bash Mar 20 08:49:39 crc kubenswrapper[4903]: Mar 20 08:49:39 crc kubenswrapper[4903]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Mar 20 08:49:39 crc kubenswrapper[4903]: Mar 20 08:49:39 crc kubenswrapper[4903]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Mar 20 08:49:39 crc kubenswrapper[4903]: Mar 20 08:49:39 crc kubenswrapper[4903]: MYSQL_CMD="mysql -h -u root -P 3306" Mar 20 08:49:39 crc kubenswrapper[4903]: Mar 20 08:49:39 crc kubenswrapper[4903]: if [ -n "nova_api" ]; then Mar 20 08:49:39 crc kubenswrapper[4903]: GRANT_DATABASE="nova_api" Mar 20 08:49:39 crc kubenswrapper[4903]: else Mar 20 08:49:39 crc kubenswrapper[4903]: GRANT_DATABASE="*" Mar 20 08:49:39 crc kubenswrapper[4903]: fi Mar 20 08:49:39 crc kubenswrapper[4903]: Mar 20 08:49:39 crc kubenswrapper[4903]: # going for maximum compatibility here: Mar 20 08:49:39 crc kubenswrapper[4903]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Mar 20 08:49:39 crc kubenswrapper[4903]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Mar 20 08:49:39 crc kubenswrapper[4903]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Mar 20 08:49:39 crc kubenswrapper[4903]: # support updates Mar 20 08:49:39 crc kubenswrapper[4903]: Mar 20 08:49:39 crc kubenswrapper[4903]: $MYSQL_CMD < logger="UnhandledError" Mar 20 08:49:39 crc kubenswrapper[4903]: E0320 08:49:39.564832 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"nova-api-db-secret\\\" not found\"" pod="openstack/nova-api-25f9-account-create-update-9n49h" podUID="2dc6a13a-9844-4e28-93e6-45025f1385a9" Mar 20 08:49:39 crc kubenswrapper[4903]: I0320 08:49:39.610777 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-25k6h"] Mar 20 08:49:39 crc kubenswrapper[4903]: I0320 08:49:39.613842 4903 generic.go:334] "Generic (PLEG): container finished" podID="b286f9de-1973-4c7f-9350-4d3c31f9c1fb" containerID="1c1a4fb611dcbe8e6e5c2845b3c1260632a46a79e55826154d5b3126dc0b4a4e" exitCode=0 Mar 20 08:49:39 crc kubenswrapper[4903]: I0320 08:49:39.613943 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-9bgcj" event={"ID":"b286f9de-1973-4c7f-9350-4d3c31f9c1fb","Type":"ContainerDied","Data":"1c1a4fb611dcbe8e6e5c2845b3c1260632a46a79e55826154d5b3126dc0b4a4e"} Mar 20 08:49:39 crc kubenswrapper[4903]: I0320 08:49:39.641385 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-25k6h"] Mar 20 08:49:39 crc kubenswrapper[4903]: I0320 08:49:39.646876 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-ch9dc_f390d60f-9967-4869-b09f-3cea4570186e/openstack-network-exporter/0.log" Mar 20 08:49:39 crc kubenswrapper[4903]: I0320 08:49:39.646923 4903 generic.go:334] "Generic (PLEG): container finished" podID="f390d60f-9967-4869-b09f-3cea4570186e" containerID="16ea7c452104939a759a6fd64f9392015afdf41df82ecdb14f0a860eb6f425c4" exitCode=2 Mar 20 08:49:39 crc kubenswrapper[4903]: I0320 08:49:39.646982 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-ch9dc" event={"ID":"f390d60f-9967-4869-b09f-3cea4570186e","Type":"ContainerDied","Data":"16ea7c452104939a759a6fd64f9392015afdf41df82ecdb14f0a860eb6f425c4"} Mar 20 08:49:39 crc kubenswrapper[4903]: I0320 08:49:39.648548 4903 generic.go:334] "Generic (PLEG): container finished" podID="06e43a58-7c5f-49ac-a1f4-f2eddfd28c6b" containerID="7125bc754a8c0e626fcd2fe281119d9040b278b420956ad513958b106967fd43" exitCode=2 Mar 20 08:49:39 crc kubenswrapper[4903]: I0320 08:49:39.648591 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"06e43a58-7c5f-49ac-a1f4-f2eddfd28c6b","Type":"ContainerDied","Data":"7125bc754a8c0e626fcd2fe281119d9040b278b420956ad513958b106967fd43"} Mar 20 08:49:39 crc kubenswrapper[4903]: I0320 08:49:39.660298 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-669bcbb856-w87fq"] Mar 20 08:49:39 crc kubenswrapper[4903]: I0320 08:49:39.660818 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-669bcbb856-w87fq" podUID="1dcd96a1-71bb-480c-8387-0fca4d17bf33" containerName="placement-log" containerID="cri-o://54b0f1f5dfa2a405752216bff60fa798790887221bd217f21ee213ca02e2318b" gracePeriod=30 Mar 20 08:49:39 crc kubenswrapper[4903]: I0320 08:49:39.661461 4903 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/placement-669bcbb856-w87fq" podUID="1dcd96a1-71bb-480c-8387-0fca4d17bf33" containerName="placement-api" containerID="cri-o://10f2b77e4d99df665d6349b311dc9b4cfe076067c636391f3d7e6e34202c3750" gracePeriod=30 Mar 20 08:49:39 crc kubenswrapper[4903]: I0320 08:49:39.675226 4903 generic.go:334] "Generic (PLEG): container finished" podID="8639b665-721c-4dda-afe9-6e84f6f8a574" containerID="2c5aaed35251d97c58eb1ac9b7cc60388845eb3364a0b5ed8ae63369e67595d8" exitCode=2 Mar 20 08:49:39 crc kubenswrapper[4903]: I0320 08:49:39.675328 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"8639b665-721c-4dda-afe9-6e84f6f8a574","Type":"ContainerDied","Data":"2c5aaed35251d97c58eb1ac9b7cc60388845eb3364a0b5ed8ae63369e67595d8"} Mar 20 08:49:39 crc kubenswrapper[4903]: I0320 08:49:39.699108 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 08:49:39 crc kubenswrapper[4903]: I0320 08:49:39.699422 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="f4e1dbd8-6ecd-4cd7-910b-910cf6a45679" containerName="glance-log" containerID="cri-o://e9d3a5c1b4f80a807d17559887982f63b2f87da65ce30784f6d98d67b9f43363" gracePeriod=30 Mar 20 08:49:39 crc kubenswrapper[4903]: I0320 08:49:39.699885 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="f4e1dbd8-6ecd-4cd7-910b-910cf6a45679" containerName="glance-httpd" containerID="cri-o://8c4db124274a0f1bc8c0540b96509dad9dbb16ca409f3c709b225f29f322a39d" gracePeriod=30 Mar 20 08:49:39 crc kubenswrapper[4903]: I0320 08:49:39.739252 4903 generic.go:334] "Generic (PLEG): container finished" podID="877f943b-808c-435e-a5cf-bda8ea0a5d15" containerID="a2260ceafa26704cedbc21056c132c79bd5a3239f4a5a39d01b35eaa39143f86" exitCode=143 Mar 20 08:49:39 crc kubenswrapper[4903]: I0320 08:49:39.739305 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"877f943b-808c-435e-a5cf-bda8ea0a5d15","Type":"ContainerDied","Data":"a2260ceafa26704cedbc21056c132c79bd5a3239f4a5a39d01b35eaa39143f86"} Mar 20 08:49:39 crc kubenswrapper[4903]: I0320 08:49:39.764707 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-fbdb5-dn8w8"] Mar 20 08:49:39 crc kubenswrapper[4903]: I0320 08:49:39.765987 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-fbdb5-dn8w8" podUID="0790ef46-b8b6-4d5e-98a8-06319c232264" containerName="neutron-api" containerID="cri-o://492d0160f992341b5c4f630fecea542dfc91408228e35e56c2fb187fb62dd5de" gracePeriod=30 Mar 20 08:49:39 crc kubenswrapper[4903]: I0320 08:49:39.766697 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-fbdb5-dn8w8" podUID="0790ef46-b8b6-4d5e-98a8-06319c232264" containerName="neutron-httpd" containerID="cri-o://e720dfe8b4aae682033533f903e6ba534df6bbb629a2136de2d974617f1cbb66" gracePeriod=30 Mar 20 08:49:39 crc kubenswrapper[4903]: I0320 08:49:39.790439 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-mt4zt"] Mar 20 08:49:39 crc kubenswrapper[4903]: E0320 08:49:39.807939 4903 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Mar 20 08:49:39 crc kubenswrapper[4903]: E0320 08:49:39.809414 4903 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/configmap/888a3fd9-01f8-47b3-b1bb-f2b8b6b96509-config-data podName:888a3fd9-01f8-47b3-b1bb-f2b8b6b96509 nodeName:}" failed. No retries permitted until 2026-03-20 08:49:40.80939564 +0000 UTC m=+1606.026295955 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/888a3fd9-01f8-47b3-b1bb-f2b8b6b96509-config-data") pod "rabbitmq-server-0" (UID: "888a3fd9-01f8-47b3-b1bb-f2b8b6b96509") : configmap "rabbitmq-config-data" not found Mar 20 08:49:39 crc kubenswrapper[4903]: I0320 08:49:39.842150 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-mt4zt"] Mar 20 08:49:39 crc kubenswrapper[4903]: E0320 08:49:39.902629 4903 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 08:49:39 crc kubenswrapper[4903]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb:current-podified,Command:[/bin/sh -c #!/bin/bash Mar 20 08:49:39 crc kubenswrapper[4903]: Mar 20 08:49:39 crc kubenswrapper[4903]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Mar 20 08:49:39 crc kubenswrapper[4903]: Mar 20 08:49:39 crc kubenswrapper[4903]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Mar 20 08:49:39 crc kubenswrapper[4903]: Mar 20 08:49:39 crc kubenswrapper[4903]: MYSQL_CMD="mysql -h -u root -P 3306" Mar 20 08:49:39 crc kubenswrapper[4903]: Mar 20 08:49:39 crc kubenswrapper[4903]: if [ -n "" ]; then Mar 20 08:49:39 crc kubenswrapper[4903]: GRANT_DATABASE="" Mar 20 08:49:39 crc kubenswrapper[4903]: else Mar 20 08:49:39 crc kubenswrapper[4903]: GRANT_DATABASE="*" Mar 20 08:49:39 crc kubenswrapper[4903]: fi Mar 20 08:49:39 crc kubenswrapper[4903]: Mar 20 08:49:39 crc kubenswrapper[4903]: # going for maximum compatibility here: Mar 20 08:49:39 crc kubenswrapper[4903]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Mar 20 08:49:39 crc kubenswrapper[4903]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Mar 20 08:49:39 crc kubenswrapper[4903]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Mar 20 08:49:39 crc kubenswrapper[4903]: # support updates Mar 20 08:49:39 crc kubenswrapper[4903]: Mar 20 08:49:39 crc kubenswrapper[4903]: $MYSQL_CMD < logger="UnhandledError" Mar 20 08:49:39 crc kubenswrapper[4903]: E0320 08:49:39.906298 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"openstack-cell1-mariadb-root-db-secret\\\" not found\"" pod="openstack/root-account-create-update-87nxt" podUID="f8c76743-cd0d-48d8-940e-a5e750bd1fcc" Mar 20 08:49:39 crc kubenswrapper[4903]: I0320 08:49:39.907911 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-storage-0"] Mar 20 08:49:39 crc kubenswrapper[4903]: I0320 08:49:39.908588 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="ccedd84e-d0d0-40b8-812c-3a57b41aee98" containerName="account-server" containerID="cri-o://ab944c856301685ebd77b2e7fef75ffca3eac6f59b30dda9a10daef1b8c70e88" gracePeriod=30 Mar 20 08:49:39 crc kubenswrapper[4903]: I0320 08:49:39.909224 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="ccedd84e-d0d0-40b8-812c-3a57b41aee98" containerName="swift-recon-cron" containerID="cri-o://9b7b4b408c9bc763af97601ea17d444715b1766bc513d62fcbbcb137a6f2421b" gracePeriod=30 Mar 20 08:49:39 crc kubenswrapper[4903]: I0320 08:49:39.909295 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="ccedd84e-d0d0-40b8-812c-3a57b41aee98" containerName="rsync" containerID="cri-o://aa0b2e935a516f1aabd311732420c76fdecf78152f5d8c3cae1c496ba8803622" gracePeriod=30 Mar 20 08:49:39 crc kubenswrapper[4903]: I0320 08:49:39.909336 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="ccedd84e-d0d0-40b8-812c-3a57b41aee98" containerName="object-expirer" containerID="cri-o://970a73f848705640c8c23ca337e77c32e8681ce1576bc3f7acd53b211ca858c5" gracePeriod=30 Mar 20 08:49:39 crc kubenswrapper[4903]: I0320 08:49:39.909373 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="ccedd84e-d0d0-40b8-812c-3a57b41aee98" containerName="object-updater" containerID="cri-o://9b346edd52808b80170d3bc7c974747d08d86901f4b138ca360989217b0d0103" gracePeriod=30 Mar 20 08:49:39 crc kubenswrapper[4903]: I0320 08:49:39.909407 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="ccedd84e-d0d0-40b8-812c-3a57b41aee98" containerName="object-auditor" containerID="cri-o://ed2c689c76fb63fb9426546d2a29ec1ece23e59b99d50e58c33c34a410549fda" gracePeriod=30 Mar 20 08:49:39 crc kubenswrapper[4903]: I0320 08:49:39.910542 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="ccedd84e-d0d0-40b8-812c-3a57b41aee98" containerName="object-replicator" containerID="cri-o://68157cdfcfdbaa0b50499b9130ec297080cdd86c764418be16c6a47f7dafe3da" gracePeriod=30 Mar 20 08:49:39 crc kubenswrapper[4903]: I0320 08:49:39.910594 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="ccedd84e-d0d0-40b8-812c-3a57b41aee98" containerName="object-server" 
containerID="cri-o://c4da8cefa47622803dd1e647814f8e48127fdf53fd6d3ee4863912ef6363b186" gracePeriod=30 Mar 20 08:49:39 crc kubenswrapper[4903]: I0320 08:49:39.910641 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="ccedd84e-d0d0-40b8-812c-3a57b41aee98" containerName="container-updater" containerID="cri-o://1b28fb8da4267bf115856680b9313379defdf659be453428306ad98f35f6d6f8" gracePeriod=30 Mar 20 08:49:39 crc kubenswrapper[4903]: I0320 08:49:39.910691 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="ccedd84e-d0d0-40b8-812c-3a57b41aee98" containerName="container-auditor" containerID="cri-o://0a04473525eb06e22984df1a504ad70e91e54494ecc7880aec725f810f1f575c" gracePeriod=30 Mar 20 08:49:39 crc kubenswrapper[4903]: I0320 08:49:39.910741 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="ccedd84e-d0d0-40b8-812c-3a57b41aee98" containerName="container-replicator" containerID="cri-o://8fc4f339bf9aa2a0245e4c43fd890e2315110e976b25e56bad0be56f10d8abd4" gracePeriod=30 Mar 20 08:49:39 crc kubenswrapper[4903]: I0320 08:49:39.910778 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="ccedd84e-d0d0-40b8-812c-3a57b41aee98" containerName="container-server" containerID="cri-o://ac6a6d8fff3600147cf08f30230b0e513eb967f730ef636689918b9d11d7980e" gracePeriod=30 Mar 20 08:49:39 crc kubenswrapper[4903]: I0320 08:49:39.910989 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="ccedd84e-d0d0-40b8-812c-3a57b41aee98" containerName="account-reaper" containerID="cri-o://92af1107195bebc153bf9352c2eb36552b3f9f73d72391de8b4a6f21b80a4bb7" gracePeriod=30 Mar 20 08:49:39 crc kubenswrapper[4903]: I0320 08:49:39.911430 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="ccedd84e-d0d0-40b8-812c-3a57b41aee98" containerName="account-replicator" containerID="cri-o://2dff8db1a526f1e028fa24f1cf0a7a3b6053fb85fa1e7ea943870e339ab8ef45" gracePeriod=30 Mar 20 08:49:39 crc kubenswrapper[4903]: I0320 08:49:39.911855 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="ccedd84e-d0d0-40b8-812c-3a57b41aee98" containerName="account-auditor" containerID="cri-o://b55e63c14895bcdb40403401521699e38dfe6a783a3590019bd4fb0770fe17f2" gracePeriod=30 Mar 20 08:49:39 crc kubenswrapper[4903]: E0320 08:49:39.914687 4903 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Mar 20 08:49:39 crc kubenswrapper[4903]: E0320 08:49:39.914825 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/df937948-08c4-447c-9450-07221ce76552-config-data podName:df937948-08c4-447c-9450-07221ce76552 nodeName:}" failed. No retries permitted until 2026-03-20 08:49:41.914801304 +0000 UTC m=+1607.131701619 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/df937948-08c4-447c-9450-07221ce76552-config-data") pod "rabbitmq-cell1-server-0" (UID: "df937948-08c4-447c-9450-07221ce76552") : configmap "rabbitmq-cell1-config-data" not found Mar 20 08:49:39 crc kubenswrapper[4903]: I0320 08:49:39.921113 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-9mg76"] Mar 20 08:49:39 crc kubenswrapper[4903]: I0320 08:49:39.942460 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-9mg76"] Mar 20 08:49:39 crc kubenswrapper[4903]: I0320 08:49:39.955080 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-mdxf5"] Mar 20 08:49:39 crc kubenswrapper[4903]: I0320 08:49:39.964377 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 20 08:49:39 crc kubenswrapper[4903]: I0320 08:49:39.964754 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="3cdd4833-7200-46c0-9bb4-1b18c7828044" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://9e4b7aea89c42ee0018766efdc81e4728d5aebd15f02ee33fd42c1b099309077" gracePeriod=30 Mar 20 08:49:39 crc kubenswrapper[4903]: I0320 08:49:39.979416 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-mdxf5"] Mar 20 08:49:39 crc kubenswrapper[4903]: I0320 08:49:39.985502 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-25f9-account-create-update-9n49h"] Mar 20 08:49:39 crc kubenswrapper[4903]: I0320 08:49:39.994401 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-zxbh6"] Mar 20 08:49:40 crc kubenswrapper[4903]: I0320 08:49:40.005580 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-7f568974ff-6t26g"] Mar 20 08:49:40 crc kubenswrapper[4903]: I0320 08:49:40.005930 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-7f568974ff-6t26g" podUID="ff49346f-602e-46f6-91c7-9c1966535720" containerName="barbican-worker-log" containerID="cri-o://87c85c61161fb7331d2c712fe1b022235b32f8c23a7603b13ac75a55bd762f6b" gracePeriod=30 Mar 20 08:49:40 crc kubenswrapper[4903]: I0320 08:49:40.006050 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-7f568974ff-6t26g" podUID="ff49346f-602e-46f6-91c7-9c1966535720" containerName="barbican-worker" containerID="cri-o://386de03bb82a797be770d2b2659f1cc5692f78f475e3093729522f4425bfcab2" gracePeriod=30 Mar 20 08:49:40 crc kubenswrapper[4903]: I0320 08:49:40.024651 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-hwngf"] Mar 20 08:49:40 crc kubenswrapper[4903]: I0320 08:49:40.032158 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-zxbh6"] Mar 20 08:49:40 crc kubenswrapper[4903]: I0320 08:49:40.049462 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-5895dcfdfd-4gs9b"] Mar 20 08:49:40 crc kubenswrapper[4903]: I0320 08:49:40.049915 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-5895dcfdfd-4gs9b" podUID="bae081ad-c434-4d1b-ae9f-cc8c5b6e2f33" containerName="barbican-api-log" containerID="cri-o://93597afa34681cad8c7e33fa9d9d2b8edff2db1b4f63723973a392f7d94f6d4d" gracePeriod=30 Mar 20 08:49:40 
crc kubenswrapper[4903]: I0320 08:49:40.050275 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-5895dcfdfd-4gs9b" podUID="bae081ad-c434-4d1b-ae9f-cc8c5b6e2f33" containerName="barbican-api" containerID="cri-o://399861c0aed674ac17caf4a76e74022608a67747c7b5e6718fd6d0bf4376c5d8" gracePeriod=30 Mar 20 08:49:40 crc kubenswrapper[4903]: I0320 08:49:40.073065 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-8z25d"] Mar 20 08:49:40 crc kubenswrapper[4903]: I0320 08:49:40.087558 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-8z25d"] Mar 20 08:49:40 crc kubenswrapper[4903]: I0320 08:49:40.100276 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstack-cell1-galera-0" podUID="6e4027bc-3929-4b8b-9538-ab67f779558c" containerName="galera" containerID="cri-o://fafa3d8b5242ec379ddf7f9373ebb300af0123e85fab400fdab43665a984905a" gracePeriod=29 Mar 20 08:49:40 crc kubenswrapper[4903]: I0320 08:49:40.100456 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-hwngf"] Mar 20 08:49:40 crc kubenswrapper[4903]: I0320 08:49:40.108559 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-35b8-account-create-update-pdgzs"] Mar 20 08:49:40 crc kubenswrapper[4903]: I0320 08:49:40.130407 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 20 08:49:40 crc kubenswrapper[4903]: I0320 08:49:40.146277 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-qq45l"] Mar 20 08:49:40 crc kubenswrapper[4903]: I0320 08:49:40.153754 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-6d9bbc5dbb-w66cc"] Mar 20 08:49:40 crc kubenswrapper[4903]: I0320 08:49:40.154134 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-6d9bbc5dbb-w66cc" podUID="31664b72-a142-4656-88e8-84dd0cf18647" containerName="barbican-keystone-listener-log" containerID="cri-o://34d3b31d46d789c0904ffebfdeb9f9aa758a26a4d97333874c5afa3adf9f72b8" gracePeriod=30 Mar 20 08:49:40 crc kubenswrapper[4903]: I0320 08:49:40.154880 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-6d9bbc5dbb-w66cc" podUID="31664b72-a142-4656-88e8-84dd0cf18647" containerName="barbican-keystone-listener" containerID="cri-o://2ede8df05970b0155e78a7003a750f87a7d86f9769520e754016db5a2253cdb5" gracePeriod=30 Mar 20 08:49:40 crc kubenswrapper[4903]: I0320 08:49:40.162501 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-87nxt"] Mar 20 08:49:40 crc kubenswrapper[4903]: I0320 08:49:40.178766 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-35b8-account-create-update-pdgzs"] Mar 20 08:49:40 crc kubenswrapper[4903]: I0320 08:49:40.187019 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-proxy-69dc7db475-m968g"] Mar 20 08:49:40 crc kubenswrapper[4903]: I0320 08:49:40.187359 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-proxy-69dc7db475-m968g" podUID="5e960802-5c0e-4800-853f-e23466958aec" containerName="proxy-httpd" containerID="cri-o://55fee6c504c86ad76cd2158cdbd3b9a64409472514793f469bb47804fa917188" gracePeriod=30 Mar 20 08:49:40 crc kubenswrapper[4903]: I0320 
08:49:40.187466 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-proxy-69dc7db475-m968g" podUID="5e960802-5c0e-4800-853f-e23466958aec" containerName="proxy-server" containerID="cri-o://2ef5305e3831d627a2b55f9057395c70bf2be1d808f10700fa8c205c332a281a" gracePeriod=30 Mar 20 08:49:40 crc kubenswrapper[4903]: I0320 08:49:40.200134 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-qq45l"] Mar 20 08:49:40 crc kubenswrapper[4903]: I0320 08:49:40.210060 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 08:49:40 crc kubenswrapper[4903]: I0320 08:49:40.210265 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="dc127483-5a42-4eea-8b8c-8a1382dced05" containerName="nova-scheduler-scheduler" containerID="cri-o://086baa0efbede9405a9d07461836e115d7cb27a7069334c37e8698d90f636ed7" gracePeriod=30 Mar 20 08:49:40 crc kubenswrapper[4903]: I0320 08:49:40.219536 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-25f9-account-create-update-9n49h"] Mar 20 08:49:40 crc kubenswrapper[4903]: I0320 08:49:40.229749 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-87nxt"] Mar 20 08:49:40 crc kubenswrapper[4903]: I0320 08:49:40.240289 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 20 08:49:40 crc kubenswrapper[4903]: I0320 08:49:40.252292 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-sxx4b"] Mar 20 08:49:40 crc kubenswrapper[4903]: I0320 08:49:40.260797 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-sxx4b"] Mar 20 08:49:40 crc kubenswrapper[4903]: I0320 08:49:40.267444 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 20 08:49:40 crc kubenswrapper[4903]: I0320 08:49:40.267720 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell0-conductor-0" podUID="d570ab6f-6c5f-4255-b2ae-1966da262a0d" containerName="nova-cell0-conductor-conductor" containerID="cri-o://bdd5b2050c318bfa21071aa9a58547dc85552f9ed34b3d557a8244e9e4292bce" gracePeriod=30 Mar 20 08:49:40 crc kubenswrapper[4903]: I0320 08:49:40.297246 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="df937948-08c4-447c-9450-07221ce76552" containerName="rabbitmq" containerID="cri-o://80ffdb9ab414e2251c93f568f37534af22de836fa31c4fbc8bf8f3ec7bf93804" gracePeriod=604800 Mar 20 08:49:40 crc kubenswrapper[4903]: I0320 08:49:40.297407 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="888a3fd9-01f8-47b3-b1bb-f2b8b6b96509" containerName="rabbitmq" containerID="cri-o://bd2b54448e402fa4a59c36c7f69e6069fbf3f50b83543508e66b689857b68d04" gracePeriod=604800 Mar 20 08:49:40 crc kubenswrapper[4903]: I0320 08:49:40.310081 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 20 08:49:40 crc kubenswrapper[4903]: I0320 08:49:40.310297 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-conductor-0" podUID="5e072c5e-0f44-4d24-bccc-b14bf61fa192" containerName="nova-cell1-conductor-conductor" 
containerID="cri-o://9148a8a0458280fb77a8371f10cf7fabff0ae90ccd681be1905e0e67a3249152" gracePeriod=30 Mar 20 08:49:40 crc kubenswrapper[4903]: I0320 08:49:40.366327 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-4bdn7"] Mar 20 08:49:40 crc kubenswrapper[4903]: I0320 08:49:40.370561 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-89c5cd4d5-9bgcj" Mar 20 08:49:40 crc kubenswrapper[4903]: I0320 08:49:40.427522 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-4bdn7"] Mar 20 08:49:40 crc kubenswrapper[4903]: I0320 08:49:40.452146 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b286f9de-1973-4c7f-9350-4d3c31f9c1fb-config\") pod \"b286f9de-1973-4c7f-9350-4d3c31f9c1fb\" (UID: \"b286f9de-1973-4c7f-9350-4d3c31f9c1fb\") " Mar 20 08:49:40 crc kubenswrapper[4903]: I0320 08:49:40.452221 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b286f9de-1973-4c7f-9350-4d3c31f9c1fb-ovsdbserver-sb\") pod \"b286f9de-1973-4c7f-9350-4d3c31f9c1fb\" (UID: \"b286f9de-1973-4c7f-9350-4d3c31f9c1fb\") " Mar 20 08:49:40 crc kubenswrapper[4903]: I0320 08:49:40.452418 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b286f9de-1973-4c7f-9350-4d3c31f9c1fb-ovsdbserver-nb\") pod \"b286f9de-1973-4c7f-9350-4d3c31f9c1fb\" (UID: \"b286f9de-1973-4c7f-9350-4d3c31f9c1fb\") " Mar 20 08:49:40 crc kubenswrapper[4903]: I0320 08:49:40.452569 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b286f9de-1973-4c7f-9350-4d3c31f9c1fb-dns-svc\") pod \"b286f9de-1973-4c7f-9350-4d3c31f9c1fb\" (UID: \"b286f9de-1973-4c7f-9350-4d3c31f9c1fb\") " Mar 20 08:49:40 crc kubenswrapper[4903]: I0320 08:49:40.452644 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nj5pj\" (UniqueName: \"kubernetes.io/projected/b286f9de-1973-4c7f-9350-4d3c31f9c1fb-kube-api-access-nj5pj\") pod \"b286f9de-1973-4c7f-9350-4d3c31f9c1fb\" (UID: \"b286f9de-1973-4c7f-9350-4d3c31f9c1fb\") " Mar 20 08:49:40 crc kubenswrapper[4903]: I0320 08:49:40.452757 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b286f9de-1973-4c7f-9350-4d3c31f9c1fb-dns-swift-storage-0\") pod \"b286f9de-1973-4c7f-9350-4d3c31f9c1fb\" (UID: \"b286f9de-1973-4c7f-9350-4d3c31f9c1fb\") " Mar 20 08:49:40 crc kubenswrapper[4903]: I0320 08:49:40.476974 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_548096cf-0b33-4f2f-b8be-7d1ac859cf7c/ovsdbserver-nb/0.log" Mar 20 08:49:40 crc kubenswrapper[4903]: I0320 08:49:40.477093 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 20 08:49:40 crc kubenswrapper[4903]: I0320 08:49:40.486650 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b286f9de-1973-4c7f-9350-4d3c31f9c1fb-kube-api-access-nj5pj" (OuterVolumeSpecName: "kube-api-access-nj5pj") pod "b286f9de-1973-4c7f-9350-4d3c31f9c1fb" (UID: "b286f9de-1973-4c7f-9350-4d3c31f9c1fb"). InnerVolumeSpecName "kube-api-access-nj5pj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:49:40 crc kubenswrapper[4903]: I0320 08:49:40.492316 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-ch9dc_f390d60f-9967-4869-b09f-3cea4570186e/openstack-network-exporter/0.log" Mar 20 08:49:40 crc kubenswrapper[4903]: I0320 08:49:40.492424 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-ch9dc" Mar 20 08:49:40 crc kubenswrapper[4903]: I0320 08:49:40.546504 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-ovs-chrhv" podUID="d69915e4-0df8-4d83-b096-962eadc1883f" containerName="ovs-vswitchd" containerID="cri-o://cbe32b8386815ecd924f4abbb568339471c7efea68305c548d14baa5ac7f2324" gracePeriod=29 Mar 20 08:49:40 crc kubenswrapper[4903]: I0320 08:49:40.552804 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Mar 20 08:49:40 crc kubenswrapper[4903]: I0320 08:49:40.574650 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b286f9de-1973-4c7f-9350-4d3c31f9c1fb-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "b286f9de-1973-4c7f-9350-4d3c31f9c1fb" (UID: "b286f9de-1973-4c7f-9350-4d3c31f9c1fb"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:49:40 crc kubenswrapper[4903]: I0320 08:49:40.575664 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/548096cf-0b33-4f2f-b8be-7d1ac859cf7c-config\") pod \"548096cf-0b33-4f2f-b8be-7d1ac859cf7c\" (UID: \"548096cf-0b33-4f2f-b8be-7d1ac859cf7c\") " Mar 20 08:49:40 crc kubenswrapper[4903]: I0320 08:49:40.575693 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/f390d60f-9967-4869-b09f-3cea4570186e-metrics-certs-tls-certs\") pod \"f390d60f-9967-4869-b09f-3cea4570186e\" (UID: \"f390d60f-9967-4869-b09f-3cea4570186e\") " Mar 20 08:49:40 crc kubenswrapper[4903]: I0320 08:49:40.575725 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f390d60f-9967-4869-b09f-3cea4570186e-combined-ca-bundle\") pod \"f390d60f-9967-4869-b09f-3cea4570186e\" (UID: \"f390d60f-9967-4869-b09f-3cea4570186e\") " Mar 20 08:49:40 crc kubenswrapper[4903]: I0320 08:49:40.575775 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-nb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"548096cf-0b33-4f2f-b8be-7d1ac859cf7c\" (UID: \"548096cf-0b33-4f2f-b8be-7d1ac859cf7c\") " Mar 20 08:49:40 crc kubenswrapper[4903]: I0320 08:49:40.575835 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f390d60f-9967-4869-b09f-3cea4570186e-config\") pod \"f390d60f-9967-4869-b09f-3cea4570186e\" (UID: \"f390d60f-9967-4869-b09f-3cea4570186e\") " Mar 20 08:49:40 crc kubenswrapper[4903]: I0320 08:49:40.575929 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dvdx8\" (UniqueName: \"kubernetes.io/projected/f390d60f-9967-4869-b09f-3cea4570186e-kube-api-access-dvdx8\") pod \"f390d60f-9967-4869-b09f-3cea4570186e\" (UID: 
\"f390d60f-9967-4869-b09f-3cea4570186e\") " Mar 20 08:49:40 crc kubenswrapper[4903]: I0320 08:49:40.576010 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xddh9\" (UniqueName: \"kubernetes.io/projected/548096cf-0b33-4f2f-b8be-7d1ac859cf7c-kube-api-access-xddh9\") pod \"548096cf-0b33-4f2f-b8be-7d1ac859cf7c\" (UID: \"548096cf-0b33-4f2f-b8be-7d1ac859cf7c\") " Mar 20 08:49:40 crc kubenswrapper[4903]: I0320 08:49:40.577070 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f390d60f-9967-4869-b09f-3cea4570186e-config" (OuterVolumeSpecName: "config") pod "f390d60f-9967-4869-b09f-3cea4570186e" (UID: "f390d60f-9967-4869-b09f-3cea4570186e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:49:40 crc kubenswrapper[4903]: I0320 08:49:40.577950 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/548096cf-0b33-4f2f-b8be-7d1ac859cf7c-config" (OuterVolumeSpecName: "config") pod "548096cf-0b33-4f2f-b8be-7d1ac859cf7c" (UID: "548096cf-0b33-4f2f-b8be-7d1ac859cf7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:49:40 crc kubenswrapper[4903]: I0320 08:49:40.578447 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/548096cf-0b33-4f2f-b8be-7d1ac859cf7c-combined-ca-bundle\") pod \"548096cf-0b33-4f2f-b8be-7d1ac859cf7c\" (UID: \"548096cf-0b33-4f2f-b8be-7d1ac859cf7c\") " Mar 20 08:49:40 crc kubenswrapper[4903]: I0320 08:49:40.578499 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/548096cf-0b33-4f2f-b8be-7d1ac859cf7c-scripts\") pod \"548096cf-0b33-4f2f-b8be-7d1ac859cf7c\" (UID: \"548096cf-0b33-4f2f-b8be-7d1ac859cf7c\") " Mar 20 08:49:40 crc kubenswrapper[4903]: I0320 08:49:40.578537 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/548096cf-0b33-4f2f-b8be-7d1ac859cf7c-ovsdbserver-nb-tls-certs\") pod \"548096cf-0b33-4f2f-b8be-7d1ac859cf7c\" (UID: \"548096cf-0b33-4f2f-b8be-7d1ac859cf7c\") " Mar 20 08:49:40 crc kubenswrapper[4903]: I0320 08:49:40.578610 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/548096cf-0b33-4f2f-b8be-7d1ac859cf7c-metrics-certs-tls-certs\") pod \"548096cf-0b33-4f2f-b8be-7d1ac859cf7c\" (UID: \"548096cf-0b33-4f2f-b8be-7d1ac859cf7c\") " Mar 20 08:49:40 crc kubenswrapper[4903]: I0320 08:49:40.578632 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/f390d60f-9967-4869-b09f-3cea4570186e-ovs-rundir\") pod \"f390d60f-9967-4869-b09f-3cea4570186e\" (UID: \"f390d60f-9967-4869-b09f-3cea4570186e\") " Mar 20 08:49:40 crc kubenswrapper[4903]: I0320 08:49:40.578683 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/548096cf-0b33-4f2f-b8be-7d1ac859cf7c-ovsdb-rundir\") pod \"548096cf-0b33-4f2f-b8be-7d1ac859cf7c\" (UID: \"548096cf-0b33-4f2f-b8be-7d1ac859cf7c\") " Mar 20 08:49:40 crc kubenswrapper[4903]: I0320 08:49:40.578695 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-rundir\" 
(UniqueName: \"kubernetes.io/host-path/f390d60f-9967-4869-b09f-3cea4570186e-ovn-rundir\") pod \"f390d60f-9967-4869-b09f-3cea4570186e\" (UID: \"f390d60f-9967-4869-b09f-3cea4570186e\") " Mar 20 08:49:40 crc kubenswrapper[4903]: I0320 08:49:40.580388 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f390d60f-9967-4869-b09f-3cea4570186e-ovs-rundir" (OuterVolumeSpecName: "ovs-rundir") pod "f390d60f-9967-4869-b09f-3cea4570186e" (UID: "f390d60f-9967-4869-b09f-3cea4570186e"). InnerVolumeSpecName "ovs-rundir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 08:49:40 crc kubenswrapper[4903]: I0320 08:49:40.583481 4903 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/548096cf-0b33-4f2f-b8be-7d1ac859cf7c-config\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:40 crc kubenswrapper[4903]: I0320 08:49:40.583534 4903 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f390d60f-9967-4869-b09f-3cea4570186e-config\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:40 crc kubenswrapper[4903]: I0320 08:49:40.583546 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nj5pj\" (UniqueName: \"kubernetes.io/projected/b286f9de-1973-4c7f-9350-4d3c31f9c1fb-kube-api-access-nj5pj\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:40 crc kubenswrapper[4903]: I0320 08:49:40.583559 4903 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b286f9de-1973-4c7f-9350-4d3c31f9c1fb-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:40 crc kubenswrapper[4903]: I0320 08:49:40.584331 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/548096cf-0b33-4f2f-b8be-7d1ac859cf7c-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "548096cf-0b33-4f2f-b8be-7d1ac859cf7c" (UID: "548096cf-0b33-4f2f-b8be-7d1ac859cf7c"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 08:49:40 crc kubenswrapper[4903]: I0320 08:49:40.586072 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f390d60f-9967-4869-b09f-3cea4570186e-ovn-rundir" (OuterVolumeSpecName: "ovn-rundir") pod "f390d60f-9967-4869-b09f-3cea4570186e" (UID: "f390d60f-9967-4869-b09f-3cea4570186e"). InnerVolumeSpecName "ovn-rundir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 08:49:40 crc kubenswrapper[4903]: I0320 08:49:40.589685 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/548096cf-0b33-4f2f-b8be-7d1ac859cf7c-kube-api-access-xddh9" (OuterVolumeSpecName: "kube-api-access-xddh9") pod "548096cf-0b33-4f2f-b8be-7d1ac859cf7c" (UID: "548096cf-0b33-4f2f-b8be-7d1ac859cf7c"). InnerVolumeSpecName "kube-api-access-xddh9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:49:40 crc kubenswrapper[4903]: I0320 08:49:40.593505 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f390d60f-9967-4869-b09f-3cea4570186e-kube-api-access-dvdx8" (OuterVolumeSpecName: "kube-api-access-dvdx8") pod "f390d60f-9967-4869-b09f-3cea4570186e" (UID: "f390d60f-9967-4869-b09f-3cea4570186e"). InnerVolumeSpecName "kube-api-access-dvdx8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:49:40 crc kubenswrapper[4903]: I0320 08:49:40.594102 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "ovndbcluster-nb-etc-ovn") pod "548096cf-0b33-4f2f-b8be-7d1ac859cf7c" (UID: "548096cf-0b33-4f2f-b8be-7d1ac859cf7c"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 20 08:49:40 crc kubenswrapper[4903]: I0320 08:49:40.595638 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/548096cf-0b33-4f2f-b8be-7d1ac859cf7c-scripts" (OuterVolumeSpecName: "scripts") pod "548096cf-0b33-4f2f-b8be-7d1ac859cf7c" (UID: "548096cf-0b33-4f2f-b8be-7d1ac859cf7c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:49:40 crc kubenswrapper[4903]: I0320 08:49:40.629149 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_8639b665-721c-4dda-afe9-6e84f6f8a574/ovsdbserver-sb/0.log" Mar 20 08:49:40 crc kubenswrapper[4903]: I0320 08:49:40.629425 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 20 08:49:40 crc kubenswrapper[4903]: I0320 08:49:40.643400 4903 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/swift-proxy-69dc7db475-m968g" podUID="5e960802-5c0e-4800-853f-e23466958aec" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.0.175:8080/healthcheck\": dial tcp 10.217.0.175:8080: connect: connection refused" Mar 20 08:49:40 crc kubenswrapper[4903]: I0320 08:49:40.646449 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b286f9de-1973-4c7f-9350-4d3c31f9c1fb-config" (OuterVolumeSpecName: "config") pod "b286f9de-1973-4c7f-9350-4d3c31f9c1fb" (UID: "b286f9de-1973-4c7f-9350-4d3c31f9c1fb"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:49:40 crc kubenswrapper[4903]: I0320 08:49:40.647469 4903 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/swift-proxy-69dc7db475-m968g" podUID="5e960802-5c0e-4800-853f-e23466958aec" containerName="proxy-server" probeResult="failure" output="Get \"https://10.217.0.175:8080/healthcheck\": dial tcp 10.217.0.175:8080: connect: connection refused" Mar 20 08:49:40 crc kubenswrapper[4903]: I0320 08:49:40.679289 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b286f9de-1973-4c7f-9350-4d3c31f9c1fb-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b286f9de-1973-4c7f-9350-4d3c31f9c1fb" (UID: "b286f9de-1973-4c7f-9350-4d3c31f9c1fb"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:49:40 crc kubenswrapper[4903]: I0320 08:49:40.685548 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b286f9de-1973-4c7f-9350-4d3c31f9c1fb-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "b286f9de-1973-4c7f-9350-4d3c31f9c1fb" (UID: "b286f9de-1973-4c7f-9350-4d3c31f9c1fb"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:49:40 crc kubenswrapper[4903]: I0320 08:49:40.694249 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/4819c8dc-535a-4fb2-93ed-16eccdf8cd6c-openstack-config\") pod \"4819c8dc-535a-4fb2-93ed-16eccdf8cd6c\" (UID: \"4819c8dc-535a-4fb2-93ed-16eccdf8cd6c\") " Mar 20 08:49:40 crc kubenswrapper[4903]: I0320 08:49:40.694564 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xc7vw\" (UniqueName: \"kubernetes.io/projected/4819c8dc-535a-4fb2-93ed-16eccdf8cd6c-kube-api-access-xc7vw\") pod \"4819c8dc-535a-4fb2-93ed-16eccdf8cd6c\" (UID: \"4819c8dc-535a-4fb2-93ed-16eccdf8cd6c\") " Mar 20 08:49:40 crc kubenswrapper[4903]: I0320 08:49:40.694968 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4819c8dc-535a-4fb2-93ed-16eccdf8cd6c-combined-ca-bundle\") pod \"4819c8dc-535a-4fb2-93ed-16eccdf8cd6c\" (UID: \"4819c8dc-535a-4fb2-93ed-16eccdf8cd6c\") " Mar 20 08:49:40 crc kubenswrapper[4903]: I0320 08:49:40.695219 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/4819c8dc-535a-4fb2-93ed-16eccdf8cd6c-openstack-config-secret\") pod \"4819c8dc-535a-4fb2-93ed-16eccdf8cd6c\" (UID: \"4819c8dc-535a-4fb2-93ed-16eccdf8cd6c\") " Mar 20 08:49:40 crc kubenswrapper[4903]: I0320 08:49:40.696359 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dvdx8\" (UniqueName: \"kubernetes.io/projected/f390d60f-9967-4869-b09f-3cea4570186e-kube-api-access-dvdx8\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:40 crc kubenswrapper[4903]: I0320 08:49:40.696533 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xddh9\" (UniqueName: \"kubernetes.io/projected/548096cf-0b33-4f2f-b8be-7d1ac859cf7c-kube-api-access-xddh9\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:40 crc kubenswrapper[4903]: I0320 08:49:40.696710 4903 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/548096cf-0b33-4f2f-b8be-7d1ac859cf7c-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:40 crc kubenswrapper[4903]: I0320 08:49:40.696769 4903 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b286f9de-1973-4c7f-9350-4d3c31f9c1fb-config\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:40 crc kubenswrapper[4903]: I0320 08:49:40.696844 4903 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b286f9de-1973-4c7f-9350-4d3c31f9c1fb-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:40 crc kubenswrapper[4903]: I0320 08:49:40.696903 4903 reconciler_common.go:293] "Volume detached for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/f390d60f-9967-4869-b09f-3cea4570186e-ovs-rundir\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:40 crc kubenswrapper[4903]: I0320 08:49:40.696954 4903 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/548096cf-0b33-4f2f-b8be-7d1ac859cf7c-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:40 crc kubenswrapper[4903]: I0320 08:49:40.697004 4903 reconciler_common.go:293] "Volume detached for volume \"ovn-rundir\" (UniqueName: 
\"kubernetes.io/host-path/f390d60f-9967-4869-b09f-3cea4570186e-ovn-rundir\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:40 crc kubenswrapper[4903]: I0320 08:49:40.697094 4903 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Mar 20 08:49:40 crc kubenswrapper[4903]: I0320 08:49:40.697152 4903 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b286f9de-1973-4c7f-9350-4d3c31f9c1fb-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:40 crc kubenswrapper[4903]: E0320 08:49:40.722524 4903 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="bdd5b2050c318bfa21071aa9a58547dc85552f9ed34b3d557a8244e9e4292bce" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 20 08:49:40 crc kubenswrapper[4903]: E0320 08:49:40.726920 4903 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="bdd5b2050c318bfa21071aa9a58547dc85552f9ed34b3d557a8244e9e4292bce" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 20 08:49:40 crc kubenswrapper[4903]: E0320 08:49:40.731257 4903 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="bdd5b2050c318bfa21071aa9a58547dc85552f9ed34b3d557a8244e9e4292bce" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 20 08:49:40 crc kubenswrapper[4903]: E0320 08:49:40.731329 4903 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="d570ab6f-6c5f-4255-b2ae-1966da262a0d" containerName="nova-cell0-conductor-conductor" Mar 20 08:49:40 crc kubenswrapper[4903]: I0320 08:49:40.734702 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4819c8dc-535a-4fb2-93ed-16eccdf8cd6c-kube-api-access-xc7vw" (OuterVolumeSpecName: "kube-api-access-xc7vw") pod "4819c8dc-535a-4fb2-93ed-16eccdf8cd6c" (UID: "4819c8dc-535a-4fb2-93ed-16eccdf8cd6c"). InnerVolumeSpecName "kube-api-access-xc7vw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:49:40 crc kubenswrapper[4903]: I0320 08:49:40.750812 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f390d60f-9967-4869-b09f-3cea4570186e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f390d60f-9967-4869-b09f-3cea4570186e" (UID: "f390d60f-9967-4869-b09f-3cea4570186e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:49:40 crc kubenswrapper[4903]: I0320 08:49:40.753022 4903 generic.go:334] "Generic (PLEG): container finished" podID="31664b72-a142-4656-88e8-84dd0cf18647" containerID="34d3b31d46d789c0904ffebfdeb9f9aa758a26a4d97333874c5afa3adf9f72b8" exitCode=143 Mar 20 08:49:40 crc kubenswrapper[4903]: I0320 08:49:40.753176 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6d9bbc5dbb-w66cc" event={"ID":"31664b72-a142-4656-88e8-84dd0cf18647","Type":"ContainerDied","Data":"34d3b31d46d789c0904ffebfdeb9f9aa758a26a4d97333874c5afa3adf9f72b8"} Mar 20 08:49:40 crc kubenswrapper[4903]: I0320 08:49:40.757457 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_8639b665-721c-4dda-afe9-6e84f6f8a574/ovsdbserver-sb/0.log" Mar 20 08:49:40 crc kubenswrapper[4903]: I0320 08:49:40.757543 4903 generic.go:334] "Generic (PLEG): container finished" podID="8639b665-721c-4dda-afe9-6e84f6f8a574" containerID="81809a955a2f71afa7a77cb91e2146e2f4c354fdfdadbb9e1e6afb608b8a1b6a" exitCode=143 Mar 20 08:49:40 crc kubenswrapper[4903]: I0320 08:49:40.757618 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"8639b665-721c-4dda-afe9-6e84f6f8a574","Type":"ContainerDied","Data":"81809a955a2f71afa7a77cb91e2146e2f4c354fdfdadbb9e1e6afb608b8a1b6a"} Mar 20 08:49:40 crc kubenswrapper[4903]: I0320 08:49:40.757656 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"8639b665-721c-4dda-afe9-6e84f6f8a574","Type":"ContainerDied","Data":"505d0191451c5ecb3a6db6fcddf61679951a2efd3fa88075c72fddeb5d91d8dd"} Mar 20 08:49:40 crc kubenswrapper[4903]: I0320 08:49:40.757676 4903 scope.go:117] "RemoveContainer" containerID="2c5aaed35251d97c58eb1ac9b7cc60388845eb3364a0b5ed8ae63369e67595d8" Mar 20 08:49:40 crc kubenswrapper[4903]: I0320 08:49:40.757858 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 20 08:49:40 crc kubenswrapper[4903]: I0320 08:49:40.762207 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_548096cf-0b33-4f2f-b8be-7d1ac859cf7c/ovsdbserver-nb/0.log" Mar 20 08:49:40 crc kubenswrapper[4903]: I0320 08:49:40.762355 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 20 08:49:40 crc kubenswrapper[4903]: I0320 08:49:40.763431 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"548096cf-0b33-4f2f-b8be-7d1ac859cf7c","Type":"ContainerDied","Data":"d40cb7084592ee171758abb523548af0c26165735b34deee1818646086e6b1c6"} Mar 20 08:49:40 crc kubenswrapper[4903]: I0320 08:49:40.768829 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-9bgcj" event={"ID":"b286f9de-1973-4c7f-9350-4d3c31f9c1fb","Type":"ContainerDied","Data":"61a78e5d734a556376d557849909a1b7509d6e1bd7fec610c76f6f64d0091c34"} Mar 20 08:49:40 crc kubenswrapper[4903]: I0320 08:49:40.768851 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-89c5cd4d5-9bgcj" Mar 20 08:49:40 crc kubenswrapper[4903]: I0320 08:49:40.771236 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-ch9dc_f390d60f-9967-4869-b09f-3cea4570186e/openstack-network-exporter/0.log" Mar 20 08:49:40 crc kubenswrapper[4903]: I0320 08:49:40.771347 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-ch9dc" Mar 20 08:49:40 crc kubenswrapper[4903]: I0320 08:49:40.771556 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-ch9dc" event={"ID":"f390d60f-9967-4869-b09f-3cea4570186e","Type":"ContainerDied","Data":"5fbdbd73d47771816116bdfa56d33c2a46a6b2e86c66518ae3fbc9ff5e5493d2"} Mar 20 08:49:40 crc kubenswrapper[4903]: I0320 08:49:40.774160 4903 generic.go:334] "Generic (PLEG): container finished" podID="1dcd96a1-71bb-480c-8387-0fca4d17bf33" containerID="54b0f1f5dfa2a405752216bff60fa798790887221bd217f21ee213ca02e2318b" exitCode=143 Mar 20 08:49:40 crc kubenswrapper[4903]: I0320 08:49:40.774220 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-669bcbb856-w87fq" event={"ID":"1dcd96a1-71bb-480c-8387-0fca4d17bf33","Type":"ContainerDied","Data":"54b0f1f5dfa2a405752216bff60fa798790887221bd217f21ee213ca02e2318b"} Mar 20 08:49:40 crc kubenswrapper[4903]: I0320 08:49:40.778286 4903 generic.go:334] "Generic (PLEG): container finished" podID="50b5adcb-aed8-4cff-b3ec-02721df3937d" containerID="bbc28588129fed5e832d9cf2c208bd4c746332410777ee79ad509494e640c235" exitCode=143 Mar 20 08:49:40 crc kubenswrapper[4903]: I0320 08:49:40.778352 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"50b5adcb-aed8-4cff-b3ec-02721df3937d","Type":"ContainerDied","Data":"bbc28588129fed5e832d9cf2c208bd4c746332410777ee79ad509494e640c235"} Mar 20 08:49:40 crc kubenswrapper[4903]: I0320 08:49:40.791914 4903 generic.go:334] "Generic (PLEG): container finished" podID="ccedd84e-d0d0-40b8-812c-3a57b41aee98" containerID="aa0b2e935a516f1aabd311732420c76fdecf78152f5d8c3cae1c496ba8803622" exitCode=0 Mar 20 08:49:40 crc kubenswrapper[4903]: I0320 08:49:40.791949 4903 generic.go:334] "Generic (PLEG): container finished" podID="ccedd84e-d0d0-40b8-812c-3a57b41aee98" containerID="970a73f848705640c8c23ca337e77c32e8681ce1576bc3f7acd53b211ca858c5" exitCode=0 Mar 20 08:49:40 crc kubenswrapper[4903]: I0320 08:49:40.791958 4903 generic.go:334] "Generic (PLEG): container finished" podID="ccedd84e-d0d0-40b8-812c-3a57b41aee98" containerID="9b346edd52808b80170d3bc7c974747d08d86901f4b138ca360989217b0d0103" exitCode=0 Mar 20 08:49:40 crc kubenswrapper[4903]: I0320 08:49:40.791984 4903 generic.go:334] "Generic (PLEG): container finished" podID="ccedd84e-d0d0-40b8-812c-3a57b41aee98" containerID="ed2c689c76fb63fb9426546d2a29ec1ece23e59b99d50e58c33c34a410549fda" exitCode=0 Mar 20 08:49:40 crc kubenswrapper[4903]: I0320 08:49:40.791993 4903 generic.go:334] "Generic (PLEG): container finished" podID="ccedd84e-d0d0-40b8-812c-3a57b41aee98" containerID="68157cdfcfdbaa0b50499b9130ec297080cdd86c764418be16c6a47f7dafe3da" exitCode=0 Mar 20 08:49:40 crc kubenswrapper[4903]: I0320 08:49:40.792000 4903 generic.go:334] "Generic (PLEG): container finished" podID="ccedd84e-d0d0-40b8-812c-3a57b41aee98" containerID="c4da8cefa47622803dd1e647814f8e48127fdf53fd6d3ee4863912ef6363b186" exitCode=0 Mar 20 08:49:40 crc kubenswrapper[4903]: I0320 08:49:40.792007 4903 generic.go:334] 
"Generic (PLEG): container finished" podID="ccedd84e-d0d0-40b8-812c-3a57b41aee98" containerID="1b28fb8da4267bf115856680b9313379defdf659be453428306ad98f35f6d6f8" exitCode=0 Mar 20 08:49:40 crc kubenswrapper[4903]: I0320 08:49:40.792016 4903 generic.go:334] "Generic (PLEG): container finished" podID="ccedd84e-d0d0-40b8-812c-3a57b41aee98" containerID="0a04473525eb06e22984df1a504ad70e91e54494ecc7880aec725f810f1f575c" exitCode=0 Mar 20 08:49:40 crc kubenswrapper[4903]: I0320 08:49:40.792024 4903 generic.go:334] "Generic (PLEG): container finished" podID="ccedd84e-d0d0-40b8-812c-3a57b41aee98" containerID="8fc4f339bf9aa2a0245e4c43fd890e2315110e976b25e56bad0be56f10d8abd4" exitCode=0 Mar 20 08:49:40 crc kubenswrapper[4903]: I0320 08:49:40.792065 4903 generic.go:334] "Generic (PLEG): container finished" podID="ccedd84e-d0d0-40b8-812c-3a57b41aee98" containerID="ac6a6d8fff3600147cf08f30230b0e513eb967f730ef636689918b9d11d7980e" exitCode=0 Mar 20 08:49:40 crc kubenswrapper[4903]: I0320 08:49:40.792079 4903 generic.go:334] "Generic (PLEG): container finished" podID="ccedd84e-d0d0-40b8-812c-3a57b41aee98" containerID="92af1107195bebc153bf9352c2eb36552b3f9f73d72391de8b4a6f21b80a4bb7" exitCode=0 Mar 20 08:49:40 crc kubenswrapper[4903]: I0320 08:49:40.792089 4903 generic.go:334] "Generic (PLEG): container finished" podID="ccedd84e-d0d0-40b8-812c-3a57b41aee98" containerID="b55e63c14895bcdb40403401521699e38dfe6a783a3590019bd4fb0770fe17f2" exitCode=0 Mar 20 08:49:40 crc kubenswrapper[4903]: I0320 08:49:40.792102 4903 generic.go:334] "Generic (PLEG): container finished" podID="ccedd84e-d0d0-40b8-812c-3a57b41aee98" containerID="2dff8db1a526f1e028fa24f1cf0a7a3b6053fb85fa1e7ea943870e339ab8ef45" exitCode=0 Mar 20 08:49:40 crc kubenswrapper[4903]: I0320 08:49:40.792186 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ccedd84e-d0d0-40b8-812c-3a57b41aee98","Type":"ContainerDied","Data":"aa0b2e935a516f1aabd311732420c76fdecf78152f5d8c3cae1c496ba8803622"} Mar 20 08:49:40 crc kubenswrapper[4903]: I0320 08:49:40.792263 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ccedd84e-d0d0-40b8-812c-3a57b41aee98","Type":"ContainerDied","Data":"970a73f848705640c8c23ca337e77c32e8681ce1576bc3f7acd53b211ca858c5"} Mar 20 08:49:40 crc kubenswrapper[4903]: I0320 08:49:40.792276 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ccedd84e-d0d0-40b8-812c-3a57b41aee98","Type":"ContainerDied","Data":"9b346edd52808b80170d3bc7c974747d08d86901f4b138ca360989217b0d0103"} Mar 20 08:49:40 crc kubenswrapper[4903]: I0320 08:49:40.792306 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ccedd84e-d0d0-40b8-812c-3a57b41aee98","Type":"ContainerDied","Data":"ed2c689c76fb63fb9426546d2a29ec1ece23e59b99d50e58c33c34a410549fda"} Mar 20 08:49:40 crc kubenswrapper[4903]: I0320 08:49:40.792316 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ccedd84e-d0d0-40b8-812c-3a57b41aee98","Type":"ContainerDied","Data":"68157cdfcfdbaa0b50499b9130ec297080cdd86c764418be16c6a47f7dafe3da"} Mar 20 08:49:40 crc kubenswrapper[4903]: I0320 08:49:40.792326 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ccedd84e-d0d0-40b8-812c-3a57b41aee98","Type":"ContainerDied","Data":"c4da8cefa47622803dd1e647814f8e48127fdf53fd6d3ee4863912ef6363b186"} Mar 20 08:49:40 crc kubenswrapper[4903]: I0320 
08:49:40.792335 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ccedd84e-d0d0-40b8-812c-3a57b41aee98","Type":"ContainerDied","Data":"1b28fb8da4267bf115856680b9313379defdf659be453428306ad98f35f6d6f8"} Mar 20 08:49:40 crc kubenswrapper[4903]: I0320 08:49:40.792345 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ccedd84e-d0d0-40b8-812c-3a57b41aee98","Type":"ContainerDied","Data":"0a04473525eb06e22984df1a504ad70e91e54494ecc7880aec725f810f1f575c"} Mar 20 08:49:40 crc kubenswrapper[4903]: I0320 08:49:40.792417 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ccedd84e-d0d0-40b8-812c-3a57b41aee98","Type":"ContainerDied","Data":"8fc4f339bf9aa2a0245e4c43fd890e2315110e976b25e56bad0be56f10d8abd4"} Mar 20 08:49:40 crc kubenswrapper[4903]: I0320 08:49:40.792664 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ccedd84e-d0d0-40b8-812c-3a57b41aee98","Type":"ContainerDied","Data":"ac6a6d8fff3600147cf08f30230b0e513eb967f730ef636689918b9d11d7980e"} Mar 20 08:49:40 crc kubenswrapper[4903]: I0320 08:49:40.792683 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ccedd84e-d0d0-40b8-812c-3a57b41aee98","Type":"ContainerDied","Data":"92af1107195bebc153bf9352c2eb36552b3f9f73d72391de8b4a6f21b80a4bb7"} Mar 20 08:49:40 crc kubenswrapper[4903]: I0320 08:49:40.792693 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ccedd84e-d0d0-40b8-812c-3a57b41aee98","Type":"ContainerDied","Data":"b55e63c14895bcdb40403401521699e38dfe6a783a3590019bd4fb0770fe17f2"} Mar 20 08:49:40 crc kubenswrapper[4903]: I0320 08:49:40.792703 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ccedd84e-d0d0-40b8-812c-3a57b41aee98","Type":"ContainerDied","Data":"2dff8db1a526f1e028fa24f1cf0a7a3b6053fb85fa1e7ea943870e339ab8ef45"} Mar 20 08:49:40 crc kubenswrapper[4903]: I0320 08:49:40.798410 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-sb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"8639b665-721c-4dda-afe9-6e84f6f8a574\" (UID: \"8639b665-721c-4dda-afe9-6e84f6f8a574\") " Mar 20 08:49:40 crc kubenswrapper[4903]: I0320 08:49:40.798492 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rk8ft\" (UniqueName: \"kubernetes.io/projected/8639b665-721c-4dda-afe9-6e84f6f8a574-kube-api-access-rk8ft\") pod \"8639b665-721c-4dda-afe9-6e84f6f8a574\" (UID: \"8639b665-721c-4dda-afe9-6e84f6f8a574\") " Mar 20 08:49:40 crc kubenswrapper[4903]: I0320 08:49:40.798565 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8639b665-721c-4dda-afe9-6e84f6f8a574-combined-ca-bundle\") pod \"8639b665-721c-4dda-afe9-6e84f6f8a574\" (UID: \"8639b665-721c-4dda-afe9-6e84f6f8a574\") " Mar 20 08:49:40 crc kubenswrapper[4903]: I0320 08:49:40.798584 4903 generic.go:334] "Generic (PLEG): container finished" podID="bae081ad-c434-4d1b-ae9f-cc8c5b6e2f33" containerID="93597afa34681cad8c7e33fa9d9d2b8edff2db1b4f63723973a392f7d94f6d4d" exitCode=143 Mar 20 08:49:40 crc kubenswrapper[4903]: I0320 08:49:40.798649 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/8639b665-721c-4dda-afe9-6e84f6f8a574-metrics-certs-tls-certs\") pod \"8639b665-721c-4dda-afe9-6e84f6f8a574\" (UID: \"8639b665-721c-4dda-afe9-6e84f6f8a574\") " Mar 20 08:49:40 crc kubenswrapper[4903]: I0320 08:49:40.798711 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/8639b665-721c-4dda-afe9-6e84f6f8a574-ovsdb-rundir\") pod \"8639b665-721c-4dda-afe9-6e84f6f8a574\" (UID: \"8639b665-721c-4dda-afe9-6e84f6f8a574\") " Mar 20 08:49:40 crc kubenswrapper[4903]: I0320 08:49:40.798745 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8639b665-721c-4dda-afe9-6e84f6f8a574-config\") pod \"8639b665-721c-4dda-afe9-6e84f6f8a574\" (UID: \"8639b665-721c-4dda-afe9-6e84f6f8a574\") " Mar 20 08:49:40 crc kubenswrapper[4903]: I0320 08:49:40.798750 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5895dcfdfd-4gs9b" event={"ID":"bae081ad-c434-4d1b-ae9f-cc8c5b6e2f33","Type":"ContainerDied","Data":"93597afa34681cad8c7e33fa9d9d2b8edff2db1b4f63723973a392f7d94f6d4d"} Mar 20 08:49:40 crc kubenswrapper[4903]: I0320 08:49:40.798799 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/8639b665-721c-4dda-afe9-6e84f6f8a574-ovsdbserver-sb-tls-certs\") pod \"8639b665-721c-4dda-afe9-6e84f6f8a574\" (UID: \"8639b665-721c-4dda-afe9-6e84f6f8a574\") " Mar 20 08:49:40 crc kubenswrapper[4903]: I0320 08:49:40.798817 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8639b665-721c-4dda-afe9-6e84f6f8a574-scripts\") pod \"8639b665-721c-4dda-afe9-6e84f6f8a574\" (UID: \"8639b665-721c-4dda-afe9-6e84f6f8a574\") " Mar 20 08:49:40 crc kubenswrapper[4903]: I0320 08:49:40.799247 4903 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f390d60f-9967-4869-b09f-3cea4570186e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:40 crc kubenswrapper[4903]: I0320 08:49:40.799268 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xc7vw\" (UniqueName: \"kubernetes.io/projected/4819c8dc-535a-4fb2-93ed-16eccdf8cd6c-kube-api-access-xc7vw\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:40 crc kubenswrapper[4903]: E0320 08:49:40.799644 4903 handlers.go:78] "Exec lifecycle hook for Container in Pod failed" err=< Mar 20 08:49:40 crc kubenswrapper[4903]: command '/usr/local/bin/container-scripts/stop-ovsdb-server.sh' exited with 137: ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh Mar 20 08:49:40 crc kubenswrapper[4903]: + source /usr/local/bin/container-scripts/functions Mar 20 08:49:40 crc kubenswrapper[4903]: ++ OVNBridge=br-int Mar 20 08:49:40 crc kubenswrapper[4903]: ++ OVNRemote=tcp:localhost:6642 Mar 20 08:49:40 crc kubenswrapper[4903]: ++ OVNEncapType=geneve Mar 20 08:49:40 crc kubenswrapper[4903]: ++ OVNAvailabilityZones= Mar 20 08:49:40 crc kubenswrapper[4903]: ++ EnableChassisAsGateway=true Mar 20 08:49:40 crc kubenswrapper[4903]: ++ PhysicalNetworks= Mar 20 08:49:40 crc kubenswrapper[4903]: ++ OVNHostName= Mar 20 08:49:40 crc kubenswrapper[4903]: ++ DB_FILE=/etc/openvswitch/conf.db Mar 20 08:49:40 crc kubenswrapper[4903]: ++ ovs_dir=/var/lib/openvswitch Mar 20 08:49:40 crc kubenswrapper[4903]: ++ 
FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script Mar 20 08:49:40 crc kubenswrapper[4903]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows Mar 20 08:49:40 crc kubenswrapper[4903]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server Mar 20 08:49:40 crc kubenswrapper[4903]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Mar 20 08:49:40 crc kubenswrapper[4903]: + sleep 0.5 Mar 20 08:49:40 crc kubenswrapper[4903]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Mar 20 08:49:40 crc kubenswrapper[4903]: + sleep 0.5 Mar 20 08:49:40 crc kubenswrapper[4903]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Mar 20 08:49:40 crc kubenswrapper[4903]: + sleep 0.5 Mar 20 08:49:40 crc kubenswrapper[4903]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Mar 20 08:49:40 crc kubenswrapper[4903]: + cleanup_ovsdb_server_semaphore Mar 20 08:49:40 crc kubenswrapper[4903]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server Mar 20 08:49:40 crc kubenswrapper[4903]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd Mar 20 08:49:40 crc kubenswrapper[4903]: > execCommand=["/usr/local/bin/container-scripts/stop-ovsdb-server.sh"] containerName="ovsdb-server" pod="openstack/ovn-controller-ovs-chrhv" message=< Mar 20 08:49:40 crc kubenswrapper[4903]: Exiting ovsdb-server (5) [ OK ] Mar 20 08:49:40 crc kubenswrapper[4903]: ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh Mar 20 08:49:40 crc kubenswrapper[4903]: + source /usr/local/bin/container-scripts/functions Mar 20 08:49:40 crc kubenswrapper[4903]: ++ OVNBridge=br-int Mar 20 08:49:40 crc kubenswrapper[4903]: ++ OVNRemote=tcp:localhost:6642 Mar 20 08:49:40 crc kubenswrapper[4903]: ++ OVNEncapType=geneve Mar 20 08:49:40 crc kubenswrapper[4903]: ++ OVNAvailabilityZones= Mar 20 08:49:40 crc kubenswrapper[4903]: ++ EnableChassisAsGateway=true Mar 20 08:49:40 crc kubenswrapper[4903]: ++ PhysicalNetworks= Mar 20 08:49:40 crc kubenswrapper[4903]: ++ OVNHostName= Mar 20 08:49:40 crc kubenswrapper[4903]: ++ DB_FILE=/etc/openvswitch/conf.db Mar 20 08:49:40 crc kubenswrapper[4903]: ++ ovs_dir=/var/lib/openvswitch Mar 20 08:49:40 crc kubenswrapper[4903]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script Mar 20 08:49:40 crc kubenswrapper[4903]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows Mar 20 08:49:40 crc kubenswrapper[4903]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server Mar 20 08:49:40 crc kubenswrapper[4903]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Mar 20 08:49:40 crc kubenswrapper[4903]: + sleep 0.5 Mar 20 08:49:40 crc kubenswrapper[4903]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Mar 20 08:49:40 crc kubenswrapper[4903]: + sleep 0.5 Mar 20 08:49:40 crc kubenswrapper[4903]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Mar 20 08:49:40 crc kubenswrapper[4903]: + sleep 0.5 Mar 20 08:49:40 crc kubenswrapper[4903]: + '[' '!' 
-f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Mar 20 08:49:40 crc kubenswrapper[4903]: + cleanup_ovsdb_server_semaphore Mar 20 08:49:40 crc kubenswrapper[4903]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server Mar 20 08:49:40 crc kubenswrapper[4903]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd Mar 20 08:49:40 crc kubenswrapper[4903]: > Mar 20 08:49:40 crc kubenswrapper[4903]: E0320 08:49:40.799677 4903 kuberuntime_container.go:691] "PreStop hook failed" err=< Mar 20 08:49:40 crc kubenswrapper[4903]: command '/usr/local/bin/container-scripts/stop-ovsdb-server.sh' exited with 137: ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh Mar 20 08:49:40 crc kubenswrapper[4903]: + source /usr/local/bin/container-scripts/functions Mar 20 08:49:40 crc kubenswrapper[4903]: ++ OVNBridge=br-int Mar 20 08:49:40 crc kubenswrapper[4903]: ++ OVNRemote=tcp:localhost:6642 Mar 20 08:49:40 crc kubenswrapper[4903]: ++ OVNEncapType=geneve Mar 20 08:49:40 crc kubenswrapper[4903]: ++ OVNAvailabilityZones= Mar 20 08:49:40 crc kubenswrapper[4903]: ++ EnableChassisAsGateway=true Mar 20 08:49:40 crc kubenswrapper[4903]: ++ PhysicalNetworks= Mar 20 08:49:40 crc kubenswrapper[4903]: ++ OVNHostName= Mar 20 08:49:40 crc kubenswrapper[4903]: ++ DB_FILE=/etc/openvswitch/conf.db Mar 20 08:49:40 crc kubenswrapper[4903]: ++ ovs_dir=/var/lib/openvswitch Mar 20 08:49:40 crc kubenswrapper[4903]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script Mar 20 08:49:40 crc kubenswrapper[4903]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows Mar 20 08:49:40 crc kubenswrapper[4903]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server Mar 20 08:49:40 crc kubenswrapper[4903]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Mar 20 08:49:40 crc kubenswrapper[4903]: + sleep 0.5 Mar 20 08:49:40 crc kubenswrapper[4903]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Mar 20 08:49:40 crc kubenswrapper[4903]: + sleep 0.5 Mar 20 08:49:40 crc kubenswrapper[4903]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Mar 20 08:49:40 crc kubenswrapper[4903]: + sleep 0.5 Mar 20 08:49:40 crc kubenswrapper[4903]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Mar 20 08:49:40 crc kubenswrapper[4903]: + cleanup_ovsdb_server_semaphore Mar 20 08:49:40 crc kubenswrapper[4903]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server Mar 20 08:49:40 crc kubenswrapper[4903]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd Mar 20 08:49:40 crc kubenswrapper[4903]: > pod="openstack/ovn-controller-ovs-chrhv" podUID="d69915e4-0df8-4d83-b096-962eadc1883f" containerName="ovsdb-server" containerID="cri-o://ec70b5d9df40c275990e6b10b611cf7497dcf53a4d72ca11805f6bf11c5bc677" Mar 20 08:49:40 crc kubenswrapper[4903]: I0320 08:49:40.799710 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-ovs-chrhv" podUID="d69915e4-0df8-4d83-b096-962eadc1883f" containerName="ovsdb-server" containerID="cri-o://ec70b5d9df40c275990e6b10b611cf7497dcf53a4d72ca11805f6bf11c5bc677" gracePeriod=29 Mar 20 08:49:40 crc kubenswrapper[4903]: I0320 08:49:40.799828 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8639b665-721c-4dda-afe9-6e84f6f8a574-scripts" (OuterVolumeSpecName: "scripts") pod "8639b665-721c-4dda-afe9-6e84f6f8a574" (UID: "8639b665-721c-4dda-afe9-6e84f6f8a574"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:49:40 crc kubenswrapper[4903]: I0320 08:49:40.800387 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8639b665-721c-4dda-afe9-6e84f6f8a574-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "8639b665-721c-4dda-afe9-6e84f6f8a574" (UID: "8639b665-721c-4dda-afe9-6e84f6f8a574"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 08:49:40 crc kubenswrapper[4903]: I0320 08:49:40.800785 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8639b665-721c-4dda-afe9-6e84f6f8a574-config" (OuterVolumeSpecName: "config") pod "8639b665-721c-4dda-afe9-6e84f6f8a574" (UID: "8639b665-721c-4dda-afe9-6e84f6f8a574"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:49:40 crc kubenswrapper[4903]: I0320 08:49:40.806374 4903 generic.go:334] "Generic (PLEG): container finished" podID="4819c8dc-535a-4fb2-93ed-16eccdf8cd6c" containerID="476fc647c47bf21b8cc8bf445e61751a911dafeb475bbe3668beb1384e865c9f" exitCode=137 Mar 20 08:49:40 crc kubenswrapper[4903]: I0320 08:49:40.806528 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Mar 20 08:49:40 crc kubenswrapper[4903]: I0320 08:49:40.829630 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8639b665-721c-4dda-afe9-6e84f6f8a574-kube-api-access-rk8ft" (OuterVolumeSpecName: "kube-api-access-rk8ft") pod "8639b665-721c-4dda-afe9-6e84f6f8a574" (UID: "8639b665-721c-4dda-afe9-6e84f6f8a574"). InnerVolumeSpecName "kube-api-access-rk8ft". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:49:40 crc kubenswrapper[4903]: I0320 08:49:40.830217 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "ovndbcluster-sb-etc-ovn") pod "8639b665-721c-4dda-afe9-6e84f6f8a574" (UID: "8639b665-721c-4dda-afe9-6e84f6f8a574"). InnerVolumeSpecName "local-storage02-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 20 08:49:40 crc kubenswrapper[4903]: I0320 08:49:40.841185 4903 generic.go:334] "Generic (PLEG): container finished" podID="7f5f160c-29e2-43d0-bb55-6969904b3a4e" containerID="90c27df17f81d979977526ee2d95fcef8b351eceda695edad6f1dfd00a676827" exitCode=143 Mar 20 08:49:40 crc kubenswrapper[4903]: I0320 08:49:40.841278 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7f5f160c-29e2-43d0-bb55-6969904b3a4e","Type":"ContainerDied","Data":"90c27df17f81d979977526ee2d95fcef8b351eceda695edad6f1dfd00a676827"} Mar 20 08:49:40 crc kubenswrapper[4903]: I0320 08:49:40.855591 4903 generic.go:334] "Generic (PLEG): container finished" podID="f4e1dbd8-6ecd-4cd7-910b-910cf6a45679" containerID="e9d3a5c1b4f80a807d17559887982f63b2f87da65ce30784f6d98d67b9f43363" exitCode=143 Mar 20 08:49:40 crc kubenswrapper[4903]: I0320 08:49:40.855903 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"f4e1dbd8-6ecd-4cd7-910b-910cf6a45679","Type":"ContainerDied","Data":"e9d3a5c1b4f80a807d17559887982f63b2f87da65ce30784f6d98d67b9f43363"} Mar 20 08:49:40 crc kubenswrapper[4903]: I0320 08:49:40.867857 4903 generic.go:334] "Generic (PLEG): container finished" podID="0790ef46-b8b6-4d5e-98a8-06319c232264" containerID="e720dfe8b4aae682033533f903e6ba534df6bbb629a2136de2d974617f1cbb66" exitCode=0 Mar 20 08:49:40 crc kubenswrapper[4903]: I0320 08:49:40.867953 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-fbdb5-dn8w8" event={"ID":"0790ef46-b8b6-4d5e-98a8-06319c232264","Type":"ContainerDied","Data":"e720dfe8b4aae682033533f903e6ba534df6bbb629a2136de2d974617f1cbb66"} Mar 20 08:49:40 crc kubenswrapper[4903]: I0320 08:49:40.870130 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-25f9-account-create-update-9n49h" event={"ID":"2dc6a13a-9844-4e28-93e6-45025f1385a9","Type":"ContainerStarted","Data":"d98eba0071d357caab9e9fa49a5697d3236ce77862c464f07fff3b39743b7c7b"} Mar 20 08:49:40 crc kubenswrapper[4903]: I0320 08:49:40.874136 4903 scope.go:117] "RemoveContainer" containerID="81809a955a2f71afa7a77cb91e2146e2f4c354fdfdadbb9e1e6afb608b8a1b6a" Mar 20 08:49:40 crc kubenswrapper[4903]: I0320 08:49:40.874946 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4819c8dc-535a-4fb2-93ed-16eccdf8cd6c-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "4819c8dc-535a-4fb2-93ed-16eccdf8cd6c" (UID: "4819c8dc-535a-4fb2-93ed-16eccdf8cd6c"). InnerVolumeSpecName "openstack-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:49:40 crc kubenswrapper[4903]: I0320 08:49:40.878902 4903 generic.go:334] "Generic (PLEG): container finished" podID="2c33b2cd-e705-41cd-9e59-3dcbb0a55829" containerID="e7046c98c3546d6698b9ba6c5237b460ed8efe52b7e38c2e607164140f59d3d5" exitCode=143 Mar 20 08:49:40 crc kubenswrapper[4903]: I0320 08:49:40.878970 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2c33b2cd-e705-41cd-9e59-3dcbb0a55829","Type":"ContainerDied","Data":"e7046c98c3546d6698b9ba6c5237b460ed8efe52b7e38c2e607164140f59d3d5"} Mar 20 08:49:40 crc kubenswrapper[4903]: I0320 08:49:40.898642 4903 generic.go:334] "Generic (PLEG): container finished" podID="ff49346f-602e-46f6-91c7-9c1966535720" containerID="87c85c61161fb7331d2c712fe1b022235b32f8c23a7603b13ac75a55bd762f6b" exitCode=143 Mar 20 08:49:40 crc kubenswrapper[4903]: I0320 08:49:40.898951 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7f568974ff-6t26g" event={"ID":"ff49346f-602e-46f6-91c7-9c1966535720","Type":"ContainerDied","Data":"87c85c61161fb7331d2c712fe1b022235b32f8c23a7603b13ac75a55bd762f6b"} Mar 20 08:49:40 crc kubenswrapper[4903]: I0320 08:49:40.903132 4903 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/8639b665-721c-4dda-afe9-6e84f6f8a574-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:40 crc kubenswrapper[4903]: I0320 08:49:40.903160 4903 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8639b665-721c-4dda-afe9-6e84f6f8a574-config\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:40 crc kubenswrapper[4903]: I0320 08:49:40.903171 4903 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8639b665-721c-4dda-afe9-6e84f6f8a574-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:40 crc kubenswrapper[4903]: I0320 08:49:40.903189 4903 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Mar 20 08:49:40 crc kubenswrapper[4903]: I0320 08:49:40.903321 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rk8ft\" (UniqueName: \"kubernetes.io/projected/8639b665-721c-4dda-afe9-6e84f6f8a574-kube-api-access-rk8ft\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:40 crc kubenswrapper[4903]: I0320 08:49:40.903338 4903 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/4819c8dc-535a-4fb2-93ed-16eccdf8cd6c-openstack-config\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:40 crc kubenswrapper[4903]: E0320 08:49:40.903448 4903 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Mar 20 08:49:40 crc kubenswrapper[4903]: E0320 08:49:40.903555 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/888a3fd9-01f8-47b3-b1bb-f2b8b6b96509-config-data podName:888a3fd9-01f8-47b3-b1bb-f2b8b6b96509 nodeName:}" failed. No retries permitted until 2026-03-20 08:49:42.903537323 +0000 UTC m=+1608.120437638 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/888a3fd9-01f8-47b3-b1bb-f2b8b6b96509-config-data") pod "rabbitmq-server-0" (UID: "888a3fd9-01f8-47b3-b1bb-f2b8b6b96509") : configmap "rabbitmq-config-data" not found Mar 20 08:49:40 crc kubenswrapper[4903]: I0320 08:49:40.911218 4903 generic.go:334] "Generic (PLEG): container finished" podID="5e960802-5c0e-4800-853f-e23466958aec" containerID="55fee6c504c86ad76cd2158cdbd3b9a64409472514793f469bb47804fa917188" exitCode=0 Mar 20 08:49:40 crc kubenswrapper[4903]: I0320 08:49:40.911324 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-69dc7db475-m968g" event={"ID":"5e960802-5c0e-4800-853f-e23466958aec","Type":"ContainerDied","Data":"55fee6c504c86ad76cd2158cdbd3b9a64409472514793f469bb47804fa917188"} Mar 20 08:49:40 crc kubenswrapper[4903]: I0320 08:49:40.913307 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-87nxt" event={"ID":"f8c76743-cd0d-48d8-940e-a5e750bd1fcc","Type":"ContainerStarted","Data":"d86832d4b747a1532b2b93a3830dc91d0c0eb7a7b3779e4b1452ef30186095dc"} Mar 20 08:49:40 crc kubenswrapper[4903]: I0320 08:49:40.913667 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b286f9de-1973-4c7f-9350-4d3c31f9c1fb-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "b286f9de-1973-4c7f-9350-4d3c31f9c1fb" (UID: "b286f9de-1973-4c7f-9350-4d3c31f9c1fb"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:49:40 crc kubenswrapper[4903]: I0320 08:49:40.914133 4903 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="openstack/root-account-create-update-87nxt" secret="" err="secret \"galera-openstack-cell1-dockercfg-kdtl4\" not found" Mar 20 08:49:40 crc kubenswrapper[4903]: I0320 08:49:40.917231 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/548096cf-0b33-4f2f-b8be-7d1ac859cf7c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "548096cf-0b33-4f2f-b8be-7d1ac859cf7c" (UID: "548096cf-0b33-4f2f-b8be-7d1ac859cf7c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:49:40 crc kubenswrapper[4903]: E0320 08:49:40.967458 4903 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 08:49:40 crc kubenswrapper[4903]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb:current-podified,Command:[/bin/sh -c #!/bin/bash Mar 20 08:49:40 crc kubenswrapper[4903]: Mar 20 08:49:40 crc kubenswrapper[4903]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Mar 20 08:49:40 crc kubenswrapper[4903]: Mar 20 08:49:40 crc kubenswrapper[4903]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Mar 20 08:49:40 crc kubenswrapper[4903]: Mar 20 08:49:40 crc kubenswrapper[4903]: MYSQL_CMD="mysql -h -u root -P 3306" Mar 20 08:49:40 crc kubenswrapper[4903]: Mar 20 08:49:40 crc kubenswrapper[4903]: if [ -n "" ]; then Mar 20 08:49:40 crc kubenswrapper[4903]: GRANT_DATABASE="" Mar 20 08:49:40 crc kubenswrapper[4903]: else Mar 20 08:49:40 crc kubenswrapper[4903]: GRANT_DATABASE="*" Mar 20 08:49:40 crc kubenswrapper[4903]: fi Mar 20 08:49:40 crc kubenswrapper[4903]: Mar 20 08:49:40 crc kubenswrapper[4903]: # going for maximum compatibility here: Mar 20 08:49:40 crc kubenswrapper[4903]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Mar 20 08:49:40 crc kubenswrapper[4903]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Mar 20 08:49:40 crc kubenswrapper[4903]: # 3. create user with CREATE but then do all password and TLS with ALTER to Mar 20 08:49:40 crc kubenswrapper[4903]: # support updates Mar 20 08:49:40 crc kubenswrapper[4903]: Mar 20 08:49:40 crc kubenswrapper[4903]: $MYSQL_CMD < logger="UnhandledError" Mar 20 08:49:40 crc kubenswrapper[4903]: E0320 08:49:40.968615 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"openstack-cell1-mariadb-root-db-secret\\\" not found\"" pod="openstack/root-account-create-update-87nxt" podUID="f8c76743-cd0d-48d8-940e-a5e750bd1fcc" Mar 20 08:49:40 crc kubenswrapper[4903]: E0320 08:49:40.969966 4903 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 08:49:40 crc kubenswrapper[4903]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb:current-podified,Command:[/bin/sh -c #!/bin/bash Mar 20 08:49:40 crc kubenswrapper[4903]: Mar 20 08:49:40 crc kubenswrapper[4903]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Mar 20 08:49:40 crc kubenswrapper[4903]: Mar 20 08:49:40 crc kubenswrapper[4903]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Mar 20 08:49:40 crc kubenswrapper[4903]: Mar 20 08:49:40 crc kubenswrapper[4903]: MYSQL_CMD="mysql -h -u root -P 3306" Mar 20 08:49:40 crc kubenswrapper[4903]: Mar 20 08:49:40 crc kubenswrapper[4903]: if [ -n "nova_api" ]; then Mar 20 08:49:40 crc kubenswrapper[4903]: GRANT_DATABASE="nova_api" Mar 20 08:49:40 crc kubenswrapper[4903]: else Mar 20 08:49:40 crc kubenswrapper[4903]: GRANT_DATABASE="*" Mar 20 08:49:40 crc kubenswrapper[4903]: fi Mar 20 08:49:40 crc kubenswrapper[4903]: Mar 20 08:49:40 crc kubenswrapper[4903]: # going for maximum compatibility here: Mar 20 08:49:40 crc kubenswrapper[4903]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Mar 20 08:49:40 crc kubenswrapper[4903]: # 2. 
MariaDB has "CREATE OR REPLACE", but MySQL does not Mar 20 08:49:40 crc kubenswrapper[4903]: # 3. create user with CREATE but then do all password and TLS with ALTER to Mar 20 08:49:40 crc kubenswrapper[4903]: # support updates Mar 20 08:49:40 crc kubenswrapper[4903]: Mar 20 08:49:40 crc kubenswrapper[4903]: $MYSQL_CMD < logger="UnhandledError" Mar 20 08:49:40 crc kubenswrapper[4903]: E0320 08:49:40.973634 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"nova-api-db-secret\\\" not found\"" pod="openstack/nova-api-25f9-account-create-update-9n49h" podUID="2dc6a13a-9844-4e28-93e6-45025f1385a9" Mar 20 08:49:40 crc kubenswrapper[4903]: I0320 08:49:40.986489 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4819c8dc-535a-4fb2-93ed-16eccdf8cd6c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4819c8dc-535a-4fb2-93ed-16eccdf8cd6c" (UID: "4819c8dc-535a-4fb2-93ed-16eccdf8cd6c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:49:41 crc kubenswrapper[4903]: E0320 08:49:41.014203 4903 configmap.go:193] Couldn't get configMap openstack/openstack-cell1-scripts: configmap "openstack-cell1-scripts" not found Mar 20 08:49:41 crc kubenswrapper[4903]: E0320 08:49:41.014291 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f8c76743-cd0d-48d8-940e-a5e750bd1fcc-operator-scripts podName:f8c76743-cd0d-48d8-940e-a5e750bd1fcc nodeName:}" failed. No retries permitted until 2026-03-20 08:49:41.514267757 +0000 UTC m=+1606.731168072 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/f8c76743-cd0d-48d8-940e-a5e750bd1fcc-operator-scripts") pod "root-account-create-update-87nxt" (UID: "f8c76743-cd0d-48d8-940e-a5e750bd1fcc") : configmap "openstack-cell1-scripts" not found Mar 20 08:49:41 crc kubenswrapper[4903]: I0320 08:49:41.014443 4903 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b286f9de-1973-4c7f-9350-4d3c31f9c1fb-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:41 crc kubenswrapper[4903]: I0320 08:49:41.014472 4903 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4819c8dc-535a-4fb2-93ed-16eccdf8cd6c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:41 crc kubenswrapper[4903]: I0320 08:49:41.014483 4903 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/548096cf-0b33-4f2f-b8be-7d1ac859cf7c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:41 crc kubenswrapper[4903]: I0320 08:49:41.014575 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8639b665-721c-4dda-afe9-6e84f6f8a574-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8639b665-721c-4dda-afe9-6e84f6f8a574" (UID: "8639b665-721c-4dda-afe9-6e84f6f8a574"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:49:41 crc kubenswrapper[4903]: I0320 08:49:41.034263 4903 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Mar 20 08:49:41 crc kubenswrapper[4903]: I0320 08:49:41.106132 4903 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Mar 20 08:49:41 crc kubenswrapper[4903]: I0320 08:49:41.119299 4903 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8639b665-721c-4dda-afe9-6e84f6f8a574-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:41 crc kubenswrapper[4903]: I0320 08:49:41.119332 4903 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:41 crc kubenswrapper[4903]: I0320 08:49:41.119350 4903 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:41 crc kubenswrapper[4903]: I0320 08:49:41.133454 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f390d60f-9967-4869-b09f-3cea4570186e-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "f390d60f-9967-4869-b09f-3cea4570186e" (UID: "f390d60f-9967-4869-b09f-3cea4570186e"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:49:41 crc kubenswrapper[4903]: I0320 08:49:41.139643 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4819c8dc-535a-4fb2-93ed-16eccdf8cd6c-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "4819c8dc-535a-4fb2-93ed-16eccdf8cd6c" (UID: "4819c8dc-535a-4fb2-93ed-16eccdf8cd6c"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:49:41 crc kubenswrapper[4903]: I0320 08:49:41.174936 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/548096cf-0b33-4f2f-b8be-7d1ac859cf7c-ovsdbserver-nb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-nb-tls-certs") pod "548096cf-0b33-4f2f-b8be-7d1ac859cf7c" (UID: "548096cf-0b33-4f2f-b8be-7d1ac859cf7c"). InnerVolumeSpecName "ovsdbserver-nb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:49:41 crc kubenswrapper[4903]: I0320 08:49:41.201211 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8639b665-721c-4dda-afe9-6e84f6f8a574-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "8639b665-721c-4dda-afe9-6e84f6f8a574" (UID: "8639b665-721c-4dda-afe9-6e84f6f8a574"). InnerVolumeSpecName "metrics-certs-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:49:41 crc kubenswrapper[4903]: I0320 08:49:41.221506 4903 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/4819c8dc-535a-4fb2-93ed-16eccdf8cd6c-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:41 crc kubenswrapper[4903]: I0320 08:49:41.221541 4903 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/548096cf-0b33-4f2f-b8be-7d1ac859cf7c-ovsdbserver-nb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:41 crc kubenswrapper[4903]: I0320 08:49:41.221550 4903 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/8639b665-721c-4dda-afe9-6e84f6f8a574-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:41 crc kubenswrapper[4903]: I0320 08:49:41.221559 4903 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/f390d60f-9967-4869-b09f-3cea4570186e-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:41 crc kubenswrapper[4903]: E0320 08:49:41.221644 4903 projected.go:288] Couldn't get configMap openstack/swift-storage-config-data: configmap "swift-storage-config-data" not found Mar 20 08:49:41 crc kubenswrapper[4903]: E0320 08:49:41.221660 4903 projected.go:263] Couldn't get secret openstack/swift-conf: secret "swift-conf" not found Mar 20 08:49:41 crc kubenswrapper[4903]: E0320 08:49:41.221668 4903 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 20 08:49:41 crc kubenswrapper[4903]: E0320 08:49:41.221679 4903 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: [configmap "swift-storage-config-data" not found, secret "swift-conf" not found, configmap "swift-ring-files" not found] Mar 20 08:49:41 crc kubenswrapper[4903]: E0320 08:49:41.221728 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ccedd84e-d0d0-40b8-812c-3a57b41aee98-etc-swift podName:ccedd84e-d0d0-40b8-812c-3a57b41aee98 nodeName:}" failed. No retries permitted until 2026-03-20 08:49:45.221712233 +0000 UTC m=+1610.438612548 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/ccedd84e-d0d0-40b8-812c-3a57b41aee98-etc-swift") pod "swift-storage-0" (UID: "ccedd84e-d0d0-40b8-812c-3a57b41aee98") : [configmap "swift-storage-config-data" not found, secret "swift-conf" not found, configmap "swift-ring-files" not found] Mar 20 08:49:41 crc kubenswrapper[4903]: I0320 08:49:41.240483 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/548096cf-0b33-4f2f-b8be-7d1ac859cf7c-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "548096cf-0b33-4f2f-b8be-7d1ac859cf7c" (UID: "548096cf-0b33-4f2f-b8be-7d1ac859cf7c"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:49:41 crc kubenswrapper[4903]: I0320 08:49:41.294958 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8639b665-721c-4dda-afe9-6e84f6f8a574-ovsdbserver-sb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-sb-tls-certs") pod "8639b665-721c-4dda-afe9-6e84f6f8a574" (UID: "8639b665-721c-4dda-afe9-6e84f6f8a574"). 
InnerVolumeSpecName "ovsdbserver-sb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:49:41 crc kubenswrapper[4903]: I0320 08:49:41.324735 4903 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/8639b665-721c-4dda-afe9-6e84f6f8a574-ovsdbserver-sb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:41 crc kubenswrapper[4903]: I0320 08:49:41.324762 4903 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/548096cf-0b33-4f2f-b8be-7d1ac859cf7c-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:41 crc kubenswrapper[4903]: I0320 08:49:41.392888 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-7f568974ff-6t26g" Mar 20 08:49:41 crc kubenswrapper[4903]: I0320 08:49:41.395338 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-9bgcj"] Mar 20 08:49:41 crc kubenswrapper[4903]: I0320 08:49:41.404459 4903 scope.go:117] "RemoveContainer" containerID="2c5aaed35251d97c58eb1ac9b7cc60388845eb3364a0b5ed8ae63369e67595d8" Mar 20 08:49:41 crc kubenswrapper[4903]: E0320 08:49:41.404953 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2c5aaed35251d97c58eb1ac9b7cc60388845eb3364a0b5ed8ae63369e67595d8\": container with ID starting with 2c5aaed35251d97c58eb1ac9b7cc60388845eb3364a0b5ed8ae63369e67595d8 not found: ID does not exist" containerID="2c5aaed35251d97c58eb1ac9b7cc60388845eb3364a0b5ed8ae63369e67595d8" Mar 20 08:49:41 crc kubenswrapper[4903]: I0320 08:49:41.404991 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2c5aaed35251d97c58eb1ac9b7cc60388845eb3364a0b5ed8ae63369e67595d8"} err="failed to get container status \"2c5aaed35251d97c58eb1ac9b7cc60388845eb3364a0b5ed8ae63369e67595d8\": rpc error: code = NotFound desc = could not find container \"2c5aaed35251d97c58eb1ac9b7cc60388845eb3364a0b5ed8ae63369e67595d8\": container with ID starting with 2c5aaed35251d97c58eb1ac9b7cc60388845eb3364a0b5ed8ae63369e67595d8 not found: ID does not exist" Mar 20 08:49:41 crc kubenswrapper[4903]: I0320 08:49:41.405024 4903 scope.go:117] "RemoveContainer" containerID="81809a955a2f71afa7a77cb91e2146e2f4c354fdfdadbb9e1e6afb608b8a1b6a" Mar 20 08:49:41 crc kubenswrapper[4903]: E0320 08:49:41.405521 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"81809a955a2f71afa7a77cb91e2146e2f4c354fdfdadbb9e1e6afb608b8a1b6a\": container with ID starting with 81809a955a2f71afa7a77cb91e2146e2f4c354fdfdadbb9e1e6afb608b8a1b6a not found: ID does not exist" containerID="81809a955a2f71afa7a77cb91e2146e2f4c354fdfdadbb9e1e6afb608b8a1b6a" Mar 20 08:49:41 crc kubenswrapper[4903]: I0320 08:49:41.405574 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"81809a955a2f71afa7a77cb91e2146e2f4c354fdfdadbb9e1e6afb608b8a1b6a"} err="failed to get container status \"81809a955a2f71afa7a77cb91e2146e2f4c354fdfdadbb9e1e6afb608b8a1b6a\": rpc error: code = NotFound desc = could not find container \"81809a955a2f71afa7a77cb91e2146e2f4c354fdfdadbb9e1e6afb608b8a1b6a\": container with ID starting with 81809a955a2f71afa7a77cb91e2146e2f4c354fdfdadbb9e1e6afb608b8a1b6a not found: ID does not exist" Mar 20 08:49:41 crc kubenswrapper[4903]: I0320 
08:49:41.405608 4903 scope.go:117] "RemoveContainer" containerID="a4feda7340d51a69a0ade15a8a02097cea235438a9973df73914b9990363b816" Mar 20 08:49:41 crc kubenswrapper[4903]: I0320 08:49:41.406921 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-9bgcj"] Mar 20 08:49:41 crc kubenswrapper[4903]: I0320 08:49:41.446536 4903 scope.go:117] "RemoveContainer" containerID="f3da15fbb37c4bc9e79ae4139cd2ae68204d101f6f4da2f8431701cc743f6de0" Mar 20 08:49:41 crc kubenswrapper[4903]: I0320 08:49:41.456446 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 20 08:49:41 crc kubenswrapper[4903]: I0320 08:49:41.489073 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 20 08:49:41 crc kubenswrapper[4903]: I0320 08:49:41.489835 4903 scope.go:117] "RemoveContainer" containerID="1c1a4fb611dcbe8e6e5c2845b3c1260632a46a79e55826154d5b3126dc0b4a4e" Mar 20 08:49:41 crc kubenswrapper[4903]: I0320 08:49:41.504817 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1642398c-5346-421f-86c2-baec0001304e" path="/var/lib/kubelet/pods/1642398c-5346-421f-86c2-baec0001304e/volumes" Mar 20 08:49:41 crc kubenswrapper[4903]: I0320 08:49:41.505816 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1a1287a5-35f8-4d1a-8a4d-e7b30c957c07" path="/var/lib/kubelet/pods/1a1287a5-35f8-4d1a-8a4d-e7b30c957c07/volumes" Mar 20 08:49:41 crc kubenswrapper[4903]: I0320 08:49:41.506500 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4819c8dc-535a-4fb2-93ed-16eccdf8cd6c" path="/var/lib/kubelet/pods/4819c8dc-535a-4fb2-93ed-16eccdf8cd6c/volumes" Mar 20 08:49:41 crc kubenswrapper[4903]: I0320 08:49:41.507582 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb8fa20-aceb-4b12-b104-e1594993b20c" path="/var/lib/kubelet/pods/4bb8fa20-aceb-4b12-b104-e1594993b20c/volumes" Mar 20 08:49:41 crc kubenswrapper[4903]: I0320 08:49:41.508148 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6214cfc3-afe8-4e2c-aafe-d59d16b108b5" path="/var/lib/kubelet/pods/6214cfc3-afe8-4e2c-aafe-d59d16b108b5/volumes" Mar 20 08:49:41 crc kubenswrapper[4903]: I0320 08:49:41.508923 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bc69384-efae-4c4f-be81-591b2cd17538" path="/var/lib/kubelet/pods/7bc69384-efae-4c4f-be81-591b2cd17538/volumes" Mar 20 08:49:41 crc kubenswrapper[4903]: I0320 08:49:41.509714 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e4d3d5e-2374-4583-8193-bcef8b16110e" path="/var/lib/kubelet/pods/7e4d3d5e-2374-4583-8193-bcef8b16110e/volumes" Mar 20 08:49:41 crc kubenswrapper[4903]: I0320 08:49:41.510817 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8639b665-721c-4dda-afe9-6e84f6f8a574" path="/var/lib/kubelet/pods/8639b665-721c-4dda-afe9-6e84f6f8a574/volumes" Mar 20 08:49:41 crc kubenswrapper[4903]: I0320 08:49:41.511479 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b1b1baed-f9ed-4ec9-8dd0-adc4db771821" path="/var/lib/kubelet/pods/b1b1baed-f9ed-4ec9-8dd0-adc4db771821/volumes" Mar 20 08:49:41 crc kubenswrapper[4903]: I0320 08:49:41.511971 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b286f9de-1973-4c7f-9350-4d3c31f9c1fb" path="/var/lib/kubelet/pods/b286f9de-1973-4c7f-9350-4d3c31f9c1fb/volumes" Mar 20 08:49:41 crc kubenswrapper[4903]: I0320 08:49:41.513273 4903 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="bf0235fb-bbd5-4f7e-8c25-8230f6e48f2f" path="/var/lib/kubelet/pods/bf0235fb-bbd5-4f7e-8c25-8230f6e48f2f/volumes" Mar 20 08:49:41 crc kubenswrapper[4903]: I0320 08:49:41.514024 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bfaa735a-cdf6-40c4-ab7f-42605e13127c" path="/var/lib/kubelet/pods/bfaa735a-cdf6-40c4-ab7f-42605e13127c/volumes" Mar 20 08:49:41 crc kubenswrapper[4903]: I0320 08:49:41.514623 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c6fea6b7-8fd2-42c4-8016-334f6f69c22e" path="/var/lib/kubelet/pods/c6fea6b7-8fd2-42c4-8016-334f6f69c22e/volumes" Mar 20 08:49:41 crc kubenswrapper[4903]: I0320 08:49:41.515783 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e54fecbe-7055-446e-989b-eddbbbfe55a6" path="/var/lib/kubelet/pods/e54fecbe-7055-446e-989b-eddbbbfe55a6/volumes" Mar 20 08:49:41 crc kubenswrapper[4903]: I0320 08:49:41.517454 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fe39352c-cfbc-4c65-8ad2-0e51b78bbec1" path="/var/lib/kubelet/pods/fe39352c-cfbc-4c65-8ad2-0e51b78bbec1/volumes" Mar 20 08:49:41 crc kubenswrapper[4903]: I0320 08:49:41.520961 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 20 08:49:41 crc kubenswrapper[4903]: I0320 08:49:41.520997 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 20 08:49:41 crc kubenswrapper[4903]: I0320 08:49:41.521015 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-metrics-ch9dc"] Mar 20 08:49:41 crc kubenswrapper[4903]: I0320 08:49:41.523643 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-metrics-ch9dc"] Mar 20 08:49:41 crc kubenswrapper[4903]: I0320 08:49:41.527616 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ff49346f-602e-46f6-91c7-9c1966535720-config-data-custom\") pod \"ff49346f-602e-46f6-91c7-9c1966535720\" (UID: \"ff49346f-602e-46f6-91c7-9c1966535720\") " Mar 20 08:49:41 crc kubenswrapper[4903]: I0320 08:49:41.527754 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff49346f-602e-46f6-91c7-9c1966535720-logs\") pod \"ff49346f-602e-46f6-91c7-9c1966535720\" (UID: \"ff49346f-602e-46f6-91c7-9c1966535720\") " Mar 20 08:49:41 crc kubenswrapper[4903]: I0320 08:49:41.527846 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff49346f-602e-46f6-91c7-9c1966535720-config-data\") pod \"ff49346f-602e-46f6-91c7-9c1966535720\" (UID: \"ff49346f-602e-46f6-91c7-9c1966535720\") " Mar 20 08:49:41 crc kubenswrapper[4903]: I0320 08:49:41.527938 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff49346f-602e-46f6-91c7-9c1966535720-combined-ca-bundle\") pod \"ff49346f-602e-46f6-91c7-9c1966535720\" (UID: \"ff49346f-602e-46f6-91c7-9c1966535720\") " Mar 20 08:49:41 crc kubenswrapper[4903]: I0320 08:49:41.527970 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n689p\" (UniqueName: \"kubernetes.io/projected/ff49346f-602e-46f6-91c7-9c1966535720-kube-api-access-n689p\") pod \"ff49346f-602e-46f6-91c7-9c1966535720\" (UID: \"ff49346f-602e-46f6-91c7-9c1966535720\") " 
Mar 20 08:49:41 crc kubenswrapper[4903]: I0320 08:49:41.528480 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ff49346f-602e-46f6-91c7-9c1966535720-logs" (OuterVolumeSpecName: "logs") pod "ff49346f-602e-46f6-91c7-9c1966535720" (UID: "ff49346f-602e-46f6-91c7-9c1966535720"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 08:49:41 crc kubenswrapper[4903]: E0320 08:49:41.528521 4903 configmap.go:193] Couldn't get configMap openstack/openstack-cell1-scripts: configmap "openstack-cell1-scripts" not found Mar 20 08:49:41 crc kubenswrapper[4903]: E0320 08:49:41.528605 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f8c76743-cd0d-48d8-940e-a5e750bd1fcc-operator-scripts podName:f8c76743-cd0d-48d8-940e-a5e750bd1fcc nodeName:}" failed. No retries permitted until 2026-03-20 08:49:42.528556495 +0000 UTC m=+1607.745456810 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/f8c76743-cd0d-48d8-940e-a5e750bd1fcc-operator-scripts") pod "root-account-create-update-87nxt" (UID: "f8c76743-cd0d-48d8-940e-a5e750bd1fcc") : configmap "openstack-cell1-scripts" not found Mar 20 08:49:41 crc kubenswrapper[4903]: I0320 08:49:41.534608 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff49346f-602e-46f6-91c7-9c1966535720-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "ff49346f-602e-46f6-91c7-9c1966535720" (UID: "ff49346f-602e-46f6-91c7-9c1966535720"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:49:41 crc kubenswrapper[4903]: I0320 08:49:41.535493 4903 scope.go:117] "RemoveContainer" containerID="af0bf0783940c5c71ca13e53f3a973ac3692e87a8058eafa8c1eec0382459e15" Mar 20 08:49:41 crc kubenswrapper[4903]: I0320 08:49:41.537042 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff49346f-602e-46f6-91c7-9c1966535720-kube-api-access-n689p" (OuterVolumeSpecName: "kube-api-access-n689p") pod "ff49346f-602e-46f6-91c7-9c1966535720" (UID: "ff49346f-602e-46f6-91c7-9c1966535720"). InnerVolumeSpecName "kube-api-access-n689p". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:49:41 crc kubenswrapper[4903]: I0320 08:49:41.580247 4903 scope.go:117] "RemoveContainer" containerID="16ea7c452104939a759a6fd64f9392015afdf41df82ecdb14f0a860eb6f425c4" Mar 20 08:49:41 crc kubenswrapper[4903]: I0320 08:49:41.589797 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff49346f-602e-46f6-91c7-9c1966535720-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ff49346f-602e-46f6-91c7-9c1966535720" (UID: "ff49346f-602e-46f6-91c7-9c1966535720"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:49:41 crc kubenswrapper[4903]: I0320 08:49:41.602577 4903 scope.go:117] "RemoveContainer" containerID="476fc647c47bf21b8cc8bf445e61751a911dafeb475bbe3668beb1384e865c9f" Mar 20 08:49:41 crc kubenswrapper[4903]: I0320 08:49:41.610564 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff49346f-602e-46f6-91c7-9c1966535720-config-data" (OuterVolumeSpecName: "config-data") pod "ff49346f-602e-46f6-91c7-9c1966535720" (UID: "ff49346f-602e-46f6-91c7-9c1966535720"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:49:41 crc kubenswrapper[4903]: I0320 08:49:41.614738 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 20 08:49:41 crc kubenswrapper[4903]: I0320 08:49:41.627090 4903 scope.go:117] "RemoveContainer" containerID="476fc647c47bf21b8cc8bf445e61751a911dafeb475bbe3668beb1384e865c9f" Mar 20 08:49:41 crc kubenswrapper[4903]: E0320 08:49:41.628320 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"476fc647c47bf21b8cc8bf445e61751a911dafeb475bbe3668beb1384e865c9f\": container with ID starting with 476fc647c47bf21b8cc8bf445e61751a911dafeb475bbe3668beb1384e865c9f not found: ID does not exist" containerID="476fc647c47bf21b8cc8bf445e61751a911dafeb475bbe3668beb1384e865c9f" Mar 20 08:49:41 crc kubenswrapper[4903]: I0320 08:49:41.628352 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"476fc647c47bf21b8cc8bf445e61751a911dafeb475bbe3668beb1384e865c9f"} err="failed to get container status \"476fc647c47bf21b8cc8bf445e61751a911dafeb475bbe3668beb1384e865c9f\": rpc error: code = NotFound desc = could not find container \"476fc647c47bf21b8cc8bf445e61751a911dafeb475bbe3668beb1384e865c9f\": container with ID starting with 476fc647c47bf21b8cc8bf445e61751a911dafeb475bbe3668beb1384e865c9f not found: ID does not exist" Mar 20 08:49:41 crc kubenswrapper[4903]: I0320 08:49:41.631602 4903 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff49346f-602e-46f6-91c7-9c1966535720-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:41 crc kubenswrapper[4903]: I0320 08:49:41.631652 4903 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff49346f-602e-46f6-91c7-9c1966535720-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:41 crc kubenswrapper[4903]: I0320 08:49:41.631672 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n689p\" (UniqueName: \"kubernetes.io/projected/ff49346f-602e-46f6-91c7-9c1966535720-kube-api-access-n689p\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:41 crc kubenswrapper[4903]: I0320 08:49:41.631687 4903 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ff49346f-602e-46f6-91c7-9c1966535720-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:41 crc kubenswrapper[4903]: I0320 08:49:41.631699 4903 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff49346f-602e-46f6-91c7-9c1966535720-logs\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:41 crc kubenswrapper[4903]: I0320 08:49:41.710514 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-69dc7db475-m968g" Mar 20 08:49:41 crc kubenswrapper[4903]: I0320 08:49:41.732811 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/3cdd4833-7200-46c0-9bb4-1b18c7828044-vencrypt-tls-certs\") pod \"3cdd4833-7200-46c0-9bb4-1b18c7828044\" (UID: \"3cdd4833-7200-46c0-9bb4-1b18c7828044\") " Mar 20 08:49:41 crc kubenswrapper[4903]: I0320 08:49:41.732879 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/3cdd4833-7200-46c0-9bb4-1b18c7828044-nova-novncproxy-tls-certs\") pod \"3cdd4833-7200-46c0-9bb4-1b18c7828044\" (UID: \"3cdd4833-7200-46c0-9bb4-1b18c7828044\") " Mar 20 08:49:41 crc kubenswrapper[4903]: I0320 08:49:41.732928 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3cdd4833-7200-46c0-9bb4-1b18c7828044-combined-ca-bundle\") pod \"3cdd4833-7200-46c0-9bb4-1b18c7828044\" (UID: \"3cdd4833-7200-46c0-9bb4-1b18c7828044\") " Mar 20 08:49:41 crc kubenswrapper[4903]: I0320 08:49:41.733005 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p2qzc\" (UniqueName: \"kubernetes.io/projected/3cdd4833-7200-46c0-9bb4-1b18c7828044-kube-api-access-p2qzc\") pod \"3cdd4833-7200-46c0-9bb4-1b18c7828044\" (UID: \"3cdd4833-7200-46c0-9bb4-1b18c7828044\") " Mar 20 08:49:41 crc kubenswrapper[4903]: I0320 08:49:41.733146 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3cdd4833-7200-46c0-9bb4-1b18c7828044-config-data\") pod \"3cdd4833-7200-46c0-9bb4-1b18c7828044\" (UID: \"3cdd4833-7200-46c0-9bb4-1b18c7828044\") " Mar 20 08:49:41 crc kubenswrapper[4903]: I0320 08:49:41.741828 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cdd4833-7200-46c0-9bb4-1b18c7828044-kube-api-access-p2qzc" (OuterVolumeSpecName: "kube-api-access-p2qzc") pod "3cdd4833-7200-46c0-9bb4-1b18c7828044" (UID: "3cdd4833-7200-46c0-9bb4-1b18c7828044"). InnerVolumeSpecName "kube-api-access-p2qzc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:49:41 crc kubenswrapper[4903]: I0320 08:49:41.765390 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 20 08:49:41 crc kubenswrapper[4903]: I0320 08:49:41.766021 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3cdd4833-7200-46c0-9bb4-1b18c7828044-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3cdd4833-7200-46c0-9bb4-1b18c7828044" (UID: "3cdd4833-7200-46c0-9bb4-1b18c7828044"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:49:41 crc kubenswrapper[4903]: I0320 08:49:41.788551 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3cdd4833-7200-46c0-9bb4-1b18c7828044-config-data" (OuterVolumeSpecName: "config-data") pod "3cdd4833-7200-46c0-9bb4-1b18c7828044" (UID: "3cdd4833-7200-46c0-9bb4-1b18c7828044"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:49:41 crc kubenswrapper[4903]: I0320 08:49:41.799307 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3cdd4833-7200-46c0-9bb4-1b18c7828044-vencrypt-tls-certs" (OuterVolumeSpecName: "vencrypt-tls-certs") pod "3cdd4833-7200-46c0-9bb4-1b18c7828044" (UID: "3cdd4833-7200-46c0-9bb4-1b18c7828044"). InnerVolumeSpecName "vencrypt-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:49:41 crc kubenswrapper[4903]: I0320 08:49:41.821936 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3cdd4833-7200-46c0-9bb4-1b18c7828044-nova-novncproxy-tls-certs" (OuterVolumeSpecName: "nova-novncproxy-tls-certs") pod "3cdd4833-7200-46c0-9bb4-1b18c7828044" (UID: "3cdd4833-7200-46c0-9bb4-1b18c7828044"). InnerVolumeSpecName "nova-novncproxy-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:49:41 crc kubenswrapper[4903]: I0320 08:49:41.837713 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5e960802-5c0e-4800-853f-e23466958aec-public-tls-certs\") pod \"5e960802-5c0e-4800-853f-e23466958aec\" (UID: \"5e960802-5c0e-4800-853f-e23466958aec\") " Mar 20 08:49:41 crc kubenswrapper[4903]: I0320 08:49:41.837853 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5e960802-5c0e-4800-853f-e23466958aec-log-httpd\") pod \"5e960802-5c0e-4800-853f-e23466958aec\" (UID: \"5e960802-5c0e-4800-853f-e23466958aec\") " Mar 20 08:49:41 crc kubenswrapper[4903]: I0320 08:49:41.837873 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5e960802-5c0e-4800-853f-e23466958aec-run-httpd\") pod \"5e960802-5c0e-4800-853f-e23466958aec\" (UID: \"5e960802-5c0e-4800-853f-e23466958aec\") " Mar 20 08:49:41 crc kubenswrapper[4903]: I0320 08:49:41.837920 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/5e960802-5c0e-4800-853f-e23466958aec-etc-swift\") pod \"5e960802-5c0e-4800-853f-e23466958aec\" (UID: \"5e960802-5c0e-4800-853f-e23466958aec\") " Mar 20 08:49:41 crc kubenswrapper[4903]: I0320 08:49:41.837949 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5e960802-5c0e-4800-853f-e23466958aec-internal-tls-certs\") pod \"5e960802-5c0e-4800-853f-e23466958aec\" (UID: \"5e960802-5c0e-4800-853f-e23466958aec\") " Mar 20 08:49:41 crc kubenswrapper[4903]: I0320 08:49:41.837991 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e960802-5c0e-4800-853f-e23466958aec-combined-ca-bundle\") pod \"5e960802-5c0e-4800-853f-e23466958aec\" (UID: \"5e960802-5c0e-4800-853f-e23466958aec\") " Mar 20 08:49:41 crc kubenswrapper[4903]: I0320 08:49:41.838052 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e960802-5c0e-4800-853f-e23466958aec-config-data\") pod \"5e960802-5c0e-4800-853f-e23466958aec\" (UID: \"5e960802-5c0e-4800-853f-e23466958aec\") " Mar 20 08:49:41 crc kubenswrapper[4903]: I0320 08:49:41.838073 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"kube-api-access-9p7vz\" (UniqueName: \"kubernetes.io/projected/5e960802-5c0e-4800-853f-e23466958aec-kube-api-access-9p7vz\") pod \"5e960802-5c0e-4800-853f-e23466958aec\" (UID: \"5e960802-5c0e-4800-853f-e23466958aec\") " Mar 20 08:49:41 crc kubenswrapper[4903]: I0320 08:49:41.838496 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5e960802-5c0e-4800-853f-e23466958aec-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "5e960802-5c0e-4800-853f-e23466958aec" (UID: "5e960802-5c0e-4800-853f-e23466958aec"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 08:49:41 crc kubenswrapper[4903]: I0320 08:49:41.838533 4903 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3cdd4833-7200-46c0-9bb4-1b18c7828044-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:41 crc kubenswrapper[4903]: I0320 08:49:41.838607 4903 reconciler_common.go:293] "Volume detached for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/3cdd4833-7200-46c0-9bb4-1b18c7828044-vencrypt-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:41 crc kubenswrapper[4903]: I0320 08:49:41.838626 4903 reconciler_common.go:293] "Volume detached for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/3cdd4833-7200-46c0-9bb4-1b18c7828044-nova-novncproxy-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:41 crc kubenswrapper[4903]: I0320 08:49:41.838643 4903 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3cdd4833-7200-46c0-9bb4-1b18c7828044-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:41 crc kubenswrapper[4903]: I0320 08:49:41.838654 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p2qzc\" (UniqueName: \"kubernetes.io/projected/3cdd4833-7200-46c0-9bb4-1b18c7828044-kube-api-access-p2qzc\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:41 crc kubenswrapper[4903]: I0320 08:49:41.842452 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5e960802-5c0e-4800-853f-e23466958aec-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "5e960802-5c0e-4800-853f-e23466958aec" (UID: "5e960802-5c0e-4800-853f-e23466958aec"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 08:49:41 crc kubenswrapper[4903]: I0320 08:49:41.844178 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e960802-5c0e-4800-853f-e23466958aec-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "5e960802-5c0e-4800-853f-e23466958aec" (UID: "5e960802-5c0e-4800-853f-e23466958aec"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:49:41 crc kubenswrapper[4903]: I0320 08:49:41.844442 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e960802-5c0e-4800-853f-e23466958aec-kube-api-access-9p7vz" (OuterVolumeSpecName: "kube-api-access-9p7vz") pod "5e960802-5c0e-4800-853f-e23466958aec" (UID: "5e960802-5c0e-4800-853f-e23466958aec"). InnerVolumeSpecName "kube-api-access-9p7vz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:49:41 crc kubenswrapper[4903]: I0320 08:49:41.898836 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e960802-5c0e-4800-853f-e23466958aec-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "5e960802-5c0e-4800-853f-e23466958aec" (UID: "5e960802-5c0e-4800-853f-e23466958aec"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:49:41 crc kubenswrapper[4903]: E0320 08:49:41.935485 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5e960802-5c0e-4800-853f-e23466958aec-combined-ca-bundle podName:5e960802-5c0e-4800-853f-e23466958aec nodeName:}" failed. No retries permitted until 2026-03-20 08:49:42.43544068 +0000 UTC m=+1607.652340995 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/5e960802-5c0e-4800-853f-e23466958aec-combined-ca-bundle") pod "5e960802-5c0e-4800-853f-e23466958aec" (UID: "5e960802-5c0e-4800-853f-e23466958aec") : error deleting /var/lib/kubelet/pods/5e960802-5c0e-4800-853f-e23466958aec/volume-subpaths: remove /var/lib/kubelet/pods/5e960802-5c0e-4800-853f-e23466958aec/volume-subpaths: no such file or directory Mar 20 08:49:41 crc kubenswrapper[4903]: I0320 08:49:41.937494 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e960802-5c0e-4800-853f-e23466958aec-config-data" (OuterVolumeSpecName: "config-data") pod "5e960802-5c0e-4800-853f-e23466958aec" (UID: "5e960802-5c0e-4800-853f-e23466958aec"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:49:41 crc kubenswrapper[4903]: I0320 08:49:41.939647 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6e4027bc-3929-4b8b-9538-ab67f779558c-operator-scripts\") pod \"6e4027bc-3929-4b8b-9538-ab67f779558c\" (UID: \"6e4027bc-3929-4b8b-9538-ab67f779558c\") " Mar 20 08:49:41 crc kubenswrapper[4903]: I0320 08:49:41.939712 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e4027bc-3929-4b8b-9538-ab67f779558c-combined-ca-bundle\") pod \"6e4027bc-3929-4b8b-9538-ab67f779558c\" (UID: \"6e4027bc-3929-4b8b-9538-ab67f779558c\") " Mar 20 08:49:41 crc kubenswrapper[4903]: I0320 08:49:41.939759 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dw5bc\" (UniqueName: \"kubernetes.io/projected/6e4027bc-3929-4b8b-9538-ab67f779558c-kube-api-access-dw5bc\") pod \"6e4027bc-3929-4b8b-9538-ab67f779558c\" (UID: \"6e4027bc-3929-4b8b-9538-ab67f779558c\") " Mar 20 08:49:41 crc kubenswrapper[4903]: I0320 08:49:41.939833 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"6e4027bc-3929-4b8b-9538-ab67f779558c\" (UID: \"6e4027bc-3929-4b8b-9538-ab67f779558c\") " Mar 20 08:49:41 crc kubenswrapper[4903]: I0320 08:49:41.939952 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/6e4027bc-3929-4b8b-9538-ab67f779558c-kolla-config\") pod \"6e4027bc-3929-4b8b-9538-ab67f779558c\" (UID: \"6e4027bc-3929-4b8b-9538-ab67f779558c\") " Mar 20 08:49:41 crc 
kubenswrapper[4903]: I0320 08:49:41.939978 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/6e4027bc-3929-4b8b-9538-ab67f779558c-config-data-default\") pod \"6e4027bc-3929-4b8b-9538-ab67f779558c\" (UID: \"6e4027bc-3929-4b8b-9538-ab67f779558c\") " Mar 20 08:49:41 crc kubenswrapper[4903]: I0320 08:49:41.940008 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/6e4027bc-3929-4b8b-9538-ab67f779558c-galera-tls-certs\") pod \"6e4027bc-3929-4b8b-9538-ab67f779558c\" (UID: \"6e4027bc-3929-4b8b-9538-ab67f779558c\") " Mar 20 08:49:41 crc kubenswrapper[4903]: I0320 08:49:41.940128 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/6e4027bc-3929-4b8b-9538-ab67f779558c-config-data-generated\") pod \"6e4027bc-3929-4b8b-9538-ab67f779558c\" (UID: \"6e4027bc-3929-4b8b-9538-ab67f779558c\") " Mar 20 08:49:41 crc kubenswrapper[4903]: I0320 08:49:41.940537 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6e4027bc-3929-4b8b-9538-ab67f779558c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6e4027bc-3929-4b8b-9538-ab67f779558c" (UID: "6e4027bc-3929-4b8b-9538-ab67f779558c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:49:41 crc kubenswrapper[4903]: I0320 08:49:41.940954 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6e4027bc-3929-4b8b-9538-ab67f779558c-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "6e4027bc-3929-4b8b-9538-ab67f779558c" (UID: "6e4027bc-3929-4b8b-9538-ab67f779558c"). InnerVolumeSpecName "config-data-generated". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 08:49:41 crc kubenswrapper[4903]: I0320 08:49:41.941442 4903 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e960802-5c0e-4800-853f-e23466958aec-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:41 crc kubenswrapper[4903]: I0320 08:49:41.941467 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9p7vz\" (UniqueName: \"kubernetes.io/projected/5e960802-5c0e-4800-853f-e23466958aec-kube-api-access-9p7vz\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:41 crc kubenswrapper[4903]: I0320 08:49:41.941481 4903 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/6e4027bc-3929-4b8b-9538-ab67f779558c-config-data-generated\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:41 crc kubenswrapper[4903]: I0320 08:49:41.941494 4903 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5e960802-5c0e-4800-853f-e23466958aec-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:41 crc kubenswrapper[4903]: I0320 08:49:41.941506 4903 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5e960802-5c0e-4800-853f-e23466958aec-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:41 crc kubenswrapper[4903]: I0320 08:49:41.941518 4903 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6e4027bc-3929-4b8b-9538-ab67f779558c-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:41 crc kubenswrapper[4903]: I0320 08:49:41.941528 4903 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/5e960802-5c0e-4800-853f-e23466958aec-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:41 crc kubenswrapper[4903]: I0320 08:49:41.941543 4903 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5e960802-5c0e-4800-853f-e23466958aec-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:41 crc kubenswrapper[4903]: I0320 08:49:41.941490 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6e4027bc-3929-4b8b-9538-ab67f779558c-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "6e4027bc-3929-4b8b-9538-ab67f779558c" (UID: "6e4027bc-3929-4b8b-9538-ab67f779558c"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:49:41 crc kubenswrapper[4903]: E0320 08:49:41.941562 4903 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Mar 20 08:49:41 crc kubenswrapper[4903]: E0320 08:49:41.941623 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/df937948-08c4-447c-9450-07221ce76552-config-data podName:df937948-08c4-447c-9450-07221ce76552 nodeName:}" failed. No retries permitted until 2026-03-20 08:49:45.941607061 +0000 UTC m=+1611.158507376 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/df937948-08c4-447c-9450-07221ce76552-config-data") pod "rabbitmq-cell1-server-0" (UID: "df937948-08c4-447c-9450-07221ce76552") : configmap "rabbitmq-cell1-config-data" not found Mar 20 08:49:41 crc kubenswrapper[4903]: I0320 08:49:41.941961 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6e4027bc-3929-4b8b-9538-ab67f779558c-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "6e4027bc-3929-4b8b-9538-ab67f779558c" (UID: "6e4027bc-3929-4b8b-9538-ab67f779558c"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:49:41 crc kubenswrapper[4903]: I0320 08:49:41.947919 4903 generic.go:334] "Generic (PLEG): container finished" podID="6e4027bc-3929-4b8b-9538-ab67f779558c" containerID="fafa3d8b5242ec379ddf7f9373ebb300af0123e85fab400fdab43665a984905a" exitCode=0 Mar 20 08:49:41 crc kubenswrapper[4903]: I0320 08:49:41.948002 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"6e4027bc-3929-4b8b-9538-ab67f779558c","Type":"ContainerDied","Data":"fafa3d8b5242ec379ddf7f9373ebb300af0123e85fab400fdab43665a984905a"} Mar 20 08:49:41 crc kubenswrapper[4903]: I0320 08:49:41.948061 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"6e4027bc-3929-4b8b-9538-ab67f779558c","Type":"ContainerDied","Data":"fa3b33570168c7b6cfde01bb87b4f92251ca9fed546cc635ba6b71bd9474b4f9"} Mar 20 08:49:41 crc kubenswrapper[4903]: I0320 08:49:41.948091 4903 scope.go:117] "RemoveContainer" containerID="fafa3d8b5242ec379ddf7f9373ebb300af0123e85fab400fdab43665a984905a" Mar 20 08:49:41 crc kubenswrapper[4903]: I0320 08:49:41.948251 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 20 08:49:41 crc kubenswrapper[4903]: I0320 08:49:41.951047 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e4027bc-3929-4b8b-9538-ab67f779558c-kube-api-access-dw5bc" (OuterVolumeSpecName: "kube-api-access-dw5bc") pod "6e4027bc-3929-4b8b-9538-ab67f779558c" (UID: "6e4027bc-3929-4b8b-9538-ab67f779558c"). InnerVolumeSpecName "kube-api-access-dw5bc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:49:41 crc kubenswrapper[4903]: I0320 08:49:41.952524 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e960802-5c0e-4800-853f-e23466958aec-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "5e960802-5c0e-4800-853f-e23466958aec" (UID: "5e960802-5c0e-4800-853f-e23466958aec"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:49:41 crc kubenswrapper[4903]: I0320 08:49:41.977794 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e4027bc-3929-4b8b-9538-ab67f779558c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6e4027bc-3929-4b8b-9538-ab67f779558c" (UID: "6e4027bc-3929-4b8b-9538-ab67f779558c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:49:41 crc kubenswrapper[4903]: I0320 08:49:41.980771 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "mysql-db") pod "6e4027bc-3929-4b8b-9538-ab67f779558c" (UID: "6e4027bc-3929-4b8b-9538-ab67f779558c"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 20 08:49:41 crc kubenswrapper[4903]: I0320 08:49:41.983778 4903 generic.go:334] "Generic (PLEG): container finished" podID="ff49346f-602e-46f6-91c7-9c1966535720" containerID="386de03bb82a797be770d2b2659f1cc5692f78f475e3093729522f4425bfcab2" exitCode=0 Mar 20 08:49:41 crc kubenswrapper[4903]: I0320 08:49:41.983912 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7f568974ff-6t26g" event={"ID":"ff49346f-602e-46f6-91c7-9c1966535720","Type":"ContainerDied","Data":"386de03bb82a797be770d2b2659f1cc5692f78f475e3093729522f4425bfcab2"} Mar 20 08:49:41 crc kubenswrapper[4903]: I0320 08:49:41.983948 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7f568974ff-6t26g" event={"ID":"ff49346f-602e-46f6-91c7-9c1966535720","Type":"ContainerDied","Data":"308c4a6d0714cf150a805fb224c9dececa0d7e873ed3e8018dba7f6a88b23ec4"} Mar 20 08:49:41 crc kubenswrapper[4903]: I0320 08:49:41.984217 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-7f568974ff-6t26g" Mar 20 08:49:42 crc kubenswrapper[4903]: I0320 08:49:42.001525 4903 generic.go:334] "Generic (PLEG): container finished" podID="5e960802-5c0e-4800-853f-e23466958aec" containerID="2ef5305e3831d627a2b55f9057395c70bf2be1d808f10700fa8c205c332a281a" exitCode=0 Mar 20 08:49:42 crc kubenswrapper[4903]: I0320 08:49:42.001702 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-69dc7db475-m968g" Mar 20 08:49:42 crc kubenswrapper[4903]: I0320 08:49:42.001773 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-69dc7db475-m968g" event={"ID":"5e960802-5c0e-4800-853f-e23466958aec","Type":"ContainerDied","Data":"2ef5305e3831d627a2b55f9057395c70bf2be1d808f10700fa8c205c332a281a"} Mar 20 08:49:42 crc kubenswrapper[4903]: I0320 08:49:42.001821 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-69dc7db475-m968g" event={"ID":"5e960802-5c0e-4800-853f-e23466958aec","Type":"ContainerDied","Data":"2a70810d257fa13ad9dfd79a6bcf06a2ef6a3ede083f1f92c82d401d3e420d31"} Mar 20 08:49:42 crc kubenswrapper[4903]: I0320 08:49:42.007981 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e4027bc-3929-4b8b-9538-ab67f779558c-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "6e4027bc-3929-4b8b-9538-ab67f779558c" (UID: "6e4027bc-3929-4b8b-9538-ab67f779558c"). InnerVolumeSpecName "galera-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:49:42 crc kubenswrapper[4903]: I0320 08:49:42.014957 4903 generic.go:334] "Generic (PLEG): container finished" podID="ccedd84e-d0d0-40b8-812c-3a57b41aee98" containerID="ab944c856301685ebd77b2e7fef75ffca3eac6f59b30dda9a10daef1b8c70e88" exitCode=0 Mar 20 08:49:42 crc kubenswrapper[4903]: I0320 08:49:42.015081 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ccedd84e-d0d0-40b8-812c-3a57b41aee98","Type":"ContainerDied","Data":"ab944c856301685ebd77b2e7fef75ffca3eac6f59b30dda9a10daef1b8c70e88"} Mar 20 08:49:42 crc kubenswrapper[4903]: I0320 08:49:42.017589 4903 generic.go:334] "Generic (PLEG): container finished" podID="9c2cbf77-9511-47a0-9bc7-dd5eba7d8d8a" containerID="4cb95c6e5180c8a94f8474533f104c8ca5edf99bbf830ed9f71c73b944a44ab1" exitCode=0 Mar 20 08:49:42 crc kubenswrapper[4903]: I0320 08:49:42.017704 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"9c2cbf77-9511-47a0-9bc7-dd5eba7d8d8a","Type":"ContainerDied","Data":"4cb95c6e5180c8a94f8474533f104c8ca5edf99bbf830ed9f71c73b944a44ab1"} Mar 20 08:49:42 crc kubenswrapper[4903]: I0320 08:49:42.024173 4903 generic.go:334] "Generic (PLEG): container finished" podID="3cdd4833-7200-46c0-9bb4-1b18c7828044" containerID="9e4b7aea89c42ee0018766efdc81e4728d5aebd15f02ee33fd42c1b099309077" exitCode=0 Mar 20 08:49:42 crc kubenswrapper[4903]: I0320 08:49:42.024237 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"3cdd4833-7200-46c0-9bb4-1b18c7828044","Type":"ContainerDied","Data":"9e4b7aea89c42ee0018766efdc81e4728d5aebd15f02ee33fd42c1b099309077"} Mar 20 08:49:42 crc kubenswrapper[4903]: I0320 08:49:42.024315 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"3cdd4833-7200-46c0-9bb4-1b18c7828044","Type":"ContainerDied","Data":"a986c4bf7d083fcdad9e045170a88b4a1623b9e97068ea6e28387ae7d9354d6a"} Mar 20 08:49:42 crc kubenswrapper[4903]: I0320 08:49:42.024941 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 20 08:49:42 crc kubenswrapper[4903]: I0320 08:49:42.049411 4903 generic.go:334] "Generic (PLEG): container finished" podID="d69915e4-0df8-4d83-b096-962eadc1883f" containerID="ec70b5d9df40c275990e6b10b611cf7497dcf53a4d72ca11805f6bf11c5bc677" exitCode=0 Mar 20 08:49:42 crc kubenswrapper[4903]: I0320 08:49:42.049553 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-chrhv" event={"ID":"d69915e4-0df8-4d83-b096-962eadc1883f","Type":"ContainerDied","Data":"ec70b5d9df40c275990e6b10b611cf7497dcf53a4d72ca11805f6bf11c5bc677"} Mar 20 08:49:42 crc kubenswrapper[4903]: I0320 08:49:42.069706 4903 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5e960802-5c0e-4800-853f-e23466958aec-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:42 crc kubenswrapper[4903]: I0320 08:49:42.069766 4903 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e4027bc-3929-4b8b-9538-ab67f779558c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:42 crc kubenswrapper[4903]: I0320 08:49:42.069792 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dw5bc\" (UniqueName: \"kubernetes.io/projected/6e4027bc-3929-4b8b-9538-ab67f779558c-kube-api-access-dw5bc\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:42 crc kubenswrapper[4903]: I0320 08:49:42.069844 4903 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Mar 20 08:49:42 crc kubenswrapper[4903]: I0320 08:49:42.069858 4903 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/6e4027bc-3929-4b8b-9538-ab67f779558c-kolla-config\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:42 crc kubenswrapper[4903]: I0320 08:49:42.069880 4903 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/6e4027bc-3929-4b8b-9538-ab67f779558c-config-data-default\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:42 crc kubenswrapper[4903]: I0320 08:49:42.069892 4903 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/6e4027bc-3929-4b8b-9538-ab67f779558c-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:42 crc kubenswrapper[4903]: E0320 08:49:42.088382 4903 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ec70b5d9df40c275990e6b10b611cf7497dcf53a4d72ca11805f6bf11c5bc677 is running failed: container process not found" containerID="ec70b5d9df40c275990e6b10b611cf7497dcf53a4d72ca11805f6bf11c5bc677" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 20 08:49:42 crc kubenswrapper[4903]: E0320 08:49:42.089799 4903 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="cbe32b8386815ecd924f4abbb568339471c7efea68305c548d14baa5ac7f2324" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 20 08:49:42 crc kubenswrapper[4903]: E0320 08:49:42.102882 4903 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc 
= container is not created or running: checking if PID of ec70b5d9df40c275990e6b10b611cf7497dcf53a4d72ca11805f6bf11c5bc677 is running failed: container process not found" containerID="ec70b5d9df40c275990e6b10b611cf7497dcf53a4d72ca11805f6bf11c5bc677" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 20 08:49:42 crc kubenswrapper[4903]: E0320 08:49:42.104460 4903 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ec70b5d9df40c275990e6b10b611cf7497dcf53a4d72ca11805f6bf11c5bc677 is running failed: container process not found" containerID="ec70b5d9df40c275990e6b10b611cf7497dcf53a4d72ca11805f6bf11c5bc677" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 20 08:49:42 crc kubenswrapper[4903]: E0320 08:49:42.104524 4903 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ec70b5d9df40c275990e6b10b611cf7497dcf53a4d72ca11805f6bf11c5bc677 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-chrhv" podUID="d69915e4-0df8-4d83-b096-962eadc1883f" containerName="ovsdb-server" Mar 20 08:49:42 crc kubenswrapper[4903]: E0320 08:49:42.105272 4903 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="cbe32b8386815ecd924f4abbb568339471c7efea68305c548d14baa5ac7f2324" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 20 08:49:42 crc kubenswrapper[4903]: I0320 08:49:42.115374 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-7f568974ff-6t26g"] Mar 20 08:49:42 crc kubenswrapper[4903]: E0320 08:49:42.115671 4903 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="cbe32b8386815ecd924f4abbb568339471c7efea68305c548d14baa5ac7f2324" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 20 08:49:42 crc kubenswrapper[4903]: E0320 08:49:42.115763 4903 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-chrhv" podUID="d69915e4-0df8-4d83-b096-962eadc1883f" containerName="ovs-vswitchd" Mar 20 08:49:42 crc kubenswrapper[4903]: I0320 08:49:42.119793 4903 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Mar 20 08:49:42 crc kubenswrapper[4903]: I0320 08:49:42.138160 4903 scope.go:117] "RemoveContainer" containerID="35a21656eef5a0292c24fe38368da2669464d638035c6919ad8e6d647c9b4b2f" Mar 20 08:49:42 crc kubenswrapper[4903]: I0320 08:49:42.145217 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-worker-7f568974ff-6t26g"] Mar 20 08:49:42 crc kubenswrapper[4903]: I0320 08:49:42.172218 4903 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:42 crc kubenswrapper[4903]: I0320 08:49:42.181642 4903 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openstack/ovn-controller-wdtrn" podUID="7bbbd0a7-f915-4197-bde8-4f96590c454f" containerName="ovn-controller" probeResult="failure" output="" Mar 20 08:49:42 crc kubenswrapper[4903]: I0320 08:49:42.183220 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 20 08:49:42 crc kubenswrapper[4903]: E0320 08:49:42.184646 4903 handlers.go:78] "Exec lifecycle hook for Container in Pod failed" err=< Mar 20 08:49:42 crc kubenswrapper[4903]: command '/usr/share/ovn/scripts/ovn-ctl stop_controller' exited with 137: 2026-03-20T08:49:40Z|00001|fatal_signal|WARN|terminating with signal 14 (Alarm clock) Mar 20 08:49:42 crc kubenswrapper[4903]: /etc/init.d/functions: line 589: 470 Alarm clock "$@" Mar 20 08:49:42 crc kubenswrapper[4903]: > execCommand=["/usr/share/ovn/scripts/ovn-ctl","stop_controller"] containerName="ovn-controller" pod="openstack/ovn-controller-wdtrn" message=< Mar 20 08:49:42 crc kubenswrapper[4903]: Exiting ovn-controller (1) [FAILED] Mar 20 08:49:42 crc kubenswrapper[4903]: Killing ovn-controller (1) [ OK ] Mar 20 08:49:42 crc kubenswrapper[4903]: 2026-03-20T08:49:40Z|00001|fatal_signal|WARN|terminating with signal 14 (Alarm clock) Mar 20 08:49:42 crc kubenswrapper[4903]: /etc/init.d/functions: line 589: 470 Alarm clock "$@" Mar 20 08:49:42 crc kubenswrapper[4903]: > Mar 20 08:49:42 crc kubenswrapper[4903]: E0320 08:49:42.184690 4903 kuberuntime_container.go:691] "PreStop hook failed" err=< Mar 20 08:49:42 crc kubenswrapper[4903]: command '/usr/share/ovn/scripts/ovn-ctl stop_controller' exited with 137: 2026-03-20T08:49:40Z|00001|fatal_signal|WARN|terminating with signal 14 (Alarm clock) Mar 20 08:49:42 crc kubenswrapper[4903]: /etc/init.d/functions: line 589: 470 Alarm clock "$@" Mar 20 08:49:42 crc kubenswrapper[4903]: > pod="openstack/ovn-controller-wdtrn" podUID="7bbbd0a7-f915-4197-bde8-4f96590c454f" containerName="ovn-controller" containerID="cri-o://2c58468caf97e64984d8d14d363823a209ce66a3d49d8180701d12dee64c3d5f" Mar 20 08:49:42 crc kubenswrapper[4903]: I0320 08:49:42.184721 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-wdtrn" podUID="7bbbd0a7-f915-4197-bde8-4f96590c454f" containerName="ovn-controller" containerID="cri-o://2c58468caf97e64984d8d14d363823a209ce66a3d49d8180701d12dee64c3d5f" gracePeriod=27 Mar 20 08:49:42 crc kubenswrapper[4903]: I0320 08:49:42.193720 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 20 08:49:42 crc kubenswrapper[4903]: I0320 08:49:42.225524 4903 scope.go:117] "RemoveContainer" containerID="fafa3d8b5242ec379ddf7f9373ebb300af0123e85fab400fdab43665a984905a" Mar 20 08:49:42 crc kubenswrapper[4903]: E0320 08:49:42.226805 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fafa3d8b5242ec379ddf7f9373ebb300af0123e85fab400fdab43665a984905a\": container with ID starting with fafa3d8b5242ec379ddf7f9373ebb300af0123e85fab400fdab43665a984905a not found: ID does not exist" containerID="fafa3d8b5242ec379ddf7f9373ebb300af0123e85fab400fdab43665a984905a" Mar 20 08:49:42 crc kubenswrapper[4903]: I0320 08:49:42.226861 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fafa3d8b5242ec379ddf7f9373ebb300af0123e85fab400fdab43665a984905a"} err="failed to get container status \"fafa3d8b5242ec379ddf7f9373ebb300af0123e85fab400fdab43665a984905a\": rpc error: code = NotFound desc = could not find 
container \"fafa3d8b5242ec379ddf7f9373ebb300af0123e85fab400fdab43665a984905a\": container with ID starting with fafa3d8b5242ec379ddf7f9373ebb300af0123e85fab400fdab43665a984905a not found: ID does not exist" Mar 20 08:49:42 crc kubenswrapper[4903]: I0320 08:49:42.226890 4903 scope.go:117] "RemoveContainer" containerID="35a21656eef5a0292c24fe38368da2669464d638035c6919ad8e6d647c9b4b2f" Mar 20 08:49:42 crc kubenswrapper[4903]: E0320 08:49:42.227316 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"35a21656eef5a0292c24fe38368da2669464d638035c6919ad8e6d647c9b4b2f\": container with ID starting with 35a21656eef5a0292c24fe38368da2669464d638035c6919ad8e6d647c9b4b2f not found: ID does not exist" containerID="35a21656eef5a0292c24fe38368da2669464d638035c6919ad8e6d647c9b4b2f" Mar 20 08:49:42 crc kubenswrapper[4903]: I0320 08:49:42.227373 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"35a21656eef5a0292c24fe38368da2669464d638035c6919ad8e6d647c9b4b2f"} err="failed to get container status \"35a21656eef5a0292c24fe38368da2669464d638035c6919ad8e6d647c9b4b2f\": rpc error: code = NotFound desc = could not find container \"35a21656eef5a0292c24fe38368da2669464d638035c6919ad8e6d647c9b4b2f\": container with ID starting with 35a21656eef5a0292c24fe38368da2669464d638035c6919ad8e6d647c9b4b2f not found: ID does not exist" Mar 20 08:49:42 crc kubenswrapper[4903]: I0320 08:49:42.227396 4903 scope.go:117] "RemoveContainer" containerID="386de03bb82a797be770d2b2659f1cc5692f78f475e3093729522f4425bfcab2" Mar 20 08:49:42 crc kubenswrapper[4903]: I0320 08:49:42.336158 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-56bn4"] Mar 20 08:49:42 crc kubenswrapper[4903]: E0320 08:49:42.337546 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff49346f-602e-46f6-91c7-9c1966535720" containerName="barbican-worker" Mar 20 08:49:42 crc kubenswrapper[4903]: I0320 08:49:42.337569 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff49346f-602e-46f6-91c7-9c1966535720" containerName="barbican-worker" Mar 20 08:49:42 crc kubenswrapper[4903]: E0320 08:49:42.337583 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b286f9de-1973-4c7f-9350-4d3c31f9c1fb" containerName="dnsmasq-dns" Mar 20 08:49:42 crc kubenswrapper[4903]: I0320 08:49:42.337590 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="b286f9de-1973-4c7f-9350-4d3c31f9c1fb" containerName="dnsmasq-dns" Mar 20 08:49:42 crc kubenswrapper[4903]: E0320 08:49:42.337603 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8639b665-721c-4dda-afe9-6e84f6f8a574" containerName="ovsdbserver-sb" Mar 20 08:49:42 crc kubenswrapper[4903]: I0320 08:49:42.337610 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="8639b665-721c-4dda-afe9-6e84f6f8a574" containerName="ovsdbserver-sb" Mar 20 08:49:42 crc kubenswrapper[4903]: E0320 08:49:42.337620 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e960802-5c0e-4800-853f-e23466958aec" containerName="proxy-server" Mar 20 08:49:42 crc kubenswrapper[4903]: I0320 08:49:42.337627 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e960802-5c0e-4800-853f-e23466958aec" containerName="proxy-server" Mar 20 08:49:42 crc kubenswrapper[4903]: E0320 08:49:42.337638 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e4027bc-3929-4b8b-9538-ab67f779558c" 
containerName="galera" Mar 20 08:49:42 crc kubenswrapper[4903]: I0320 08:49:42.337648 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e4027bc-3929-4b8b-9538-ab67f779558c" containerName="galera" Mar 20 08:49:42 crc kubenswrapper[4903]: E0320 08:49:42.337660 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e960802-5c0e-4800-853f-e23466958aec" containerName="proxy-httpd" Mar 20 08:49:42 crc kubenswrapper[4903]: I0320 08:49:42.337668 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e960802-5c0e-4800-853f-e23466958aec" containerName="proxy-httpd" Mar 20 08:49:42 crc kubenswrapper[4903]: E0320 08:49:42.337680 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f390d60f-9967-4869-b09f-3cea4570186e" containerName="openstack-network-exporter" Mar 20 08:49:42 crc kubenswrapper[4903]: I0320 08:49:42.337688 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="f390d60f-9967-4869-b09f-3cea4570186e" containerName="openstack-network-exporter" Mar 20 08:49:42 crc kubenswrapper[4903]: E0320 08:49:42.337698 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3cdd4833-7200-46c0-9bb4-1b18c7828044" containerName="nova-cell1-novncproxy-novncproxy" Mar 20 08:49:42 crc kubenswrapper[4903]: I0320 08:49:42.337706 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="3cdd4833-7200-46c0-9bb4-1b18c7828044" containerName="nova-cell1-novncproxy-novncproxy" Mar 20 08:49:42 crc kubenswrapper[4903]: E0320 08:49:42.337715 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="548096cf-0b33-4f2f-b8be-7d1ac859cf7c" containerName="ovsdbserver-nb" Mar 20 08:49:42 crc kubenswrapper[4903]: I0320 08:49:42.337722 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="548096cf-0b33-4f2f-b8be-7d1ac859cf7c" containerName="ovsdbserver-nb" Mar 20 08:49:42 crc kubenswrapper[4903]: E0320 08:49:42.337740 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="548096cf-0b33-4f2f-b8be-7d1ac859cf7c" containerName="openstack-network-exporter" Mar 20 08:49:42 crc kubenswrapper[4903]: I0320 08:49:42.337746 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="548096cf-0b33-4f2f-b8be-7d1ac859cf7c" containerName="openstack-network-exporter" Mar 20 08:49:42 crc kubenswrapper[4903]: E0320 08:49:42.337756 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8639b665-721c-4dda-afe9-6e84f6f8a574" containerName="openstack-network-exporter" Mar 20 08:49:42 crc kubenswrapper[4903]: I0320 08:49:42.337763 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="8639b665-721c-4dda-afe9-6e84f6f8a574" containerName="openstack-network-exporter" Mar 20 08:49:42 crc kubenswrapper[4903]: E0320 08:49:42.337777 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e4027bc-3929-4b8b-9538-ab67f779558c" containerName="mysql-bootstrap" Mar 20 08:49:42 crc kubenswrapper[4903]: I0320 08:49:42.337783 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e4027bc-3929-4b8b-9538-ab67f779558c" containerName="mysql-bootstrap" Mar 20 08:49:42 crc kubenswrapper[4903]: E0320 08:49:42.337878 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff49346f-602e-46f6-91c7-9c1966535720" containerName="barbican-worker-log" Mar 20 08:49:42 crc kubenswrapper[4903]: I0320 08:49:42.337889 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff49346f-602e-46f6-91c7-9c1966535720" containerName="barbican-worker-log" Mar 20 08:49:42 crc kubenswrapper[4903]: E0320 08:49:42.337907 4903 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b286f9de-1973-4c7f-9350-4d3c31f9c1fb" containerName="init" Mar 20 08:49:42 crc kubenswrapper[4903]: I0320 08:49:42.337915 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="b286f9de-1973-4c7f-9350-4d3c31f9c1fb" containerName="init" Mar 20 08:49:42 crc kubenswrapper[4903]: I0320 08:49:42.338131 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="548096cf-0b33-4f2f-b8be-7d1ac859cf7c" containerName="ovsdbserver-nb" Mar 20 08:49:42 crc kubenswrapper[4903]: I0320 08:49:42.338155 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="8639b665-721c-4dda-afe9-6e84f6f8a574" containerName="openstack-network-exporter" Mar 20 08:49:42 crc kubenswrapper[4903]: I0320 08:49:42.338167 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff49346f-602e-46f6-91c7-9c1966535720" containerName="barbican-worker" Mar 20 08:49:42 crc kubenswrapper[4903]: I0320 08:49:42.338180 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff49346f-602e-46f6-91c7-9c1966535720" containerName="barbican-worker-log" Mar 20 08:49:42 crc kubenswrapper[4903]: I0320 08:49:42.338195 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="f390d60f-9967-4869-b09f-3cea4570186e" containerName="openstack-network-exporter" Mar 20 08:49:42 crc kubenswrapper[4903]: I0320 08:49:42.338208 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="3cdd4833-7200-46c0-9bb4-1b18c7828044" containerName="nova-cell1-novncproxy-novncproxy" Mar 20 08:49:42 crc kubenswrapper[4903]: I0320 08:49:42.338237 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="b286f9de-1973-4c7f-9350-4d3c31f9c1fb" containerName="dnsmasq-dns" Mar 20 08:49:42 crc kubenswrapper[4903]: I0320 08:49:42.338246 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e960802-5c0e-4800-853f-e23466958aec" containerName="proxy-httpd" Mar 20 08:49:42 crc kubenswrapper[4903]: I0320 08:49:42.338259 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e4027bc-3929-4b8b-9538-ab67f779558c" containerName="galera" Mar 20 08:49:42 crc kubenswrapper[4903]: I0320 08:49:42.338272 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="8639b665-721c-4dda-afe9-6e84f6f8a574" containerName="ovsdbserver-sb" Mar 20 08:49:42 crc kubenswrapper[4903]: I0320 08:49:42.338279 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e960802-5c0e-4800-853f-e23466958aec" containerName="proxy-server" Mar 20 08:49:42 crc kubenswrapper[4903]: I0320 08:49:42.338288 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="548096cf-0b33-4f2f-b8be-7d1ac859cf7c" containerName="openstack-network-exporter" Mar 20 08:49:42 crc kubenswrapper[4903]: I0320 08:49:42.339090 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-56bn4" Mar 20 08:49:42 crc kubenswrapper[4903]: I0320 08:49:42.341460 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Mar 20 08:49:42 crc kubenswrapper[4903]: I0320 08:49:42.346873 4903 scope.go:117] "RemoveContainer" containerID="87c85c61161fb7331d2c712fe1b022235b32f8c23a7603b13ac75a55bd762f6b" Mar 20 08:49:42 crc kubenswrapper[4903]: I0320 08:49:42.351419 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-56bn4"] Mar 20 08:49:42 crc kubenswrapper[4903]: I0320 08:49:42.370106 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 20 08:49:42 crc kubenswrapper[4903]: I0320 08:49:42.382476 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 20 08:49:42 crc kubenswrapper[4903]: I0320 08:49:42.382648 4903 scope.go:117] "RemoveContainer" containerID="386de03bb82a797be770d2b2659f1cc5692f78f475e3093729522f4425bfcab2" Mar 20 08:49:42 crc kubenswrapper[4903]: E0320 08:49:42.383165 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"386de03bb82a797be770d2b2659f1cc5692f78f475e3093729522f4425bfcab2\": container with ID starting with 386de03bb82a797be770d2b2659f1cc5692f78f475e3093729522f4425bfcab2 not found: ID does not exist" containerID="386de03bb82a797be770d2b2659f1cc5692f78f475e3093729522f4425bfcab2" Mar 20 08:49:42 crc kubenswrapper[4903]: I0320 08:49:42.383198 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"386de03bb82a797be770d2b2659f1cc5692f78f475e3093729522f4425bfcab2"} err="failed to get container status \"386de03bb82a797be770d2b2659f1cc5692f78f475e3093729522f4425bfcab2\": rpc error: code = NotFound desc = could not find container \"386de03bb82a797be770d2b2659f1cc5692f78f475e3093729522f4425bfcab2\": container with ID starting with 386de03bb82a797be770d2b2659f1cc5692f78f475e3093729522f4425bfcab2 not found: ID does not exist" Mar 20 08:49:42 crc kubenswrapper[4903]: I0320 08:49:42.383220 4903 scope.go:117] "RemoveContainer" containerID="87c85c61161fb7331d2c712fe1b022235b32f8c23a7603b13ac75a55bd762f6b" Mar 20 08:49:42 crc kubenswrapper[4903]: E0320 08:49:42.383481 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"87c85c61161fb7331d2c712fe1b022235b32f8c23a7603b13ac75a55bd762f6b\": container with ID starting with 87c85c61161fb7331d2c712fe1b022235b32f8c23a7603b13ac75a55bd762f6b not found: ID does not exist" containerID="87c85c61161fb7331d2c712fe1b022235b32f8c23a7603b13ac75a55bd762f6b" Mar 20 08:49:42 crc kubenswrapper[4903]: I0320 08:49:42.383504 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"87c85c61161fb7331d2c712fe1b022235b32f8c23a7603b13ac75a55bd762f6b"} err="failed to get container status \"87c85c61161fb7331d2c712fe1b022235b32f8c23a7603b13ac75a55bd762f6b\": rpc error: code = NotFound desc = could not find container \"87c85c61161fb7331d2c712fe1b022235b32f8c23a7603b13ac75a55bd762f6b\": container with ID starting with 87c85c61161fb7331d2c712fe1b022235b32f8c23a7603b13ac75a55bd762f6b not found: ID does not exist" Mar 20 08:49:42 crc kubenswrapper[4903]: I0320 08:49:42.383516 4903 scope.go:117] "RemoveContainer" 
containerID="2ef5305e3831d627a2b55f9057395c70bf2be1d808f10700fa8c205c332a281a" Mar 20 08:49:42 crc kubenswrapper[4903]: I0320 08:49:42.446957 4903 scope.go:117] "RemoveContainer" containerID="55fee6c504c86ad76cd2158cdbd3b9a64409472514793f469bb47804fa917188" Mar 20 08:49:42 crc kubenswrapper[4903]: I0320 08:49:42.477280 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e960802-5c0e-4800-853f-e23466958aec-combined-ca-bundle\") pod \"5e960802-5c0e-4800-853f-e23466958aec\" (UID: \"5e960802-5c0e-4800-853f-e23466958aec\") " Mar 20 08:49:42 crc kubenswrapper[4903]: I0320 08:49:42.477835 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/91bf045e-8b49-48df-b43e-9040bb6b2ca5-operator-scripts\") pod \"root-account-create-update-56bn4\" (UID: \"91bf045e-8b49-48df-b43e-9040bb6b2ca5\") " pod="openstack/root-account-create-update-56bn4" Mar 20 08:49:42 crc kubenswrapper[4903]: I0320 08:49:42.477921 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9t2th\" (UniqueName: \"kubernetes.io/projected/91bf045e-8b49-48df-b43e-9040bb6b2ca5-kube-api-access-9t2th\") pod \"root-account-create-update-56bn4\" (UID: \"91bf045e-8b49-48df-b43e-9040bb6b2ca5\") " pod="openstack/root-account-create-update-56bn4" Mar 20 08:49:42 crc kubenswrapper[4903]: I0320 08:49:42.484094 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e960802-5c0e-4800-853f-e23466958aec-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5e960802-5c0e-4800-853f-e23466958aec" (UID: "5e960802-5c0e-4800-853f-e23466958aec"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:49:42 crc kubenswrapper[4903]: I0320 08:49:42.487206 4903 scope.go:117] "RemoveContainer" containerID="2ef5305e3831d627a2b55f9057395c70bf2be1d808f10700fa8c205c332a281a" Mar 20 08:49:42 crc kubenswrapper[4903]: E0320 08:49:42.487757 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2ef5305e3831d627a2b55f9057395c70bf2be1d808f10700fa8c205c332a281a\": container with ID starting with 2ef5305e3831d627a2b55f9057395c70bf2be1d808f10700fa8c205c332a281a not found: ID does not exist" containerID="2ef5305e3831d627a2b55f9057395c70bf2be1d808f10700fa8c205c332a281a" Mar 20 08:49:42 crc kubenswrapper[4903]: I0320 08:49:42.487785 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ef5305e3831d627a2b55f9057395c70bf2be1d808f10700fa8c205c332a281a"} err="failed to get container status \"2ef5305e3831d627a2b55f9057395c70bf2be1d808f10700fa8c205c332a281a\": rpc error: code = NotFound desc = could not find container \"2ef5305e3831d627a2b55f9057395c70bf2be1d808f10700fa8c205c332a281a\": container with ID starting with 2ef5305e3831d627a2b55f9057395c70bf2be1d808f10700fa8c205c332a281a not found: ID does not exist" Mar 20 08:49:42 crc kubenswrapper[4903]: I0320 08:49:42.487807 4903 scope.go:117] "RemoveContainer" containerID="55fee6c504c86ad76cd2158cdbd3b9a64409472514793f469bb47804fa917188" Mar 20 08:49:42 crc kubenswrapper[4903]: E0320 08:49:42.488052 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"55fee6c504c86ad76cd2158cdbd3b9a64409472514793f469bb47804fa917188\": container with ID starting with 55fee6c504c86ad76cd2158cdbd3b9a64409472514793f469bb47804fa917188 not found: ID does not exist" containerID="55fee6c504c86ad76cd2158cdbd3b9a64409472514793f469bb47804fa917188" Mar 20 08:49:42 crc kubenswrapper[4903]: I0320 08:49:42.488086 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55fee6c504c86ad76cd2158cdbd3b9a64409472514793f469bb47804fa917188"} err="failed to get container status \"55fee6c504c86ad76cd2158cdbd3b9a64409472514793f469bb47804fa917188\": rpc error: code = NotFound desc = could not find container \"55fee6c504c86ad76cd2158cdbd3b9a64409472514793f469bb47804fa917188\": container with ID starting with 55fee6c504c86ad76cd2158cdbd3b9a64409472514793f469bb47804fa917188 not found: ID does not exist" Mar 20 08:49:42 crc kubenswrapper[4903]: I0320 08:49:42.488100 4903 scope.go:117] "RemoveContainer" containerID="9e4b7aea89c42ee0018766efdc81e4728d5aebd15f02ee33fd42c1b099309077" Mar 20 08:49:42 crc kubenswrapper[4903]: I0320 08:49:42.523789 4903 scope.go:117] "RemoveContainer" containerID="9e4b7aea89c42ee0018766efdc81e4728d5aebd15f02ee33fd42c1b099309077" Mar 20 08:49:42 crc kubenswrapper[4903]: E0320 08:49:42.524434 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9e4b7aea89c42ee0018766efdc81e4728d5aebd15f02ee33fd42c1b099309077\": container with ID starting with 9e4b7aea89c42ee0018766efdc81e4728d5aebd15f02ee33fd42c1b099309077 not found: ID does not exist" containerID="9e4b7aea89c42ee0018766efdc81e4728d5aebd15f02ee33fd42c1b099309077" Mar 20 08:49:42 crc kubenswrapper[4903]: I0320 08:49:42.524500 4903 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"9e4b7aea89c42ee0018766efdc81e4728d5aebd15f02ee33fd42c1b099309077"} err="failed to get container status \"9e4b7aea89c42ee0018766efdc81e4728d5aebd15f02ee33fd42c1b099309077\": rpc error: code = NotFound desc = could not find container \"9e4b7aea89c42ee0018766efdc81e4728d5aebd15f02ee33fd42c1b099309077\": container with ID starting with 9e4b7aea89c42ee0018766efdc81e4728d5aebd15f02ee33fd42c1b099309077 not found: ID does not exist" Mar 20 08:49:42 crc kubenswrapper[4903]: I0320 08:49:42.553529 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-25f9-account-create-update-9n49h" Mar 20 08:49:42 crc kubenswrapper[4903]: I0320 08:49:42.579766 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9t2th\" (UniqueName: \"kubernetes.io/projected/91bf045e-8b49-48df-b43e-9040bb6b2ca5-kube-api-access-9t2th\") pod \"root-account-create-update-56bn4\" (UID: \"91bf045e-8b49-48df-b43e-9040bb6b2ca5\") " pod="openstack/root-account-create-update-56bn4" Mar 20 08:49:42 crc kubenswrapper[4903]: I0320 08:49:42.579976 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/91bf045e-8b49-48df-b43e-9040bb6b2ca5-operator-scripts\") pod \"root-account-create-update-56bn4\" (UID: \"91bf045e-8b49-48df-b43e-9040bb6b2ca5\") " pod="openstack/root-account-create-update-56bn4" Mar 20 08:49:42 crc kubenswrapper[4903]: I0320 08:49:42.580170 4903 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e960802-5c0e-4800-853f-e23466958aec-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:42 crc kubenswrapper[4903]: E0320 08:49:42.580618 4903 configmap.go:193] Couldn't get configMap openstack/openstack-cell1-scripts: configmap "openstack-cell1-scripts" not found Mar 20 08:49:42 crc kubenswrapper[4903]: E0320 08:49:42.580720 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f8c76743-cd0d-48d8-940e-a5e750bd1fcc-operator-scripts podName:f8c76743-cd0d-48d8-940e-a5e750bd1fcc nodeName:}" failed. No retries permitted until 2026-03-20 08:49:44.580698309 +0000 UTC m=+1609.797598634 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/f8c76743-cd0d-48d8-940e-a5e750bd1fcc-operator-scripts") pod "root-account-create-update-87nxt" (UID: "f8c76743-cd0d-48d8-940e-a5e750bd1fcc") : configmap "openstack-cell1-scripts" not found Mar 20 08:49:42 crc kubenswrapper[4903]: I0320 08:49:42.584194 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/91bf045e-8b49-48df-b43e-9040bb6b2ca5-operator-scripts\") pod \"root-account-create-update-56bn4\" (UID: \"91bf045e-8b49-48df-b43e-9040bb6b2ca5\") " pod="openstack/root-account-create-update-56bn4" Mar 20 08:49:42 crc kubenswrapper[4903]: I0320 08:49:42.599925 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9t2th\" (UniqueName: \"kubernetes.io/projected/91bf045e-8b49-48df-b43e-9040bb6b2ca5-kube-api-access-9t2th\") pod \"root-account-create-update-56bn4\" (UID: \"91bf045e-8b49-48df-b43e-9040bb6b2ca5\") " pod="openstack/root-account-create-update-56bn4" Mar 20 08:49:42 crc kubenswrapper[4903]: I0320 08:49:42.632753 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-87nxt" Mar 20 08:49:42 crc kubenswrapper[4903]: I0320 08:49:42.677068 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-56bn4" Mar 20 08:49:42 crc kubenswrapper[4903]: I0320 08:49:42.686426 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2dc6a13a-9844-4e28-93e6-45025f1385a9-operator-scripts\") pod \"2dc6a13a-9844-4e28-93e6-45025f1385a9\" (UID: \"2dc6a13a-9844-4e28-93e6-45025f1385a9\") " Mar 20 08:49:42 crc kubenswrapper[4903]: I0320 08:49:42.686656 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f8c76743-cd0d-48d8-940e-a5e750bd1fcc-operator-scripts\") pod \"f8c76743-cd0d-48d8-940e-a5e750bd1fcc\" (UID: \"f8c76743-cd0d-48d8-940e-a5e750bd1fcc\") " Mar 20 08:49:42 crc kubenswrapper[4903]: I0320 08:49:42.687525 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f8c76743-cd0d-48d8-940e-a5e750bd1fcc-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f8c76743-cd0d-48d8-940e-a5e750bd1fcc" (UID: "f8c76743-cd0d-48d8-940e-a5e750bd1fcc"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:49:42 crc kubenswrapper[4903]: I0320 08:49:42.687538 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2dc6a13a-9844-4e28-93e6-45025f1385a9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2dc6a13a-9844-4e28-93e6-45025f1385a9" (UID: "2dc6a13a-9844-4e28-93e6-45025f1385a9"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:49:42 crc kubenswrapper[4903]: I0320 08:49:42.691796 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-75hxd\" (UniqueName: \"kubernetes.io/projected/2dc6a13a-9844-4e28-93e6-45025f1385a9-kube-api-access-75hxd\") pod \"2dc6a13a-9844-4e28-93e6-45025f1385a9\" (UID: \"2dc6a13a-9844-4e28-93e6-45025f1385a9\") " Mar 20 08:49:42 crc kubenswrapper[4903]: I0320 08:49:42.691866 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6rxm7\" (UniqueName: \"kubernetes.io/projected/f8c76743-cd0d-48d8-940e-a5e750bd1fcc-kube-api-access-6rxm7\") pod \"f8c76743-cd0d-48d8-940e-a5e750bd1fcc\" (UID: \"f8c76743-cd0d-48d8-940e-a5e750bd1fcc\") " Mar 20 08:49:42 crc kubenswrapper[4903]: I0320 08:49:42.696106 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-proxy-69dc7db475-m968g"] Mar 20 08:49:42 crc kubenswrapper[4903]: I0320 08:49:42.701025 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f8c76743-cd0d-48d8-940e-a5e750bd1fcc-kube-api-access-6rxm7" (OuterVolumeSpecName: "kube-api-access-6rxm7") pod "f8c76743-cd0d-48d8-940e-a5e750bd1fcc" (UID: "f8c76743-cd0d-48d8-940e-a5e750bd1fcc"). InnerVolumeSpecName "kube-api-access-6rxm7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:49:42 crc kubenswrapper[4903]: I0320 08:49:42.701773 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2dc6a13a-9844-4e28-93e6-45025f1385a9-kube-api-access-75hxd" (OuterVolumeSpecName: "kube-api-access-75hxd") pod "2dc6a13a-9844-4e28-93e6-45025f1385a9" (UID: "2dc6a13a-9844-4e28-93e6-45025f1385a9"). InnerVolumeSpecName "kube-api-access-75hxd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:49:42 crc kubenswrapper[4903]: I0320 08:49:42.693138 4903 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f8c76743-cd0d-48d8-940e-a5e750bd1fcc-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:42 crc kubenswrapper[4903]: I0320 08:49:42.706640 4903 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2dc6a13a-9844-4e28-93e6-45025f1385a9-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:42 crc kubenswrapper[4903]: I0320 08:49:42.718234 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-proxy-69dc7db475-m968g"] Mar 20 08:49:42 crc kubenswrapper[4903]: I0320 08:49:42.771945 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-wdtrn_7bbbd0a7-f915-4197-bde8-4f96590c454f/ovn-controller/0.log" Mar 20 08:49:42 crc kubenswrapper[4903]: I0320 08:49:42.772475 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-wdtrn" Mar 20 08:49:42 crc kubenswrapper[4903]: I0320 08:49:42.815797 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-75hxd\" (UniqueName: \"kubernetes.io/projected/2dc6a13a-9844-4e28-93e6-45025f1385a9-kube-api-access-75hxd\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:42 crc kubenswrapper[4903]: I0320 08:49:42.815844 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6rxm7\" (UniqueName: \"kubernetes.io/projected/f8c76743-cd0d-48d8-940e-a5e750bd1fcc-kube-api-access-6rxm7\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:42 crc kubenswrapper[4903]: I0320 08:49:42.916681 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/7bbbd0a7-f915-4197-bde8-4f96590c454f-ovn-controller-tls-certs\") pod \"7bbbd0a7-f915-4197-bde8-4f96590c454f\" (UID: \"7bbbd0a7-f915-4197-bde8-4f96590c454f\") " Mar 20 08:49:42 crc kubenswrapper[4903]: I0320 08:49:42.916759 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/7bbbd0a7-f915-4197-bde8-4f96590c454f-var-log-ovn\") pod \"7bbbd0a7-f915-4197-bde8-4f96590c454f\" (UID: \"7bbbd0a7-f915-4197-bde8-4f96590c454f\") " Mar 20 08:49:42 crc kubenswrapper[4903]: I0320 08:49:42.916781 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/7bbbd0a7-f915-4197-bde8-4f96590c454f-var-run-ovn\") pod \"7bbbd0a7-f915-4197-bde8-4f96590c454f\" (UID: \"7bbbd0a7-f915-4197-bde8-4f96590c454f\") " Mar 20 08:49:42 crc kubenswrapper[4903]: I0320 08:49:42.916949 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7bbbd0a7-f915-4197-bde8-4f96590c454f-combined-ca-bundle\") pod 
\"7bbbd0a7-f915-4197-bde8-4f96590c454f\" (UID: \"7bbbd0a7-f915-4197-bde8-4f96590c454f\") " Mar 20 08:49:42 crc kubenswrapper[4903]: I0320 08:49:42.916985 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7bbbd0a7-f915-4197-bde8-4f96590c454f-scripts\") pod \"7bbbd0a7-f915-4197-bde8-4f96590c454f\" (UID: \"7bbbd0a7-f915-4197-bde8-4f96590c454f\") " Mar 20 08:49:42 crc kubenswrapper[4903]: I0320 08:49:42.917022 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/7bbbd0a7-f915-4197-bde8-4f96590c454f-var-run\") pod \"7bbbd0a7-f915-4197-bde8-4f96590c454f\" (UID: \"7bbbd0a7-f915-4197-bde8-4f96590c454f\") " Mar 20 08:49:42 crc kubenswrapper[4903]: I0320 08:49:42.917060 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nv6r7\" (UniqueName: \"kubernetes.io/projected/7bbbd0a7-f915-4197-bde8-4f96590c454f-kube-api-access-nv6r7\") pod \"7bbbd0a7-f915-4197-bde8-4f96590c454f\" (UID: \"7bbbd0a7-f915-4197-bde8-4f96590c454f\") " Mar 20 08:49:42 crc kubenswrapper[4903]: I0320 08:49:42.917373 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7bbbd0a7-f915-4197-bde8-4f96590c454f-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "7bbbd0a7-f915-4197-bde8-4f96590c454f" (UID: "7bbbd0a7-f915-4197-bde8-4f96590c454f"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 08:49:42 crc kubenswrapper[4903]: E0320 08:49:42.917514 4903 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Mar 20 08:49:42 crc kubenswrapper[4903]: E0320 08:49:42.917569 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/888a3fd9-01f8-47b3-b1bb-f2b8b6b96509-config-data podName:888a3fd9-01f8-47b3-b1bb-f2b8b6b96509 nodeName:}" failed. No retries permitted until 2026-03-20 08:49:46.917553016 +0000 UTC m=+1612.134453331 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/888a3fd9-01f8-47b3-b1bb-f2b8b6b96509-config-data") pod "rabbitmq-server-0" (UID: "888a3fd9-01f8-47b3-b1bb-f2b8b6b96509") : configmap "rabbitmq-config-data" not found Mar 20 08:49:42 crc kubenswrapper[4903]: I0320 08:49:42.918513 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7bbbd0a7-f915-4197-bde8-4f96590c454f-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "7bbbd0a7-f915-4197-bde8-4f96590c454f" (UID: "7bbbd0a7-f915-4197-bde8-4f96590c454f"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 08:49:42 crc kubenswrapper[4903]: I0320 08:49:42.919691 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bbbd0a7-f915-4197-bde8-4f96590c454f-scripts" (OuterVolumeSpecName: "scripts") pod "7bbbd0a7-f915-4197-bde8-4f96590c454f" (UID: "7bbbd0a7-f915-4197-bde8-4f96590c454f"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:49:42 crc kubenswrapper[4903]: I0320 08:49:42.923712 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7bbbd0a7-f915-4197-bde8-4f96590c454f-var-run" (OuterVolumeSpecName: "var-run") pod "7bbbd0a7-f915-4197-bde8-4f96590c454f" (UID: "7bbbd0a7-f915-4197-bde8-4f96590c454f"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 08:49:42 crc kubenswrapper[4903]: I0320 08:49:42.925261 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bbbd0a7-f915-4197-bde8-4f96590c454f-kube-api-access-nv6r7" (OuterVolumeSpecName: "kube-api-access-nv6r7") pod "7bbbd0a7-f915-4197-bde8-4f96590c454f" (UID: "7bbbd0a7-f915-4197-bde8-4f96590c454f"). InnerVolumeSpecName "kube-api-access-nv6r7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:49:42 crc kubenswrapper[4903]: I0320 08:49:42.959290 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7bbbd0a7-f915-4197-bde8-4f96590c454f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7bbbd0a7-f915-4197-bde8-4f96590c454f" (UID: "7bbbd0a7-f915-4197-bde8-4f96590c454f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:49:43 crc kubenswrapper[4903]: I0320 08:49:43.021739 4903 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/7bbbd0a7-f915-4197-bde8-4f96590c454f-var-log-ovn\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:43 crc kubenswrapper[4903]: I0320 08:49:43.022423 4903 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/7bbbd0a7-f915-4197-bde8-4f96590c454f-var-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:43 crc kubenswrapper[4903]: I0320 08:49:43.022496 4903 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7bbbd0a7-f915-4197-bde8-4f96590c454f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:43 crc kubenswrapper[4903]: I0320 08:49:43.022575 4903 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7bbbd0a7-f915-4197-bde8-4f96590c454f-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:43 crc kubenswrapper[4903]: I0320 08:49:43.022651 4903 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/7bbbd0a7-f915-4197-bde8-4f96590c454f-var-run\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:43 crc kubenswrapper[4903]: I0320 08:49:43.022749 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nv6r7\" (UniqueName: \"kubernetes.io/projected/7bbbd0a7-f915-4197-bde8-4f96590c454f-kube-api-access-nv6r7\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:43 crc kubenswrapper[4903]: I0320 08:49:43.030417 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7bbbd0a7-f915-4197-bde8-4f96590c454f-ovn-controller-tls-certs" (OuterVolumeSpecName: "ovn-controller-tls-certs") pod "7bbbd0a7-f915-4197-bde8-4f96590c454f" (UID: "7bbbd0a7-f915-4197-bde8-4f96590c454f"). InnerVolumeSpecName "ovn-controller-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:49:43 crc kubenswrapper[4903]: I0320 08:49:43.082397 4903 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/placement-669bcbb856-w87fq" podUID="1dcd96a1-71bb-480c-8387-0fca4d17bf33" containerName="placement-log" probeResult="failure" output="Get \"https://10.217.0.172:8778/\": read tcp 10.217.0.2:58858->10.217.0.172:8778: read: connection reset by peer" Mar 20 08:49:43 crc kubenswrapper[4903]: I0320 08:49:43.083155 4903 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/placement-669bcbb856-w87fq" podUID="1dcd96a1-71bb-480c-8387-0fca4d17bf33" containerName="placement-api" probeResult="failure" output="Get \"https://10.217.0.172:8778/\": read tcp 10.217.0.2:58856->10.217.0.172:8778: read: connection reset by peer" Mar 20 08:49:43 crc kubenswrapper[4903]: I0320 08:49:43.088176 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-25f9-account-create-update-9n49h" Mar 20 08:49:43 crc kubenswrapper[4903]: I0320 08:49:43.088183 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-25f9-account-create-update-9n49h" event={"ID":"2dc6a13a-9844-4e28-93e6-45025f1385a9","Type":"ContainerDied","Data":"d98eba0071d357caab9e9fa49a5697d3236ce77862c464f07fff3b39743b7c7b"} Mar 20 08:49:43 crc kubenswrapper[4903]: I0320 08:49:43.098865 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2c33b2cd-e705-41cd-9e59-3dcbb0a55829","Type":"ContainerDied","Data":"036996cd6559515dd00dacbdb33836ee1417dee1983310445b9597aed681a5db"} Mar 20 08:49:43 crc kubenswrapper[4903]: I0320 08:49:43.100466 4903 generic.go:334] "Generic (PLEG): container finished" podID="2c33b2cd-e705-41cd-9e59-3dcbb0a55829" containerID="036996cd6559515dd00dacbdb33836ee1417dee1983310445b9597aed681a5db" exitCode=0 Mar 20 08:49:43 crc kubenswrapper[4903]: I0320 08:49:43.113822 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-87nxt" event={"ID":"f8c76743-cd0d-48d8-940e-a5e750bd1fcc","Type":"ContainerDied","Data":"d86832d4b747a1532b2b93a3830dc91d0c0eb7a7b3779e4b1452ef30186095dc"} Mar 20 08:49:43 crc kubenswrapper[4903]: I0320 08:49:43.113859 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-87nxt" Mar 20 08:49:43 crc kubenswrapper[4903]: I0320 08:49:43.122005 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-wdtrn_7bbbd0a7-f915-4197-bde8-4f96590c454f/ovn-controller/0.log" Mar 20 08:49:43 crc kubenswrapper[4903]: I0320 08:49:43.122928 4903 generic.go:334] "Generic (PLEG): container finished" podID="7bbbd0a7-f915-4197-bde8-4f96590c454f" containerID="2c58468caf97e64984d8d14d363823a209ce66a3d49d8180701d12dee64c3d5f" exitCode=143 Mar 20 08:49:43 crc kubenswrapper[4903]: I0320 08:49:43.125738 4903 reconciler_common.go:293] "Volume detached for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/7bbbd0a7-f915-4197-bde8-4f96590c454f-ovn-controller-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:43 crc kubenswrapper[4903]: I0320 08:49:43.127800 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-wdtrn" event={"ID":"7bbbd0a7-f915-4197-bde8-4f96590c454f","Type":"ContainerDied","Data":"2c58468caf97e64984d8d14d363823a209ce66a3d49d8180701d12dee64c3d5f"} Mar 20 08:49:43 crc kubenswrapper[4903]: I0320 08:49:43.127911 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-wdtrn" event={"ID":"7bbbd0a7-f915-4197-bde8-4f96590c454f","Type":"ContainerDied","Data":"d4340d060ecd66c00269d5a245a982ed785b2362f4a3fc754d476cc61ccbc868"} Mar 20 08:49:43 crc kubenswrapper[4903]: I0320 08:49:43.128000 4903 scope.go:117] "RemoveContainer" containerID="2c58468caf97e64984d8d14d363823a209ce66a3d49d8180701d12dee64c3d5f" Mar 20 08:49:43 crc kubenswrapper[4903]: I0320 08:49:43.128703 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-wdtrn" Mar 20 08:49:43 crc kubenswrapper[4903]: I0320 08:49:43.157573 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 08:49:43 crc kubenswrapper[4903]: I0320 08:49:43.158033 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8005d467-6a20-4e68-b62f-65ad97a31812" containerName="ceilometer-central-agent" containerID="cri-o://a8e86b614ac7f5338fb12a670fd38985e9a2c2f7e4f4a111a46686a3317b5790" gracePeriod=30 Mar 20 08:49:43 crc kubenswrapper[4903]: I0320 08:49:43.158655 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8005d467-6a20-4e68-b62f-65ad97a31812" containerName="proxy-httpd" containerID="cri-o://fe129b49b118ce4ae8579680e5a1ff682ff67dcbd9c4f54a13ce05fb9eced8cd" gracePeriod=30 Mar 20 08:49:43 crc kubenswrapper[4903]: I0320 08:49:43.158750 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8005d467-6a20-4e68-b62f-65ad97a31812" containerName="sg-core" containerID="cri-o://3eb344be279e7505b3a1ee366d11602b2711ebf94ecb9c7d600238fd0ec65c58" gracePeriod=30 Mar 20 08:49:43 crc kubenswrapper[4903]: I0320 08:49:43.158816 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8005d467-6a20-4e68-b62f-65ad97a31812" containerName="ceilometer-notification-agent" containerID="cri-o://d273844594da563a28c607f77733e2141f72daa05b9f296500d2582410c90a0f" gracePeriod=30 Mar 20 08:49:43 crc kubenswrapper[4903]: I0320 08:49:43.159011 4903 generic.go:334] "Generic (PLEG): container finished" podID="877f943b-808c-435e-a5cf-bda8ea0a5d15" 
containerID="fe164d29a102e1f393cadf6ed7a51abeefbf7ba5309bfd408ab8c2520f8f2829" exitCode=0 Mar 20 08:49:43 crc kubenswrapper[4903]: I0320 08:49:43.159122 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"877f943b-808c-435e-a5cf-bda8ea0a5d15","Type":"ContainerDied","Data":"fe164d29a102e1f393cadf6ed7a51abeefbf7ba5309bfd408ab8c2520f8f2829"} Mar 20 08:49:43 crc kubenswrapper[4903]: I0320 08:49:43.197235 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 20 08:49:43 crc kubenswrapper[4903]: I0320 08:49:43.197657 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="f8bb60e5-f963-44ed-9e5e-76ca6da5c723" containerName="kube-state-metrics" containerID="cri-o://99220c34b143816ada3efc7f99d463db4e8819e68b51ceb37233e0bf8c04a8fa" gracePeriod=30 Mar 20 08:49:43 crc kubenswrapper[4903]: I0320 08:49:43.214532 4903 generic.go:334] "Generic (PLEG): container finished" podID="7f5f160c-29e2-43d0-bb55-6969904b3a4e" containerID="625122aba923f3f4672abe1d898433060719b38b2ba824aec7bed77ddaa609eb" exitCode=0 Mar 20 08:49:43 crc kubenswrapper[4903]: I0320 08:49:43.214605 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7f5f160c-29e2-43d0-bb55-6969904b3a4e","Type":"ContainerDied","Data":"625122aba923f3f4672abe1d898433060719b38b2ba824aec7bed77ddaa609eb"} Mar 20 08:49:43 crc kubenswrapper[4903]: I0320 08:49:43.310110 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-56bn4"] Mar 20 08:49:43 crc kubenswrapper[4903]: I0320 08:49:43.325919 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/memcached-0"] Mar 20 08:49:43 crc kubenswrapper[4903]: I0320 08:49:43.326233 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/memcached-0" podUID="34de9984-0547-4ba1-ae7d-5f8cc9196c26" containerName="memcached" containerID="cri-o://b0e941c6eb837ac5cb4a6c3fdea1d2d35578b2c2d7f9749cd5b2c8e0e7710ff8" gracePeriod=30 Mar 20 08:49:43 crc kubenswrapper[4903]: I0320 08:49:43.363615 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-7068-account-create-update-65l92"] Mar 20 08:49:43 crc kubenswrapper[4903]: I0320 08:49:43.376544 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-7068-account-create-update-65l92"] Mar 20 08:49:43 crc kubenswrapper[4903]: I0320 08:49:43.400382 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-7068-account-create-update-hqkgt"] Mar 20 08:49:43 crc kubenswrapper[4903]: E0320 08:49:43.400947 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7bbbd0a7-f915-4197-bde8-4f96590c454f" containerName="ovn-controller" Mar 20 08:49:43 crc kubenswrapper[4903]: I0320 08:49:43.400963 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="7bbbd0a7-f915-4197-bde8-4f96590c454f" containerName="ovn-controller" Mar 20 08:49:43 crc kubenswrapper[4903]: I0320 08:49:43.408026 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="7bbbd0a7-f915-4197-bde8-4f96590c454f" containerName="ovn-controller" Mar 20 08:49:43 crc kubenswrapper[4903]: I0320 08:49:43.427382 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-kt4gk"] Mar 20 08:49:43 crc kubenswrapper[4903]: I0320 08:49:43.427519 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-7068-account-create-update-hqkgt" Mar 20 08:49:43 crc kubenswrapper[4903]: I0320 08:49:43.425782 4903 scope.go:117] "RemoveContainer" containerID="2c58468caf97e64984d8d14d363823a209ce66a3d49d8180701d12dee64c3d5f" Mar 20 08:49:43 crc kubenswrapper[4903]: E0320 08:49:43.462810 4903 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 08:49:43 crc kubenswrapper[4903]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb:current-podified,Command:[/bin/sh -c #!/bin/bash Mar 20 08:49:43 crc kubenswrapper[4903]: Mar 20 08:49:43 crc kubenswrapper[4903]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Mar 20 08:49:43 crc kubenswrapper[4903]: Mar 20 08:49:43 crc kubenswrapper[4903]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Mar 20 08:49:43 crc kubenswrapper[4903]: Mar 20 08:49:43 crc kubenswrapper[4903]: MYSQL_CMD="mysql -h -u root -P 3306" Mar 20 08:49:43 crc kubenswrapper[4903]: Mar 20 08:49:43 crc kubenswrapper[4903]: if [ -n "" ]; then Mar 20 08:49:43 crc kubenswrapper[4903]: GRANT_DATABASE="" Mar 20 08:49:43 crc kubenswrapper[4903]: else Mar 20 08:49:43 crc kubenswrapper[4903]: GRANT_DATABASE="*" Mar 20 08:49:43 crc kubenswrapper[4903]: fi Mar 20 08:49:43 crc kubenswrapper[4903]: Mar 20 08:49:43 crc kubenswrapper[4903]: # going for maximum compatibility here: Mar 20 08:49:43 crc kubenswrapper[4903]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Mar 20 08:49:43 crc kubenswrapper[4903]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Mar 20 08:49:43 crc kubenswrapper[4903]: # 3. create user with CREATE but then do all password and TLS with ALTER to Mar 20 08:49:43 crc kubenswrapper[4903]: # support updates Mar 20 08:49:43 crc kubenswrapper[4903]: Mar 20 08:49:43 crc kubenswrapper[4903]: $MYSQL_CMD < logger="UnhandledError" Mar 20 08:49:43 crc kubenswrapper[4903]: E0320 08:49:43.463905 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2c58468caf97e64984d8d14d363823a209ce66a3d49d8180701d12dee64c3d5f\": container with ID starting with 2c58468caf97e64984d8d14d363823a209ce66a3d49d8180701d12dee64c3d5f not found: ID does not exist" containerID="2c58468caf97e64984d8d14d363823a209ce66a3d49d8180701d12dee64c3d5f" Mar 20 08:49:43 crc kubenswrapper[4903]: I0320 08:49:43.463943 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2c58468caf97e64984d8d14d363823a209ce66a3d49d8180701d12dee64c3d5f"} err="failed to get container status \"2c58468caf97e64984d8d14d363823a209ce66a3d49d8180701d12dee64c3d5f\": rpc error: code = NotFound desc = could not find container \"2c58468caf97e64984d8d14d363823a209ce66a3d49d8180701d12dee64c3d5f\": container with ID starting with 2c58468caf97e64984d8d14d363823a209ce66a3d49d8180701d12dee64c3d5f not found: ID does not exist" Mar 20 08:49:43 crc kubenswrapper[4903]: E0320 08:49:43.463995 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"openstack-mariadb-root-db-secret\\\" not found\"" pod="openstack/root-account-create-update-56bn4" podUID="91bf045e-8b49-48df-b43e-9040bb6b2ca5" Mar 20 08:49:43 crc kubenswrapper[4903]: I0320 08:49:43.465997 4903 reflector.go:368] Caches populated for *v1.Secret from 
object-"openstack"/"keystone-db-secret" Mar 20 08:49:43 crc kubenswrapper[4903]: I0320 08:49:43.493896 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-n6bt8"] Mar 20 08:49:43 crc kubenswrapper[4903]: I0320 08:49:43.575163 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01f22901-e7ea-44d3-bc58-7efffaa493ad" path="/var/lib/kubelet/pods/01f22901-e7ea-44d3-bc58-7efffaa493ad/volumes" Mar 20 08:49:43 crc kubenswrapper[4903]: I0320 08:49:43.575849 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cdd4833-7200-46c0-9bb4-1b18c7828044" path="/var/lib/kubelet/pods/3cdd4833-7200-46c0-9bb4-1b18c7828044/volumes" Mar 20 08:49:43 crc kubenswrapper[4903]: I0320 08:49:43.576585 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="548096cf-0b33-4f2f-b8be-7d1ac859cf7c" path="/var/lib/kubelet/pods/548096cf-0b33-4f2f-b8be-7d1ac859cf7c/volumes" Mar 20 08:49:43 crc kubenswrapper[4903]: I0320 08:49:43.584189 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5e960802-5c0e-4800-853f-e23466958aec" path="/var/lib/kubelet/pods/5e960802-5c0e-4800-853f-e23466958aec/volumes" Mar 20 08:49:43 crc kubenswrapper[4903]: I0320 08:49:43.585027 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6e4027bc-3929-4b8b-9538-ab67f779558c" path="/var/lib/kubelet/pods/6e4027bc-3929-4b8b-9538-ab67f779558c/volumes" Mar 20 08:49:43 crc kubenswrapper[4903]: I0320 08:49:43.585203 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xm42s\" (UniqueName: \"kubernetes.io/projected/69bc9139-cd82-4fa2-847f-b831c080d163-kube-api-access-xm42s\") pod \"keystone-7068-account-create-update-hqkgt\" (UID: \"69bc9139-cd82-4fa2-847f-b831c080d163\") " pod="openstack/keystone-7068-account-create-update-hqkgt" Mar 20 08:49:43 crc kubenswrapper[4903]: I0320 08:49:43.585344 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/69bc9139-cd82-4fa2-847f-b831c080d163-operator-scripts\") pod \"keystone-7068-account-create-update-hqkgt\" (UID: \"69bc9139-cd82-4fa2-847f-b831c080d163\") " pod="openstack/keystone-7068-account-create-update-hqkgt" Mar 20 08:49:43 crc kubenswrapper[4903]: I0320 08:49:43.585784 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f390d60f-9967-4869-b09f-3cea4570186e" path="/var/lib/kubelet/pods/f390d60f-9967-4869-b09f-3cea4570186e/volumes" Mar 20 08:49:43 crc kubenswrapper[4903]: I0320 08:49:43.595831 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff49346f-602e-46f6-91c7-9c1966535720" path="/var/lib/kubelet/pods/ff49346f-602e-46f6-91c7-9c1966535720/volumes" Mar 20 08:49:43 crc kubenswrapper[4903]: I0320 08:49:43.597209 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-7068-account-create-update-hqkgt"] Mar 20 08:49:43 crc kubenswrapper[4903]: I0320 08:49:43.620887 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-wdtrn"] Mar 20 08:49:43 crc kubenswrapper[4903]: I0320 08:49:43.627502 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-kt4gk"] Mar 20 08:49:43 crc kubenswrapper[4903]: I0320 08:49:43.634789 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-n6bt8"] Mar 20 08:49:43 crc kubenswrapper[4903]: I0320 08:49:43.645598 4903 
prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="50b5adcb-aed8-4cff-b3ec-02721df3937d" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.0.184:8776/healthcheck\": read tcp 10.217.0.2:43714->10.217.0.184:8776: read: connection reset by peer" Mar 20 08:49:43 crc kubenswrapper[4903]: I0320 08:49:43.649404 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-wdtrn"] Mar 20 08:49:43 crc kubenswrapper[4903]: E0320 08:49:43.654540 4903 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1dcd96a1_71bb_480c_8387_0fca4d17bf33.slice/crio-10f2b77e4d99df665d6349b311dc9b4cfe076067c636391f3d7e6e34202c3750.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7f5f160c_29e2_43d0_bb55_6969904b3a4e.slice/crio-conmon-625122aba923f3f4672abe1d898433060719b38b2ba824aec7bed77ddaa609eb.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2dc6a13a_9844_4e28_93e6_45025f1385a9.slice/crio-d98eba0071d357caab9e9fa49a5697d3236ce77862c464f07fff3b39743b7c7b\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7f5f160c_29e2_43d0_bb55_6969904b3a4e.slice/crio-625122aba923f3f4672abe1d898433060719b38b2ba824aec7bed77ddaa609eb.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1dcd96a1_71bb_480c_8387_0fca4d17bf33.slice/crio-conmon-10f2b77e4d99df665d6349b311dc9b4cfe076067c636391f3d7e6e34202c3750.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf8bb60e5_f963_44ed_9e5e_76ca6da5c723.slice/crio-99220c34b143816ada3efc7f99d463db4e8819e68b51ceb37233e0bf8c04a8fa.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf8bb60e5_f963_44ed_9e5e_76ca6da5c723.slice/crio-conmon-99220c34b143816ada3efc7f99d463db4e8819e68b51ceb37233e0bf8c04a8fa.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8005d467_6a20_4e68_b62f_65ad97a31812.slice/crio-conmon-3eb344be279e7505b3a1ee366d11602b2711ebf94ecb9c7d600238fd0ec65c58.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf8c76743_cd0d_48d8_940e_a5e750bd1fcc.slice/crio-d86832d4b747a1532b2b93a3830dc91d0c0eb7a7b3779e4b1452ef30186095dc\": RecentStats: unable to find data in memory cache]" Mar 20 08:49:43 crc kubenswrapper[4903]: I0320 08:49:43.658582 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-7f784d4489-rxkmk"] Mar 20 08:49:43 crc kubenswrapper[4903]: I0320 08:49:43.658893 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/keystone-7f784d4489-rxkmk" podUID="c94a513f-1b70-4705-af6c-3f71cb0e4272" containerName="keystone-api" containerID="cri-o://9ecef6613b51007ecf1f1f800f64c787552051192488f5ff207b2c73e1a6013d" gracePeriod=30 Mar 20 08:49:43 crc kubenswrapper[4903]: I0320 08:49:43.673190 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-galera-0"] Mar 20 08:49:43 crc kubenswrapper[4903]: I0320 
08:49:43.677168 4903 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-5895dcfdfd-4gs9b" podUID="bae081ad-c434-4d1b-ae9f-cc8c5b6e2f33" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.0.164:9311/healthcheck\": read tcp 10.217.0.2:45800->10.217.0.164:9311: read: connection reset by peer" Mar 20 08:49:43 crc kubenswrapper[4903]: I0320 08:49:43.677570 4903 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-5895dcfdfd-4gs9b" podUID="bae081ad-c434-4d1b-ae9f-cc8c5b6e2f33" containerName="barbican-api" probeResult="failure" output="Get \"https://10.217.0.164:9311/healthcheck\": read tcp 10.217.0.2:45798->10.217.0.164:9311: read: connection reset by peer" Mar 20 08:49:43 crc kubenswrapper[4903]: I0320 08:49:43.677946 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-m4hwk"] Mar 20 08:49:43 crc kubenswrapper[4903]: I0320 08:49:43.690100 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-7068-account-create-update-hqkgt"] Mar 20 08:49:43 crc kubenswrapper[4903]: E0320 08:49:43.691176 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-xm42s operator-scripts], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/keystone-7068-account-create-update-hqkgt" podUID="69bc9139-cd82-4fa2-847f-b831c080d163" Mar 20 08:49:43 crc kubenswrapper[4903]: I0320 08:49:43.692948 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xm42s\" (UniqueName: \"kubernetes.io/projected/69bc9139-cd82-4fa2-847f-b831c080d163-kube-api-access-xm42s\") pod \"keystone-7068-account-create-update-hqkgt\" (UID: \"69bc9139-cd82-4fa2-847f-b831c080d163\") " pod="openstack/keystone-7068-account-create-update-hqkgt" Mar 20 08:49:43 crc kubenswrapper[4903]: I0320 08:49:43.693012 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/69bc9139-cd82-4fa2-847f-b831c080d163-operator-scripts\") pod \"keystone-7068-account-create-update-hqkgt\" (UID: \"69bc9139-cd82-4fa2-847f-b831c080d163\") " pod="openstack/keystone-7068-account-create-update-hqkgt" Mar 20 08:49:43 crc kubenswrapper[4903]: E0320 08:49:43.693300 4903 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Mar 20 08:49:43 crc kubenswrapper[4903]: E0320 08:49:43.693362 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/69bc9139-cd82-4fa2-847f-b831c080d163-operator-scripts podName:69bc9139-cd82-4fa2-847f-b831c080d163 nodeName:}" failed. No retries permitted until 2026-03-20 08:49:44.193340245 +0000 UTC m=+1609.410240560 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/69bc9139-cd82-4fa2-847f-b831c080d163-operator-scripts") pod "keystone-7068-account-create-update-hqkgt" (UID: "69bc9139-cd82-4fa2-847f-b831c080d163") : configmap "openstack-scripts" not found Mar 20 08:49:43 crc kubenswrapper[4903]: E0320 08:49:43.703150 4903 projected.go:194] Error preparing data for projected volume kube-api-access-xm42s for pod openstack/keystone-7068-account-create-update-hqkgt: failed to fetch token: serviceaccounts "galera-openstack" not found Mar 20 08:49:43 crc kubenswrapper[4903]: E0320 08:49:43.703262 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/69bc9139-cd82-4fa2-847f-b831c080d163-kube-api-access-xm42s podName:69bc9139-cd82-4fa2-847f-b831c080d163 nodeName:}" failed. No retries permitted until 2026-03-20 08:49:44.203238408 +0000 UTC m=+1609.420138723 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-xm42s" (UniqueName: "kubernetes.io/projected/69bc9139-cd82-4fa2-847f-b831c080d163-kube-api-access-xm42s") pod "keystone-7068-account-create-update-hqkgt" (UID: "69bc9139-cd82-4fa2-847f-b831c080d163") : failed to fetch token: serviceaccounts "galera-openstack" not found Mar 20 08:49:43 crc kubenswrapper[4903]: I0320 08:49:43.715408 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-m4hwk"] Mar 20 08:49:43 crc kubenswrapper[4903]: I0320 08:49:43.754562 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-25f9-account-create-update-9n49h"] Mar 20 08:49:43 crc kubenswrapper[4903]: I0320 08:49:43.762828 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-25f9-account-create-update-9n49h"] Mar 20 08:49:43 crc kubenswrapper[4903]: I0320 08:49:43.776835 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-56bn4"] Mar 20 08:49:43 crc kubenswrapper[4903]: I0320 08:49:43.808246 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 20 08:49:43 crc kubenswrapper[4903]: I0320 08:49:43.813749 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-87nxt"] Mar 20 08:49:43 crc kubenswrapper[4903]: I0320 08:49:43.829439 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-87nxt"] Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.012948 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/877f943b-808c-435e-a5cf-bda8ea0a5d15-config-data\") pod \"877f943b-808c-435e-a5cf-bda8ea0a5d15\" (UID: \"877f943b-808c-435e-a5cf-bda8ea0a5d15\") " Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.013066 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-842f2\" (UniqueName: \"kubernetes.io/projected/877f943b-808c-435e-a5cf-bda8ea0a5d15-kube-api-access-842f2\") pod \"877f943b-808c-435e-a5cf-bda8ea0a5d15\" (UID: \"877f943b-808c-435e-a5cf-bda8ea0a5d15\") " Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.013162 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/877f943b-808c-435e-a5cf-bda8ea0a5d15-public-tls-certs\") pod \"877f943b-808c-435e-a5cf-bda8ea0a5d15\" (UID: \"877f943b-808c-435e-a5cf-bda8ea0a5d15\") " Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.013257 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/877f943b-808c-435e-a5cf-bda8ea0a5d15-logs\") pod \"877f943b-808c-435e-a5cf-bda8ea0a5d15\" (UID: \"877f943b-808c-435e-a5cf-bda8ea0a5d15\") " Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.013292 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/877f943b-808c-435e-a5cf-bda8ea0a5d15-internal-tls-certs\") pod \"877f943b-808c-435e-a5cf-bda8ea0a5d15\" (UID: \"877f943b-808c-435e-a5cf-bda8ea0a5d15\") " Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.013446 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/877f943b-808c-435e-a5cf-bda8ea0a5d15-combined-ca-bundle\") pod \"877f943b-808c-435e-a5cf-bda8ea0a5d15\" (UID: \"877f943b-808c-435e-a5cf-bda8ea0a5d15\") " Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.013999 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/877f943b-808c-435e-a5cf-bda8ea0a5d15-logs" (OuterVolumeSpecName: "logs") pod "877f943b-808c-435e-a5cf-bda8ea0a5d15" (UID: "877f943b-808c-435e-a5cf-bda8ea0a5d15"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.015806 4903 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/877f943b-808c-435e-a5cf-bda8ea0a5d15-logs\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.028326 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/877f943b-808c-435e-a5cf-bda8ea0a5d15-kube-api-access-842f2" (OuterVolumeSpecName: "kube-api-access-842f2") pod "877f943b-808c-435e-a5cf-bda8ea0a5d15" (UID: "877f943b-808c-435e-a5cf-bda8ea0a5d15"). 
InnerVolumeSpecName "kube-api-access-842f2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.083359 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.095931 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/877f943b-808c-435e-a5cf-bda8ea0a5d15-config-data" (OuterVolumeSpecName: "config-data") pod "877f943b-808c-435e-a5cf-bda8ea0a5d15" (UID: "877f943b-808c-435e-a5cf-bda8ea0a5d15"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.098777 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.109528 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/877f943b-808c-435e-a5cf-bda8ea0a5d15-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "877f943b-808c-435e-a5cf-bda8ea0a5d15" (UID: "877f943b-808c-435e-a5cf-bda8ea0a5d15"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.120761 4903 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/877f943b-808c-435e-a5cf-bda8ea0a5d15-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.120788 4903 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/877f943b-808c-435e-a5cf-bda8ea0a5d15-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.120798 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-842f2\" (UniqueName: \"kubernetes.io/projected/877f943b-808c-435e-a5cf-bda8ea0a5d15-kube-api-access-842f2\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.124008 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstack-galera-0" podUID="96a68183-d440-4f89-887d-d2441d00c8e4" containerName="galera" containerID="cri-o://b1d064dc3009f3f7fdd7eea64f3901b6af862ba55d21798e211d4e04f25facd9" gracePeriod=30 Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.139293 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/877f943b-808c-435e-a5cf-bda8ea0a5d15-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "877f943b-808c-435e-a5cf-bda8ea0a5d15" (UID: "877f943b-808c-435e-a5cf-bda8ea0a5d15"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:49:44 crc kubenswrapper[4903]: E0320 08:49:44.170219 4903 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="086baa0efbede9405a9d07461836e115d7cb27a7069334c37e8698d90f636ed7" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 20 08:49:44 crc kubenswrapper[4903]: E0320 08:49:44.175282 4903 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="086baa0efbede9405a9d07461836e115d7cb27a7069334c37e8698d90f636ed7" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 20 08:49:44 crc kubenswrapper[4903]: E0320 08:49:44.178992 4903 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="086baa0efbede9405a9d07461836e115d7cb27a7069334c37e8698d90f636ed7" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 20 08:49:44 crc kubenswrapper[4903]: E0320 08:49:44.179032 4903 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="dc127483-5a42-4eea-8b8c-8a1382dced05" containerName="nova-scheduler-scheduler" Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.190394 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/877f943b-808c-435e-a5cf-bda8ea0a5d15-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "877f943b-808c-435e-a5cf-bda8ea0a5d15" (UID: "877f943b-808c-435e-a5cf-bda8ea0a5d15"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.221778 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f5f160c-29e2-43d0-bb55-6969904b3a4e-config-data\") pod \"7f5f160c-29e2-43d0-bb55-6969904b3a4e\" (UID: \"7f5f160c-29e2-43d0-bb55-6969904b3a4e\") " Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.221867 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/7f5f160c-29e2-43d0-bb55-6969904b3a4e-nova-metadata-tls-certs\") pod \"7f5f160c-29e2-43d0-bb55-6969904b3a4e\" (UID: \"7f5f160c-29e2-43d0-bb55-6969904b3a4e\") " Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.221922 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2c33b2cd-e705-41cd-9e59-3dcbb0a55829-httpd-run\") pod \"2c33b2cd-e705-41cd-9e59-3dcbb0a55829\" (UID: \"2c33b2cd-e705-41cd-9e59-3dcbb0a55829\") " Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.221943 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vzzqn\" (UniqueName: \"kubernetes.io/projected/2c33b2cd-e705-41cd-9e59-3dcbb0a55829-kube-api-access-vzzqn\") pod \"2c33b2cd-e705-41cd-9e59-3dcbb0a55829\" (UID: \"2c33b2cd-e705-41cd-9e59-3dcbb0a55829\") " Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.222017 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2c33b2cd-e705-41cd-9e59-3dcbb0a55829-logs\") pod \"2c33b2cd-e705-41cd-9e59-3dcbb0a55829\" (UID: \"2c33b2cd-e705-41cd-9e59-3dcbb0a55829\") " Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.222079 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2c33b2cd-e705-41cd-9e59-3dcbb0a55829-scripts\") pod \"2c33b2cd-e705-41cd-9e59-3dcbb0a55829\" (UID: \"2c33b2cd-e705-41cd-9e59-3dcbb0a55829\") " Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.222114 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"2c33b2cd-e705-41cd-9e59-3dcbb0a55829\" (UID: \"2c33b2cd-e705-41cd-9e59-3dcbb0a55829\") " Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.222164 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c33b2cd-e705-41cd-9e59-3dcbb0a55829-config-data\") pod \"2c33b2cd-e705-41cd-9e59-3dcbb0a55829\" (UID: \"2c33b2cd-e705-41cd-9e59-3dcbb0a55829\") " Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.222185 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6x7wx\" (UniqueName: \"kubernetes.io/projected/7f5f160c-29e2-43d0-bb55-6969904b3a4e-kube-api-access-6x7wx\") pod \"7f5f160c-29e2-43d0-bb55-6969904b3a4e\" (UID: \"7f5f160c-29e2-43d0-bb55-6969904b3a4e\") " Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.222207 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2c33b2cd-e705-41cd-9e59-3dcbb0a55829-public-tls-certs\") pod \"2c33b2cd-e705-41cd-9e59-3dcbb0a55829\" (UID: \"2c33b2cd-e705-41cd-9e59-3dcbb0a55829\") " Mar 20 08:49:44 crc 
kubenswrapper[4903]: I0320 08:49:44.222232 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7f5f160c-29e2-43d0-bb55-6969904b3a4e-logs\") pod \"7f5f160c-29e2-43d0-bb55-6969904b3a4e\" (UID: \"7f5f160c-29e2-43d0-bb55-6969904b3a4e\") " Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.222262 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f5f160c-29e2-43d0-bb55-6969904b3a4e-combined-ca-bundle\") pod \"7f5f160c-29e2-43d0-bb55-6969904b3a4e\" (UID: \"7f5f160c-29e2-43d0-bb55-6969904b3a4e\") " Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.222288 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c33b2cd-e705-41cd-9e59-3dcbb0a55829-combined-ca-bundle\") pod \"2c33b2cd-e705-41cd-9e59-3dcbb0a55829\" (UID: \"2c33b2cd-e705-41cd-9e59-3dcbb0a55829\") " Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.222492 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xm42s\" (UniqueName: \"kubernetes.io/projected/69bc9139-cd82-4fa2-847f-b831c080d163-kube-api-access-xm42s\") pod \"keystone-7068-account-create-update-hqkgt\" (UID: \"69bc9139-cd82-4fa2-847f-b831c080d163\") " pod="openstack/keystone-7068-account-create-update-hqkgt" Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.222531 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/69bc9139-cd82-4fa2-847f-b831c080d163-operator-scripts\") pod \"keystone-7068-account-create-update-hqkgt\" (UID: \"69bc9139-cd82-4fa2-847f-b831c080d163\") " pod="openstack/keystone-7068-account-create-update-hqkgt" Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.222695 4903 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/877f943b-808c-435e-a5cf-bda8ea0a5d15-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.222707 4903 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/877f943b-808c-435e-a5cf-bda8ea0a5d15-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:44 crc kubenswrapper[4903]: E0320 08:49:44.222769 4903 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Mar 20 08:49:44 crc kubenswrapper[4903]: E0320 08:49:44.222823 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/69bc9139-cd82-4fa2-847f-b831c080d163-operator-scripts podName:69bc9139-cd82-4fa2-847f-b831c080d163 nodeName:}" failed. No retries permitted until 2026-03-20 08:49:45.222804645 +0000 UTC m=+1610.439704960 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/69bc9139-cd82-4fa2-847f-b831c080d163-operator-scripts") pod "keystone-7068-account-create-update-hqkgt" (UID: "69bc9139-cd82-4fa2-847f-b831c080d163") : configmap "openstack-scripts" not found Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.224685 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2c33b2cd-e705-41cd-9e59-3dcbb0a55829-logs" (OuterVolumeSpecName: "logs") pod "2c33b2cd-e705-41cd-9e59-3dcbb0a55829" (UID: "2c33b2cd-e705-41cd-9e59-3dcbb0a55829"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.225116 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2c33b2cd-e705-41cd-9e59-3dcbb0a55829-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "2c33b2cd-e705-41cd-9e59-3dcbb0a55829" (UID: "2c33b2cd-e705-41cd-9e59-3dcbb0a55829"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.229519 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7f5f160c-29e2-43d0-bb55-6969904b3a4e-logs" (OuterVolumeSpecName: "logs") pod "7f5f160c-29e2-43d0-bb55-6969904b3a4e" (UID: "7f5f160c-29e2-43d0-bb55-6969904b3a4e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.231057 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "glance") pod "2c33b2cd-e705-41cd-9e59-3dcbb0a55829" (UID: "2c33b2cd-e705-41cd-9e59-3dcbb0a55829"). InnerVolumeSpecName "local-storage08-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.232549 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f5f160c-29e2-43d0-bb55-6969904b3a4e-kube-api-access-6x7wx" (OuterVolumeSpecName: "kube-api-access-6x7wx") pod "7f5f160c-29e2-43d0-bb55-6969904b3a4e" (UID: "7f5f160c-29e2-43d0-bb55-6969904b3a4e"). InnerVolumeSpecName "kube-api-access-6x7wx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:49:44 crc kubenswrapper[4903]: E0320 08:49:44.233023 4903 projected.go:194] Error preparing data for projected volume kube-api-access-xm42s for pod openstack/keystone-7068-account-create-update-hqkgt: failed to fetch token: serviceaccounts "galera-openstack" not found Mar 20 08:49:44 crc kubenswrapper[4903]: E0320 08:49:44.233107 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/69bc9139-cd82-4fa2-847f-b831c080d163-kube-api-access-xm42s podName:69bc9139-cd82-4fa2-847f-b831c080d163 nodeName:}" failed. No retries permitted until 2026-03-20 08:49:45.233082197 +0000 UTC m=+1610.449982512 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-xm42s" (UniqueName: "kubernetes.io/projected/69bc9139-cd82-4fa2-847f-b831c080d163-kube-api-access-xm42s") pod "keystone-7068-account-create-update-hqkgt" (UID: "69bc9139-cd82-4fa2-847f-b831c080d163") : failed to fetch token: serviceaccounts "galera-openstack" not found Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.233713 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c33b2cd-e705-41cd-9e59-3dcbb0a55829-scripts" (OuterVolumeSpecName: "scripts") pod "2c33b2cd-e705-41cd-9e59-3dcbb0a55829" (UID: "2c33b2cd-e705-41cd-9e59-3dcbb0a55829"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.238073 4903 generic.go:334] "Generic (PLEG): container finished" podID="9c2cbf77-9511-47a0-9bc7-dd5eba7d8d8a" containerID="cb7bb133830dc4bdac2cbea0a778366365297bc4c7bc623ae8334776f5496711" exitCode=0 Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.238398 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"9c2cbf77-9511-47a0-9bc7-dd5eba7d8d8a","Type":"ContainerDied","Data":"cb7bb133830dc4bdac2cbea0a778366365297bc4c7bc623ae8334776f5496711"} Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.246268 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c33b2cd-e705-41cd-9e59-3dcbb0a55829-kube-api-access-vzzqn" (OuterVolumeSpecName: "kube-api-access-vzzqn") pod "2c33b2cd-e705-41cd-9e59-3dcbb0a55829" (UID: "2c33b2cd-e705-41cd-9e59-3dcbb0a55829"). InnerVolumeSpecName "kube-api-access-vzzqn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.252689 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7f5f160c-29e2-43d0-bb55-6969904b3a4e","Type":"ContainerDied","Data":"01038cff4a0bccaa415590843a6df02825138f5efbcb14fc952063ab9e9f4fe6"} Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.252741 4903 scope.go:117] "RemoveContainer" containerID="625122aba923f3f4672abe1d898433060719b38b2ba824aec7bed77ddaa609eb" Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.253098 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.260962 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c33b2cd-e705-41cd-9e59-3dcbb0a55829-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2c33b2cd-e705-41cd-9e59-3dcbb0a55829" (UID: "2c33b2cd-e705-41cd-9e59-3dcbb0a55829"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:49:44 crc kubenswrapper[4903]: E0320 08:49:44.270228 4903 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 9148a8a0458280fb77a8371f10cf7fabff0ae90ccd681be1905e0e67a3249152 is running failed: container process not found" containerID="9148a8a0458280fb77a8371f10cf7fabff0ae90ccd681be1905e0e67a3249152" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.270753 4903 generic.go:334] "Generic (PLEG): container finished" podID="1dcd96a1-71bb-480c-8387-0fca4d17bf33" containerID="10f2b77e4d99df665d6349b311dc9b4cfe076067c636391f3d7e6e34202c3750" exitCode=0 Mar 20 08:49:44 crc kubenswrapper[4903]: E0320 08:49:44.270791 4903 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 9148a8a0458280fb77a8371f10cf7fabff0ae90ccd681be1905e0e67a3249152 is running failed: container process not found" containerID="9148a8a0458280fb77a8371f10cf7fabff0ae90ccd681be1905e0e67a3249152" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.270824 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-669bcbb856-w87fq" event={"ID":"1dcd96a1-71bb-480c-8387-0fca4d17bf33","Type":"ContainerDied","Data":"10f2b77e4d99df665d6349b311dc9b4cfe076067c636391f3d7e6e34202c3750"} Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.270850 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-669bcbb856-w87fq" event={"ID":"1dcd96a1-71bb-480c-8387-0fca4d17bf33","Type":"ContainerDied","Data":"265057ba6e83fa8c78566dd69b864204ce6db94495ba7162cf5ee90e9aac31a6"} Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.270862 4903 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="265057ba6e83fa8c78566dd69b864204ce6db94495ba7162cf5ee90e9aac31a6" Mar 20 08:49:44 crc kubenswrapper[4903]: E0320 08:49:44.271467 4903 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 9148a8a0458280fb77a8371f10cf7fabff0ae90ccd681be1905e0e67a3249152 is running failed: container process not found" containerID="9148a8a0458280fb77a8371f10cf7fabff0ae90ccd681be1905e0e67a3249152" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 20 08:49:44 crc kubenswrapper[4903]: E0320 08:49:44.271570 4903 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 9148a8a0458280fb77a8371f10cf7fabff0ae90ccd681be1905e0e67a3249152 is running failed: container process not found" probeType="Readiness" pod="openstack/nova-cell1-conductor-0" podUID="5e072c5e-0f44-4d24-bccc-b14bf61fa192" containerName="nova-cell1-conductor-conductor" Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.273359 4903 generic.go:334] "Generic (PLEG): container finished" podID="50b5adcb-aed8-4cff-b3ec-02721df3937d" containerID="dc39193e0b3efc7d58828ef8c691abe141cb7d978ac576bd15dc6776a511b5fe" exitCode=0 Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.273485 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"50b5adcb-aed8-4cff-b3ec-02721df3937d","Type":"ContainerDied","Data":"dc39193e0b3efc7d58828ef8c691abe141cb7d978ac576bd15dc6776a511b5fe"} Mar 20 
08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.279774 4903 generic.go:334] "Generic (PLEG): container finished" podID="bae081ad-c434-4d1b-ae9f-cc8c5b6e2f33" containerID="399861c0aed674ac17caf4a76e74022608a67747c7b5e6718fd6d0bf4376c5d8" exitCode=0 Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.279856 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5895dcfdfd-4gs9b" event={"ID":"bae081ad-c434-4d1b-ae9f-cc8c5b6e2f33","Type":"ContainerDied","Data":"399861c0aed674ac17caf4a76e74022608a67747c7b5e6718fd6d0bf4376c5d8"} Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.282463 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f5f160c-29e2-43d0-bb55-6969904b3a4e-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "7f5f160c-29e2-43d0-bb55-6969904b3a4e" (UID: "7f5f160c-29e2-43d0-bb55-6969904b3a4e"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.285223 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.287454 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"877f943b-808c-435e-a5cf-bda8ea0a5d15","Type":"ContainerDied","Data":"7df4dbd4badba2ed9e91ed3d3e81edb212a9e3001550c6c9a49714877d7a7017"} Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.299652 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c33b2cd-e705-41cd-9e59-3dcbb0a55829-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "2c33b2cd-e705-41cd-9e59-3dcbb0a55829" (UID: "2c33b2cd-e705-41cd-9e59-3dcbb0a55829"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.306288 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.307724 4903 generic.go:334] "Generic (PLEG): container finished" podID="f8bb60e5-f963-44ed-9e5e-76ca6da5c723" containerID="99220c34b143816ada3efc7f99d463db4e8819e68b51ceb37233e0bf8c04a8fa" exitCode=2 Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.307781 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"f8bb60e5-f963-44ed-9e5e-76ca6da5c723","Type":"ContainerDied","Data":"99220c34b143816ada3efc7f99d463db4e8819e68b51ceb37233e0bf8c04a8fa"} Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.307804 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"f8bb60e5-f963-44ed-9e5e-76ca6da5c723","Type":"ContainerDied","Data":"67121147e76913a52a7a4bc86daef660e583d5da1143d57f716b842a0e33a1d9"} Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.310178 4903 generic.go:334] "Generic (PLEG): container finished" podID="f4e1dbd8-6ecd-4cd7-910b-910cf6a45679" containerID="8c4db124274a0f1bc8c0540b96509dad9dbb16ca409f3c709b225f29f322a39d" exitCode=0 Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.310217 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"f4e1dbd8-6ecd-4cd7-910b-910cf6a45679","Type":"ContainerDied","Data":"8c4db124274a0f1bc8c0540b96509dad9dbb16ca409f3c709b225f29f322a39d"} Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.311183 4903 generic.go:334] "Generic (PLEG): container finished" podID="5e072c5e-0f44-4d24-bccc-b14bf61fa192" containerID="9148a8a0458280fb77a8371f10cf7fabff0ae90ccd681be1905e0e67a3249152" exitCode=0 Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.311215 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"5e072c5e-0f44-4d24-bccc-b14bf61fa192","Type":"ContainerDied","Data":"9148a8a0458280fb77a8371f10cf7fabff0ae90ccd681be1905e0e67a3249152"} Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.319769 4903 generic.go:334] "Generic (PLEG): container finished" podID="8005d467-6a20-4e68-b62f-65ad97a31812" containerID="fe129b49b118ce4ae8579680e5a1ff682ff67dcbd9c4f54a13ce05fb9eced8cd" exitCode=0 Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.319798 4903 generic.go:334] "Generic (PLEG): container finished" podID="8005d467-6a20-4e68-b62f-65ad97a31812" containerID="3eb344be279e7505b3a1ee366d11602b2711ebf94ecb9c7d600238fd0ec65c58" exitCode=2 Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.319808 4903 generic.go:334] "Generic (PLEG): container finished" podID="8005d467-6a20-4e68-b62f-65ad97a31812" containerID="a8e86b614ac7f5338fb12a670fd38985e9a2c2f7e4f4a111a46686a3317b5790" exitCode=0 Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.319861 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8005d467-6a20-4e68-b62f-65ad97a31812","Type":"ContainerDied","Data":"fe129b49b118ce4ae8579680e5a1ff682ff67dcbd9c4f54a13ce05fb9eced8cd"} Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.319886 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8005d467-6a20-4e68-b62f-65ad97a31812","Type":"ContainerDied","Data":"3eb344be279e7505b3a1ee366d11602b2711ebf94ecb9c7d600238fd0ec65c58"} Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.319897 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"8005d467-6a20-4e68-b62f-65ad97a31812","Type":"ContainerDied","Data":"a8e86b614ac7f5338fb12a670fd38985e9a2c2f7e4f4a111a46686a3317b5790"} Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.326129 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d88b6\" (UniqueName: \"kubernetes.io/projected/f8bb60e5-f963-44ed-9e5e-76ca6da5c723-kube-api-access-d88b6\") pod \"f8bb60e5-f963-44ed-9e5e-76ca6da5c723\" (UID: \"f8bb60e5-f963-44ed-9e5e-76ca6da5c723\") " Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.326216 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8bb60e5-f963-44ed-9e5e-76ca6da5c723-combined-ca-bundle\") pod \"f8bb60e5-f963-44ed-9e5e-76ca6da5c723\" (UID: \"f8bb60e5-f963-44ed-9e5e-76ca6da5c723\") " Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.326317 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/f8bb60e5-f963-44ed-9e5e-76ca6da5c723-kube-state-metrics-tls-certs\") pod \"f8bb60e5-f963-44ed-9e5e-76ca6da5c723\" (UID: \"f8bb60e5-f963-44ed-9e5e-76ca6da5c723\") " Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.326363 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/f8bb60e5-f963-44ed-9e5e-76ca6da5c723-kube-state-metrics-tls-config\") pod \"f8bb60e5-f963-44ed-9e5e-76ca6da5c723\" (UID: \"f8bb60e5-f963-44ed-9e5e-76ca6da5c723\") " Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.326876 4903 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2c33b2cd-e705-41cd-9e59-3dcbb0a55829-logs\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.326894 4903 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2c33b2cd-e705-41cd-9e59-3dcbb0a55829-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.326919 4903 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.326934 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6x7wx\" (UniqueName: \"kubernetes.io/projected/7f5f160c-29e2-43d0-bb55-6969904b3a4e-kube-api-access-6x7wx\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.326946 4903 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2c33b2cd-e705-41cd-9e59-3dcbb0a55829-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.326954 4903 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7f5f160c-29e2-43d0-bb55-6969904b3a4e-logs\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.326962 4903 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c33b2cd-e705-41cd-9e59-3dcbb0a55829-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.326970 
4903 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/7f5f160c-29e2-43d0-bb55-6969904b3a4e-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.326978 4903 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2c33b2cd-e705-41cd-9e59-3dcbb0a55829-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.326987 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vzzqn\" (UniqueName: \"kubernetes.io/projected/2c33b2cd-e705-41cd-9e59-3dcbb0a55829-kube-api-access-vzzqn\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.340898 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.341397 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2c33b2cd-e705-41cd-9e59-3dcbb0a55829","Type":"ContainerDied","Data":"915ba21af04025013209005f6a27dbcc46d5120084dbd26714f30f808a165887"} Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.348422 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-669bcbb856-w87fq" Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.351318 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f5f160c-29e2-43d0-bb55-6969904b3a4e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7f5f160c-29e2-43d0-bb55-6969904b3a4e" (UID: "7f5f160c-29e2-43d0-bb55-6969904b3a4e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.352880 4903 generic.go:334] "Generic (PLEG): container finished" podID="34de9984-0547-4ba1-ae7d-5f8cc9196c26" containerID="b0e941c6eb837ac5cb4a6c3fdea1d2d35578b2c2d7f9749cd5b2c8e0e7710ff8" exitCode=0 Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.352952 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"34de9984-0547-4ba1-ae7d-5f8cc9196c26","Type":"ContainerDied","Data":"b0e941c6eb837ac5cb4a6c3fdea1d2d35578b2c2d7f9749cd5b2c8e0e7710ff8"} Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.357990 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-7068-account-create-update-hqkgt" Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.359220 4903 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." 
pod="openstack/root-account-create-update-56bn4" secret="" err="secret \"galera-openstack-dockercfg-x6l5r\" not found" Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.359682 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-56bn4" event={"ID":"91bf045e-8b49-48df-b43e-9040bb6b2ca5","Type":"ContainerStarted","Data":"340cf727ff698591f1edf8375d21ef430e1c0d6d29845adf1c391d9354f99a37"} Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.369303 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.376072 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.385303 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f8bb60e5-f963-44ed-9e5e-76ca6da5c723-kube-api-access-d88b6" (OuterVolumeSpecName: "kube-api-access-d88b6") pod "f8bb60e5-f963-44ed-9e5e-76ca6da5c723" (UID: "f8bb60e5-f963-44ed-9e5e-76ca6da5c723"). InnerVolumeSpecName "kube-api-access-d88b6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.385383 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f5f160c-29e2-43d0-bb55-6969904b3a4e-config-data" (OuterVolumeSpecName: "config-data") pod "7f5f160c-29e2-43d0-bb55-6969904b3a4e" (UID: "7f5f160c-29e2-43d0-bb55-6969904b3a4e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.408429 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-7068-account-create-update-hqkgt" Mar 20 08:49:44 crc kubenswrapper[4903]: E0320 08:49:44.428304 4903 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Mar 20 08:49:44 crc kubenswrapper[4903]: E0320 08:49:44.428387 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/91bf045e-8b49-48df-b43e-9040bb6b2ca5-operator-scripts podName:91bf045e-8b49-48df-b43e-9040bb6b2ca5 nodeName:}" failed. No retries permitted until 2026-03-20 08:49:44.928366934 +0000 UTC m=+1610.145267249 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/91bf045e-8b49-48df-b43e-9040bb6b2ca5-operator-scripts") pod "root-account-create-update-56bn4" (UID: "91bf045e-8b49-48df-b43e-9040bb6b2ca5") : configmap "openstack-scripts" not found Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.429675 4903 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f5f160c-29e2-43d0-bb55-6969904b3a4e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.429702 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d88b6\" (UniqueName: \"kubernetes.io/projected/f8bb60e5-f963-44ed-9e5e-76ca6da5c723-kube-api-access-d88b6\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.429713 4903 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f5f160c-29e2-43d0-bb55-6969904b3a4e-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:44 crc kubenswrapper[4903]: E0320 08:49:44.443366 4903 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 08:49:44 crc kubenswrapper[4903]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb:current-podified,Command:[/bin/sh -c #!/bin/bash Mar 20 08:49:44 crc kubenswrapper[4903]: Mar 20 08:49:44 crc kubenswrapper[4903]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Mar 20 08:49:44 crc kubenswrapper[4903]: Mar 20 08:49:44 crc kubenswrapper[4903]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Mar 20 08:49:44 crc kubenswrapper[4903]: Mar 20 08:49:44 crc kubenswrapper[4903]: MYSQL_CMD="mysql -h -u root -P 3306" Mar 20 08:49:44 crc kubenswrapper[4903]: Mar 20 08:49:44 crc kubenswrapper[4903]: if [ -n "" ]; then Mar 20 08:49:44 crc kubenswrapper[4903]: GRANT_DATABASE="" Mar 20 08:49:44 crc kubenswrapper[4903]: else Mar 20 08:49:44 crc kubenswrapper[4903]: GRANT_DATABASE="*" Mar 20 08:49:44 crc kubenswrapper[4903]: fi Mar 20 08:49:44 crc kubenswrapper[4903]: Mar 20 08:49:44 crc kubenswrapper[4903]: # going for maximum compatibility here: Mar 20 08:49:44 crc kubenswrapper[4903]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Mar 20 08:49:44 crc kubenswrapper[4903]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Mar 20 08:49:44 crc kubenswrapper[4903]: # 3. create user with CREATE but then do all password and TLS with ALTER to Mar 20 08:49:44 crc kubenswrapper[4903]: # support updates Mar 20 08:49:44 crc kubenswrapper[4903]: Mar 20 08:49:44 crc kubenswrapper[4903]: $MYSQL_CMD < logger="UnhandledError" Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.443472 4903 scope.go:117] "RemoveContainer" containerID="90c27df17f81d979977526ee2d95fcef8b351eceda695edad6f1dfd00a676827" Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.443638 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c33b2cd-e705-41cd-9e59-3dcbb0a55829-config-data" (OuterVolumeSpecName: "config-data") pod "2c33b2cd-e705-41cd-9e59-3dcbb0a55829" (UID: "2c33b2cd-e705-41cd-9e59-3dcbb0a55829"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:49:44 crc kubenswrapper[4903]: E0320 08:49:44.445691 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"openstack-mariadb-root-db-secret\\\" not found\"" pod="openstack/root-account-create-update-56bn4" podUID="91bf045e-8b49-48df-b43e-9040bb6b2ca5" Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.452159 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8bb60e5-f963-44ed-9e5e-76ca6da5c723-kube-state-metrics-tls-config" (OuterVolumeSpecName: "kube-state-metrics-tls-config") pod "f8bb60e5-f963-44ed-9e5e-76ca6da5c723" (UID: "f8bb60e5-f963-44ed-9e5e-76ca6da5c723"). InnerVolumeSpecName "kube-state-metrics-tls-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.474832 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8bb60e5-f963-44ed-9e5e-76ca6da5c723-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f8bb60e5-f963-44ed-9e5e-76ca6da5c723" (UID: "f8bb60e5-f963-44ed-9e5e-76ca6da5c723"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.484230 4903 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.486953 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8bb60e5-f963-44ed-9e5e-76ca6da5c723-kube-state-metrics-tls-certs" (OuterVolumeSpecName: "kube-state-metrics-tls-certs") pod "f8bb60e5-f963-44ed-9e5e-76ca6da5c723" (UID: "f8bb60e5-f963-44ed-9e5e-76ca6da5c723"). InnerVolumeSpecName "kube-state-metrics-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.501376 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-5895dcfdfd-4gs9b" Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.515894 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.539117 4903 scope.go:117] "RemoveContainer" containerID="fe164d29a102e1f393cadf6ed7a51abeefbf7ba5309bfd408ab8c2520f8f2829" Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.541572 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1dcd96a1-71bb-480c-8387-0fca4d17bf33-public-tls-certs\") pod \"1dcd96a1-71bb-480c-8387-0fca4d17bf33\" (UID: \"1dcd96a1-71bb-480c-8387-0fca4d17bf33\") " Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.541624 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50b5adcb-aed8-4cff-b3ec-02721df3937d-combined-ca-bundle\") pod \"50b5adcb-aed8-4cff-b3ec-02721df3937d\" (UID: \"50b5adcb-aed8-4cff-b3ec-02721df3937d\") " Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.541661 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bae081ad-c434-4d1b-ae9f-cc8c5b6e2f33-internal-tls-certs\") pod \"bae081ad-c434-4d1b-ae9f-cc8c5b6e2f33\" (UID: \"bae081ad-c434-4d1b-ae9f-cc8c5b6e2f33\") " Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.541693 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/50b5adcb-aed8-4cff-b3ec-02721df3937d-logs\") pod \"50b5adcb-aed8-4cff-b3ec-02721df3937d\" (UID: \"50b5adcb-aed8-4cff-b3ec-02721df3937d\") " Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.541712 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/50b5adcb-aed8-4cff-b3ec-02721df3937d-config-data-custom\") pod \"50b5adcb-aed8-4cff-b3ec-02721df3937d\" (UID: \"50b5adcb-aed8-4cff-b3ec-02721df3937d\") " Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.541755 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bae081ad-c434-4d1b-ae9f-cc8c5b6e2f33-config-data-custom\") pod \"bae081ad-c434-4d1b-ae9f-cc8c5b6e2f33\" (UID: \"bae081ad-c434-4d1b-ae9f-cc8c5b6e2f33\") " Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.541798 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-886g9\" (UniqueName: \"kubernetes.io/projected/50b5adcb-aed8-4cff-b3ec-02721df3937d-kube-api-access-886g9\") pod \"50b5adcb-aed8-4cff-b3ec-02721df3937d\" (UID: \"50b5adcb-aed8-4cff-b3ec-02721df3937d\") " Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.541817 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1dcd96a1-71bb-480c-8387-0fca4d17bf33-internal-tls-certs\") pod \"1dcd96a1-71bb-480c-8387-0fca4d17bf33\" (UID: \"1dcd96a1-71bb-480c-8387-0fca4d17bf33\") " Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.541832 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1dcd96a1-71bb-480c-8387-0fca4d17bf33-logs\") pod \"1dcd96a1-71bb-480c-8387-0fca4d17bf33\" (UID: \"1dcd96a1-71bb-480c-8387-0fca4d17bf33\") " Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.541852 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"kube-api-access-wp88c\" (UniqueName: \"kubernetes.io/projected/1dcd96a1-71bb-480c-8387-0fca4d17bf33-kube-api-access-wp88c\") pod \"1dcd96a1-71bb-480c-8387-0fca4d17bf33\" (UID: \"1dcd96a1-71bb-480c-8387-0fca4d17bf33\") " Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.541875 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/50b5adcb-aed8-4cff-b3ec-02721df3937d-scripts\") pod \"50b5adcb-aed8-4cff-b3ec-02721df3937d\" (UID: \"50b5adcb-aed8-4cff-b3ec-02721df3937d\") " Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.541903 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/50b5adcb-aed8-4cff-b3ec-02721df3937d-internal-tls-certs\") pod \"50b5adcb-aed8-4cff-b3ec-02721df3937d\" (UID: \"50b5adcb-aed8-4cff-b3ec-02721df3937d\") " Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.541927 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/50b5adcb-aed8-4cff-b3ec-02721df3937d-public-tls-certs\") pod \"50b5adcb-aed8-4cff-b3ec-02721df3937d\" (UID: \"50b5adcb-aed8-4cff-b3ec-02721df3937d\") " Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.541945 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bae081ad-c434-4d1b-ae9f-cc8c5b6e2f33-config-data\") pod \"bae081ad-c434-4d1b-ae9f-cc8c5b6e2f33\" (UID: \"bae081ad-c434-4d1b-ae9f-cc8c5b6e2f33\") " Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.542000 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-857hw\" (UniqueName: \"kubernetes.io/projected/bae081ad-c434-4d1b-ae9f-cc8c5b6e2f33-kube-api-access-857hw\") pod \"bae081ad-c434-4d1b-ae9f-cc8c5b6e2f33\" (UID: \"bae081ad-c434-4d1b-ae9f-cc8c5b6e2f33\") " Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.542037 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50b5adcb-aed8-4cff-b3ec-02721df3937d-config-data\") pod \"50b5adcb-aed8-4cff-b3ec-02721df3937d\" (UID: \"50b5adcb-aed8-4cff-b3ec-02721df3937d\") " Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.542071 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1dcd96a1-71bb-480c-8387-0fca4d17bf33-config-data\") pod \"1dcd96a1-71bb-480c-8387-0fca4d17bf33\" (UID: \"1dcd96a1-71bb-480c-8387-0fca4d17bf33\") " Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.542106 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bae081ad-c434-4d1b-ae9f-cc8c5b6e2f33-public-tls-certs\") pod \"bae081ad-c434-4d1b-ae9f-cc8c5b6e2f33\" (UID: \"bae081ad-c434-4d1b-ae9f-cc8c5b6e2f33\") " Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.542141 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bae081ad-c434-4d1b-ae9f-cc8c5b6e2f33-logs\") pod \"bae081ad-c434-4d1b-ae9f-cc8c5b6e2f33\" (UID: \"bae081ad-c434-4d1b-ae9f-cc8c5b6e2f33\") " Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.542176 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/1dcd96a1-71bb-480c-8387-0fca4d17bf33-scripts\") pod \"1dcd96a1-71bb-480c-8387-0fca4d17bf33\" (UID: \"1dcd96a1-71bb-480c-8387-0fca4d17bf33\") " Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.542213 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bae081ad-c434-4d1b-ae9f-cc8c5b6e2f33-combined-ca-bundle\") pod \"bae081ad-c434-4d1b-ae9f-cc8c5b6e2f33\" (UID: \"bae081ad-c434-4d1b-ae9f-cc8c5b6e2f33\") " Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.542234 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/50b5adcb-aed8-4cff-b3ec-02721df3937d-etc-machine-id\") pod \"50b5adcb-aed8-4cff-b3ec-02721df3937d\" (UID: \"50b5adcb-aed8-4cff-b3ec-02721df3937d\") " Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.542250 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1dcd96a1-71bb-480c-8387-0fca4d17bf33-combined-ca-bundle\") pod \"1dcd96a1-71bb-480c-8387-0fca4d17bf33\" (UID: \"1dcd96a1-71bb-480c-8387-0fca4d17bf33\") " Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.542632 4903 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.542644 4903 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c33b2cd-e705-41cd-9e59-3dcbb0a55829-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.542655 4903 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8bb60e5-f963-44ed-9e5e-76ca6da5c723-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.542666 4903 reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/f8bb60e5-f963-44ed-9e5e-76ca6da5c723-kube-state-metrics-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.542675 4903 reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/f8bb60e5-f963-44ed-9e5e-76ca6da5c723-kube-state-metrics-tls-config\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.544193 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bae081ad-c434-4d1b-ae9f-cc8c5b6e2f33-logs" (OuterVolumeSpecName: "logs") pod "bae081ad-c434-4d1b-ae9f-cc8c5b6e2f33" (UID: "bae081ad-c434-4d1b-ae9f-cc8c5b6e2f33"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.544873 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/50b5adcb-aed8-4cff-b3ec-02721df3937d-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "50b5adcb-aed8-4cff-b3ec-02721df3937d" (UID: "50b5adcb-aed8-4cff-b3ec-02721df3937d"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.557113 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/50b5adcb-aed8-4cff-b3ec-02721df3937d-logs" (OuterVolumeSpecName: "logs") pod "50b5adcb-aed8-4cff-b3ec-02721df3937d" (UID: "50b5adcb-aed8-4cff-b3ec-02721df3937d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.561850 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1dcd96a1-71bb-480c-8387-0fca4d17bf33-scripts" (OuterVolumeSpecName: "scripts") pod "1dcd96a1-71bb-480c-8387-0fca4d17bf33" (UID: "1dcd96a1-71bb-480c-8387-0fca4d17bf33"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.561966 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50b5adcb-aed8-4cff-b3ec-02721df3937d-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "50b5adcb-aed8-4cff-b3ec-02721df3937d" (UID: "50b5adcb-aed8-4cff-b3ec-02721df3937d"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.565260 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/50b5adcb-aed8-4cff-b3ec-02721df3937d-kube-api-access-886g9" (OuterVolumeSpecName: "kube-api-access-886g9") pod "50b5adcb-aed8-4cff-b3ec-02721df3937d" (UID: "50b5adcb-aed8-4cff-b3ec-02721df3937d"). InnerVolumeSpecName "kube-api-access-886g9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.567286 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1dcd96a1-71bb-480c-8387-0fca4d17bf33-logs" (OuterVolumeSpecName: "logs") pod "1dcd96a1-71bb-480c-8387-0fca4d17bf33" (UID: "1dcd96a1-71bb-480c-8387-0fca4d17bf33"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.569601 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bae081ad-c434-4d1b-ae9f-cc8c5b6e2f33-kube-api-access-857hw" (OuterVolumeSpecName: "kube-api-access-857hw") pod "bae081ad-c434-4d1b-ae9f-cc8c5b6e2f33" (UID: "bae081ad-c434-4d1b-ae9f-cc8c5b6e2f33"). InnerVolumeSpecName "kube-api-access-857hw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.577553 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.584083 4903 scope.go:117] "RemoveContainer" containerID="a2260ceafa26704cedbc21056c132c79bd5a3239f4a5a39d01b35eaa39143f86" Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.590151 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bae081ad-c434-4d1b-ae9f-cc8c5b6e2f33-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "bae081ad-c434-4d1b-ae9f-cc8c5b6e2f33" (UID: "bae081ad-c434-4d1b-ae9f-cc8c5b6e2f33"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.590312 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1dcd96a1-71bb-480c-8387-0fca4d17bf33-kube-api-access-wp88c" (OuterVolumeSpecName: "kube-api-access-wp88c") pod "1dcd96a1-71bb-480c-8387-0fca4d17bf33" (UID: "1dcd96a1-71bb-480c-8387-0fca4d17bf33"). InnerVolumeSpecName "kube-api-access-wp88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.608200 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50b5adcb-aed8-4cff-b3ec-02721df3937d-scripts" (OuterVolumeSpecName: "scripts") pod "50b5adcb-aed8-4cff-b3ec-02721df3937d" (UID: "50b5adcb-aed8-4cff-b3ec-02721df3937d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.645546 4903 scope.go:117] "RemoveContainer" containerID="99220c34b143816ada3efc7f99d463db4e8819e68b51ceb37233e0bf8c04a8fa" Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.646033 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f4e1dbd8-6ecd-4cd7-910b-910cf6a45679-internal-tls-certs\") pod \"f4e1dbd8-6ecd-4cd7-910b-910cf6a45679\" (UID: \"f4e1dbd8-6ecd-4cd7-910b-910cf6a45679\") " Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.646185 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f4e1dbd8-6ecd-4cd7-910b-910cf6a45679-scripts\") pod \"f4e1dbd8-6ecd-4cd7-910b-910cf6a45679\" (UID: \"f4e1dbd8-6ecd-4cd7-910b-910cf6a45679\") " Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.646291 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f4e1dbd8-6ecd-4cd7-910b-910cf6a45679-httpd-run\") pod \"f4e1dbd8-6ecd-4cd7-910b-910cf6a45679\" (UID: \"f4e1dbd8-6ecd-4cd7-910b-910cf6a45679\") " Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.646393 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nq4mh\" (UniqueName: \"kubernetes.io/projected/f4e1dbd8-6ecd-4cd7-910b-910cf6a45679-kube-api-access-nq4mh\") pod \"f4e1dbd8-6ecd-4cd7-910b-910cf6a45679\" (UID: \"f4e1dbd8-6ecd-4cd7-910b-910cf6a45679\") " Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.646426 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4e1dbd8-6ecd-4cd7-910b-910cf6a45679-combined-ca-bundle\") pod \"f4e1dbd8-6ecd-4cd7-910b-910cf6a45679\" (UID: \"f4e1dbd8-6ecd-4cd7-910b-910cf6a45679\") " Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.646475 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"f4e1dbd8-6ecd-4cd7-910b-910cf6a45679\" (UID: \"f4e1dbd8-6ecd-4cd7-910b-910cf6a45679\") " Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.646603 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f4e1dbd8-6ecd-4cd7-910b-910cf6a45679-logs\") pod \"f4e1dbd8-6ecd-4cd7-910b-910cf6a45679\" (UID: \"f4e1dbd8-6ecd-4cd7-910b-910cf6a45679\") " Mar 20 08:49:44 crc 
kubenswrapper[4903]: I0320 08:49:44.646673 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4e1dbd8-6ecd-4cd7-910b-910cf6a45679-config-data\") pod \"f4e1dbd8-6ecd-4cd7-910b-910cf6a45679\" (UID: \"f4e1dbd8-6ecd-4cd7-910b-910cf6a45679\") " Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.647316 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-886g9\" (UniqueName: \"kubernetes.io/projected/50b5adcb-aed8-4cff-b3ec-02721df3937d-kube-api-access-886g9\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.647341 4903 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1dcd96a1-71bb-480c-8387-0fca4d17bf33-logs\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.647353 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wp88c\" (UniqueName: \"kubernetes.io/projected/1dcd96a1-71bb-480c-8387-0fca4d17bf33-kube-api-access-wp88c\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.647366 4903 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/50b5adcb-aed8-4cff-b3ec-02721df3937d-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.647378 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-857hw\" (UniqueName: \"kubernetes.io/projected/bae081ad-c434-4d1b-ae9f-cc8c5b6e2f33-kube-api-access-857hw\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.647389 4903 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bae081ad-c434-4d1b-ae9f-cc8c5b6e2f33-logs\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.647401 4903 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1dcd96a1-71bb-480c-8387-0fca4d17bf33-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.647413 4903 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/50b5adcb-aed8-4cff-b3ec-02721df3937d-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.647425 4903 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/50b5adcb-aed8-4cff-b3ec-02721df3937d-logs\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.647437 4903 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/50b5adcb-aed8-4cff-b3ec-02721df3937d-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.647450 4903 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bae081ad-c434-4d1b-ae9f-cc8c5b6e2f33-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.682667 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f4e1dbd8-6ecd-4cd7-910b-910cf6a45679-logs" (OuterVolumeSpecName: "logs") pod "f4e1dbd8-6ecd-4cd7-910b-910cf6a45679" (UID: 
"f4e1dbd8-6ecd-4cd7-910b-910cf6a45679"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.682919 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4e1dbd8-6ecd-4cd7-910b-910cf6a45679-scripts" (OuterVolumeSpecName: "scripts") pod "f4e1dbd8-6ecd-4cd7-910b-910cf6a45679" (UID: "f4e1dbd8-6ecd-4cd7-910b-910cf6a45679"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.691066 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f4e1dbd8-6ecd-4cd7-910b-910cf6a45679-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "f4e1dbd8-6ecd-4cd7-910b-910cf6a45679" (UID: "f4e1dbd8-6ecd-4cd7-910b-910cf6a45679"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.730753 4903 scope.go:117] "RemoveContainer" containerID="99220c34b143816ada3efc7f99d463db4e8819e68b51ceb37233e0bf8c04a8fa" Mar 20 08:49:44 crc kubenswrapper[4903]: E0320 08:49:44.731704 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"99220c34b143816ada3efc7f99d463db4e8819e68b51ceb37233e0bf8c04a8fa\": container with ID starting with 99220c34b143816ada3efc7f99d463db4e8819e68b51ceb37233e0bf8c04a8fa not found: ID does not exist" containerID="99220c34b143816ada3efc7f99d463db4e8819e68b51ceb37233e0bf8c04a8fa" Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.731740 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"99220c34b143816ada3efc7f99d463db4e8819e68b51ceb37233e0bf8c04a8fa"} err="failed to get container status \"99220c34b143816ada3efc7f99d463db4e8819e68b51ceb37233e0bf8c04a8fa\": rpc error: code = NotFound desc = could not find container \"99220c34b143816ada3efc7f99d463db4e8819e68b51ceb37233e0bf8c04a8fa\": container with ID starting with 99220c34b143816ada3efc7f99d463db4e8819e68b51ceb37233e0bf8c04a8fa not found: ID does not exist" Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.732333 4903 scope.go:117] "RemoveContainer" containerID="036996cd6559515dd00dacbdb33836ee1417dee1983310445b9597aed681a5db" Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.731782 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "glance") pod "f4e1dbd8-6ecd-4cd7-910b-910cf6a45679" (UID: "f4e1dbd8-6ecd-4cd7-910b-910cf6a45679"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.743998 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4e1dbd8-6ecd-4cd7-910b-910cf6a45679-kube-api-access-nq4mh" (OuterVolumeSpecName: "kube-api-access-nq4mh") pod "f4e1dbd8-6ecd-4cd7-910b-910cf6a45679" (UID: "f4e1dbd8-6ecd-4cd7-910b-910cf6a45679"). InnerVolumeSpecName "kube-api-access-nq4mh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.751903 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.760628 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.768608 4903 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f4e1dbd8-6ecd-4cd7-910b-910cf6a45679-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.768643 4903 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f4e1dbd8-6ecd-4cd7-910b-910cf6a45679-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.768659 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nq4mh\" (UniqueName: \"kubernetes.io/projected/f4e1dbd8-6ecd-4cd7-910b-910cf6a45679-kube-api-access-nq4mh\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.768695 4903 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.768706 4903 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f4e1dbd8-6ecd-4cd7-910b-910cf6a45679-logs\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.811908 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.812445 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1dcd96a1-71bb-480c-8387-0fca4d17bf33-config-data" (OuterVolumeSpecName: "config-data") pod "1dcd96a1-71bb-480c-8387-0fca4d17bf33" (UID: "1dcd96a1-71bb-480c-8387-0fca4d17bf33"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.812639 4903 scope.go:117] "RemoveContainer" containerID="e7046c98c3546d6698b9ba6c5237b460ed8efe52b7e38c2e607164140f59d3d5" Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.821658 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.830803 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.832542 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4e1dbd8-6ecd-4cd7-910b-910cf6a45679-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f4e1dbd8-6ecd-4cd7-910b-910cf6a45679" (UID: "f4e1dbd8-6ecd-4cd7-910b-910cf6a45679"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.837239 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bae081ad-c434-4d1b-ae9f-cc8c5b6e2f33-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "bae081ad-c434-4d1b-ae9f-cc8c5b6e2f33" (UID: "bae081ad-c434-4d1b-ae9f-cc8c5b6e2f33"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.838949 4903 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.852268 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50b5adcb-aed8-4cff-b3ec-02721df3937d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "50b5adcb-aed8-4cff-b3ec-02721df3937d" (UID: "50b5adcb-aed8-4cff-b3ec-02721df3937d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.852950 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bae081ad-c434-4d1b-ae9f-cc8c5b6e2f33-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bae081ad-c434-4d1b-ae9f-cc8c5b6e2f33" (UID: "bae081ad-c434-4d1b-ae9f-cc8c5b6e2f33"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.870600 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/34de9984-0547-4ba1-ae7d-5f8cc9196c26-kolla-config\") pod \"34de9984-0547-4ba1-ae7d-5f8cc9196c26\" (UID: \"34de9984-0547-4ba1-ae7d-5f8cc9196c26\") " Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.870760 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/34de9984-0547-4ba1-ae7d-5f8cc9196c26-config-data\") pod \"34de9984-0547-4ba1-ae7d-5f8cc9196c26\" (UID: \"34de9984-0547-4ba1-ae7d-5f8cc9196c26\") " Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.870807 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sfjw5\" (UniqueName: \"kubernetes.io/projected/34de9984-0547-4ba1-ae7d-5f8cc9196c26-kube-api-access-sfjw5\") pod \"34de9984-0547-4ba1-ae7d-5f8cc9196c26\" (UID: \"34de9984-0547-4ba1-ae7d-5f8cc9196c26\") " Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.871084 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/34de9984-0547-4ba1-ae7d-5f8cc9196c26-memcached-tls-certs\") pod \"34de9984-0547-4ba1-ae7d-5f8cc9196c26\" (UID: \"34de9984-0547-4ba1-ae7d-5f8cc9196c26\") " Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.871304 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34de9984-0547-4ba1-ae7d-5f8cc9196c26-combined-ca-bundle\") pod \"34de9984-0547-4ba1-ae7d-5f8cc9196c26\" (UID: \"34de9984-0547-4ba1-ae7d-5f8cc9196c26\") " Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.871466 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/34de9984-0547-4ba1-ae7d-5f8cc9196c26-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "34de9984-0547-4ba1-ae7d-5f8cc9196c26" (UID: "34de9984-0547-4ba1-ae7d-5f8cc9196c26"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.871539 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/34de9984-0547-4ba1-ae7d-5f8cc9196c26-config-data" (OuterVolumeSpecName: "config-data") pod "34de9984-0547-4ba1-ae7d-5f8cc9196c26" (UID: "34de9984-0547-4ba1-ae7d-5f8cc9196c26"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.872519 4903 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bae081ad-c434-4d1b-ae9f-cc8c5b6e2f33-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.872550 4903 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4e1dbd8-6ecd-4cd7-910b-910cf6a45679-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.872562 4903 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.872575 4903 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1dcd96a1-71bb-480c-8387-0fca4d17bf33-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.872586 4903 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/34de9984-0547-4ba1-ae7d-5f8cc9196c26-kolla-config\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.872596 4903 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bae081ad-c434-4d1b-ae9f-cc8c5b6e2f33-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.872606 4903 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/34de9984-0547-4ba1-ae7d-5f8cc9196c26-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.872615 4903 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50b5adcb-aed8-4cff-b3ec-02721df3937d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.877480 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34de9984-0547-4ba1-ae7d-5f8cc9196c26-kube-api-access-sfjw5" (OuterVolumeSpecName: "kube-api-access-sfjw5") pod "34de9984-0547-4ba1-ae7d-5f8cc9196c26" (UID: "34de9984-0547-4ba1-ae7d-5f8cc9196c26"). InnerVolumeSpecName "kube-api-access-sfjw5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.878854 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50b5adcb-aed8-4cff-b3ec-02721df3937d-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "50b5adcb-aed8-4cff-b3ec-02721df3937d" (UID: "50b5adcb-aed8-4cff-b3ec-02721df3937d"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.894398 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4e1dbd8-6ecd-4cd7-910b-910cf6a45679-config-data" (OuterVolumeSpecName: "config-data") pod "f4e1dbd8-6ecd-4cd7-910b-910cf6a45679" (UID: "f4e1dbd8-6ecd-4cd7-910b-910cf6a45679"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.899250 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50b5adcb-aed8-4cff-b3ec-02721df3937d-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "50b5adcb-aed8-4cff-b3ec-02721df3937d" (UID: "50b5adcb-aed8-4cff-b3ec-02721df3937d"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.910330 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50b5adcb-aed8-4cff-b3ec-02721df3937d-config-data" (OuterVolumeSpecName: "config-data") pod "50b5adcb-aed8-4cff-b3ec-02721df3937d" (UID: "50b5adcb-aed8-4cff-b3ec-02721df3937d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.912537 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34de9984-0547-4ba1-ae7d-5f8cc9196c26-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "34de9984-0547-4ba1-ae7d-5f8cc9196c26" (UID: "34de9984-0547-4ba1-ae7d-5f8cc9196c26"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.922011 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1dcd96a1-71bb-480c-8387-0fca4d17bf33-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "1dcd96a1-71bb-480c-8387-0fca4d17bf33" (UID: "1dcd96a1-71bb-480c-8387-0fca4d17bf33"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.930558 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1dcd96a1-71bb-480c-8387-0fca4d17bf33-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1dcd96a1-71bb-480c-8387-0fca4d17bf33" (UID: "1dcd96a1-71bb-480c-8387-0fca4d17bf33"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.931587 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bae081ad-c434-4d1b-ae9f-cc8c5b6e2f33-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "bae081ad-c434-4d1b-ae9f-cc8c5b6e2f33" (UID: "bae081ad-c434-4d1b-ae9f-cc8c5b6e2f33"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.932556 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bae081ad-c434-4d1b-ae9f-cc8c5b6e2f33-config-data" (OuterVolumeSpecName: "config-data") pod "bae081ad-c434-4d1b-ae9f-cc8c5b6e2f33" (UID: "bae081ad-c434-4d1b-ae9f-cc8c5b6e2f33"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.951804 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4e1dbd8-6ecd-4cd7-910b-910cf6a45679-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "f4e1dbd8-6ecd-4cd7-910b-910cf6a45679" (UID: "f4e1dbd8-6ecd-4cd7-910b-910cf6a45679"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.968133 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34de9984-0547-4ba1-ae7d-5f8cc9196c26-memcached-tls-certs" (OuterVolumeSpecName: "memcached-tls-certs") pod "34de9984-0547-4ba1-ae7d-5f8cc9196c26" (UID: "34de9984-0547-4ba1-ae7d-5f8cc9196c26"). InnerVolumeSpecName "memcached-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.976215 4903 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34de9984-0547-4ba1-ae7d-5f8cc9196c26-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.976237 4903 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4e1dbd8-6ecd-4cd7-910b-910cf6a45679-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.976247 4903 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50b5adcb-aed8-4cff-b3ec-02721df3937d-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.976258 4903 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bae081ad-c434-4d1b-ae9f-cc8c5b6e2f33-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.976268 4903 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f4e1dbd8-6ecd-4cd7-910b-910cf6a45679-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.976277 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sfjw5\" (UniqueName: \"kubernetes.io/projected/34de9984-0547-4ba1-ae7d-5f8cc9196c26-kube-api-access-sfjw5\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.976289 4903 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1dcd96a1-71bb-480c-8387-0fca4d17bf33-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.976299 4903 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1dcd96a1-71bb-480c-8387-0fca4d17bf33-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 
08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.976308 4903 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/50b5adcb-aed8-4cff-b3ec-02721df3937d-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.976317 4903 reconciler_common.go:293] "Volume detached for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/34de9984-0547-4ba1-ae7d-5f8cc9196c26-memcached-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.976326 4903 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/50b5adcb-aed8-4cff-b3ec-02721df3937d-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.976336 4903 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bae081ad-c434-4d1b-ae9f-cc8c5b6e2f33-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:44 crc kubenswrapper[4903]: E0320 08:49:44.976425 4903 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Mar 20 08:49:44 crc kubenswrapper[4903]: E0320 08:49:44.976551 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/91bf045e-8b49-48df-b43e-9040bb6b2ca5-operator-scripts podName:91bf045e-8b49-48df-b43e-9040bb6b2ca5 nodeName:}" failed. No retries permitted until 2026-03-20 08:49:45.976522802 +0000 UTC m=+1611.193423107 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/91bf045e-8b49-48df-b43e-9040bb6b2ca5-operator-scripts") pod "root-account-create-update-56bn4" (UID: "91bf045e-8b49-48df-b43e-9040bb6b2ca5") : configmap "openstack-scripts" not found Mar 20 08:49:44 crc kubenswrapper[4903]: I0320 08:49:44.983564 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1dcd96a1-71bb-480c-8387-0fca4d17bf33-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "1dcd96a1-71bb-480c-8387-0fca4d17bf33" (UID: "1dcd96a1-71bb-480c-8387-0fca4d17bf33"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:49:45 crc kubenswrapper[4903]: I0320 08:49:45.022165 4903 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="8005d467-6a20-4e68-b62f-65ad97a31812" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.0.211:3000/\": dial tcp 10.217.0.211:3000: connect: connection refused" Mar 20 08:49:45 crc kubenswrapper[4903]: I0320 08:49:45.078903 4903 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1dcd96a1-71bb-480c-8387-0fca4d17bf33-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:45 crc kubenswrapper[4903]: I0320 08:49:45.083968 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 20 08:49:45 crc kubenswrapper[4903]: I0320 08:49:45.090493 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 20 08:49:45 crc kubenswrapper[4903]: I0320 08:49:45.165225 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 20 08:49:45 crc kubenswrapper[4903]: I0320 08:49:45.181345 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9c2cbf77-9511-47a0-9bc7-dd5eba7d8d8a-scripts\") pod \"9c2cbf77-9511-47a0-9bc7-dd5eba7d8d8a\" (UID: \"9c2cbf77-9511-47a0-9bc7-dd5eba7d8d8a\") " Mar 20 08:49:45 crc kubenswrapper[4903]: I0320 08:49:45.181466 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9c2cbf77-9511-47a0-9bc7-dd5eba7d8d8a-etc-machine-id\") pod \"9c2cbf77-9511-47a0-9bc7-dd5eba7d8d8a\" (UID: \"9c2cbf77-9511-47a0-9bc7-dd5eba7d8d8a\") " Mar 20 08:49:45 crc kubenswrapper[4903]: I0320 08:49:45.181559 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2fdft\" (UniqueName: \"kubernetes.io/projected/dc127483-5a42-4eea-8b8c-8a1382dced05-kube-api-access-2fdft\") pod \"dc127483-5a42-4eea-8b8c-8a1382dced05\" (UID: \"dc127483-5a42-4eea-8b8c-8a1382dced05\") " Mar 20 08:49:45 crc kubenswrapper[4903]: I0320 08:49:45.181610 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9c2cbf77-9511-47a0-9bc7-dd5eba7d8d8a-config-data-custom\") pod \"9c2cbf77-9511-47a0-9bc7-dd5eba7d8d8a\" (UID: \"9c2cbf77-9511-47a0-9bc7-dd5eba7d8d8a\") " Mar 20 08:49:45 crc kubenswrapper[4903]: I0320 08:49:45.181653 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc127483-5a42-4eea-8b8c-8a1382dced05-combined-ca-bundle\") pod \"dc127483-5a42-4eea-8b8c-8a1382dced05\" (UID: \"dc127483-5a42-4eea-8b8c-8a1382dced05\") " Mar 20 08:49:45 crc kubenswrapper[4903]: I0320 08:49:45.181730 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c2cbf77-9511-47a0-9bc7-dd5eba7d8d8a-combined-ca-bundle\") pod \"9c2cbf77-9511-47a0-9bc7-dd5eba7d8d8a\" (UID: \"9c2cbf77-9511-47a0-9bc7-dd5eba7d8d8a\") " Mar 20 08:49:45 crc kubenswrapper[4903]: I0320 08:49:45.181794 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9c2cbf77-9511-47a0-9bc7-dd5eba7d8d8a-config-data\") pod \"9c2cbf77-9511-47a0-9bc7-dd5eba7d8d8a\" (UID: \"9c2cbf77-9511-47a0-9bc7-dd5eba7d8d8a\") " Mar 20 08:49:45 crc kubenswrapper[4903]: I0320 08:49:45.181841 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc127483-5a42-4eea-8b8c-8a1382dced05-config-data\") pod \"dc127483-5a42-4eea-8b8c-8a1382dced05\" (UID: \"dc127483-5a42-4eea-8b8c-8a1382dced05\") " Mar 20 08:49:45 crc kubenswrapper[4903]: I0320 08:49:45.181888 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9l2hb\" (UniqueName: \"kubernetes.io/projected/5e072c5e-0f44-4d24-bccc-b14bf61fa192-kube-api-access-9l2hb\") pod \"5e072c5e-0f44-4d24-bccc-b14bf61fa192\" (UID: \"5e072c5e-0f44-4d24-bccc-b14bf61fa192\") " Mar 20 08:49:45 crc kubenswrapper[4903]: I0320 08:49:45.181941 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e072c5e-0f44-4d24-bccc-b14bf61fa192-combined-ca-bundle\") pod 
\"5e072c5e-0f44-4d24-bccc-b14bf61fa192\" (UID: \"5e072c5e-0f44-4d24-bccc-b14bf61fa192\") " Mar 20 08:49:45 crc kubenswrapper[4903]: I0320 08:49:45.182026 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e072c5e-0f44-4d24-bccc-b14bf61fa192-config-data\") pod \"5e072c5e-0f44-4d24-bccc-b14bf61fa192\" (UID: \"5e072c5e-0f44-4d24-bccc-b14bf61fa192\") " Mar 20 08:49:45 crc kubenswrapper[4903]: I0320 08:49:45.182124 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jcqnr\" (UniqueName: \"kubernetes.io/projected/9c2cbf77-9511-47a0-9bc7-dd5eba7d8d8a-kube-api-access-jcqnr\") pod \"9c2cbf77-9511-47a0-9bc7-dd5eba7d8d8a\" (UID: \"9c2cbf77-9511-47a0-9bc7-dd5eba7d8d8a\") " Mar 20 08:49:45 crc kubenswrapper[4903]: I0320 08:49:45.182254 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9c2cbf77-9511-47a0-9bc7-dd5eba7d8d8a-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "9c2cbf77-9511-47a0-9bc7-dd5eba7d8d8a" (UID: "9c2cbf77-9511-47a0-9bc7-dd5eba7d8d8a"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 08:49:45 crc kubenswrapper[4903]: I0320 08:49:45.182669 4903 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9c2cbf77-9511-47a0-9bc7-dd5eba7d8d8a-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:45 crc kubenswrapper[4903]: I0320 08:49:45.198731 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e072c5e-0f44-4d24-bccc-b14bf61fa192-kube-api-access-9l2hb" (OuterVolumeSpecName: "kube-api-access-9l2hb") pod "5e072c5e-0f44-4d24-bccc-b14bf61fa192" (UID: "5e072c5e-0f44-4d24-bccc-b14bf61fa192"). InnerVolumeSpecName "kube-api-access-9l2hb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:49:45 crc kubenswrapper[4903]: I0320 08:49:45.204577 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c2cbf77-9511-47a0-9bc7-dd5eba7d8d8a-scripts" (OuterVolumeSpecName: "scripts") pod "9c2cbf77-9511-47a0-9bc7-dd5eba7d8d8a" (UID: "9c2cbf77-9511-47a0-9bc7-dd5eba7d8d8a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:49:45 crc kubenswrapper[4903]: I0320 08:49:45.209414 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc127483-5a42-4eea-8b8c-8a1382dced05-kube-api-access-2fdft" (OuterVolumeSpecName: "kube-api-access-2fdft") pod "dc127483-5a42-4eea-8b8c-8a1382dced05" (UID: "dc127483-5a42-4eea-8b8c-8a1382dced05"). InnerVolumeSpecName "kube-api-access-2fdft". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:49:45 crc kubenswrapper[4903]: I0320 08:49:45.210776 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c2cbf77-9511-47a0-9bc7-dd5eba7d8d8a-kube-api-access-jcqnr" (OuterVolumeSpecName: "kube-api-access-jcqnr") pod "9c2cbf77-9511-47a0-9bc7-dd5eba7d8d8a" (UID: "9c2cbf77-9511-47a0-9bc7-dd5eba7d8d8a"). InnerVolumeSpecName "kube-api-access-jcqnr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:49:45 crc kubenswrapper[4903]: I0320 08:49:45.218075 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c2cbf77-9511-47a0-9bc7-dd5eba7d8d8a-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "9c2cbf77-9511-47a0-9bc7-dd5eba7d8d8a" (UID: "9c2cbf77-9511-47a0-9bc7-dd5eba7d8d8a"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:49:45 crc kubenswrapper[4903]: I0320 08:49:45.235881 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e072c5e-0f44-4d24-bccc-b14bf61fa192-config-data" (OuterVolumeSpecName: "config-data") pod "5e072c5e-0f44-4d24-bccc-b14bf61fa192" (UID: "5e072c5e-0f44-4d24-bccc-b14bf61fa192"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:49:45 crc kubenswrapper[4903]: I0320 08:49:45.250072 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc127483-5a42-4eea-8b8c-8a1382dced05-config-data" (OuterVolumeSpecName: "config-data") pod "dc127483-5a42-4eea-8b8c-8a1382dced05" (UID: "dc127483-5a42-4eea-8b8c-8a1382dced05"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:49:45 crc kubenswrapper[4903]: I0320 08:49:45.252474 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc127483-5a42-4eea-8b8c-8a1382dced05-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dc127483-5a42-4eea-8b8c-8a1382dced05" (UID: "dc127483-5a42-4eea-8b8c-8a1382dced05"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:49:45 crc kubenswrapper[4903]: I0320 08:49:45.277070 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e072c5e-0f44-4d24-bccc-b14bf61fa192-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5e072c5e-0f44-4d24-bccc-b14bf61fa192" (UID: "5e072c5e-0f44-4d24-bccc-b14bf61fa192"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:49:45 crc kubenswrapper[4903]: I0320 08:49:45.285673 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xm42s\" (UniqueName: \"kubernetes.io/projected/69bc9139-cd82-4fa2-847f-b831c080d163-kube-api-access-xm42s\") pod \"keystone-7068-account-create-update-hqkgt\" (UID: \"69bc9139-cd82-4fa2-847f-b831c080d163\") " pod="openstack/keystone-7068-account-create-update-hqkgt" Mar 20 08:49:45 crc kubenswrapper[4903]: I0320 08:49:45.285727 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/69bc9139-cd82-4fa2-847f-b831c080d163-operator-scripts\") pod \"keystone-7068-account-create-update-hqkgt\" (UID: \"69bc9139-cd82-4fa2-847f-b831c080d163\") " pod="openstack/keystone-7068-account-create-update-hqkgt" Mar 20 08:49:45 crc kubenswrapper[4903]: I0320 08:49:45.285834 4903 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc127483-5a42-4eea-8b8c-8a1382dced05-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:45 crc kubenswrapper[4903]: I0320 08:49:45.286910 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9l2hb\" (UniqueName: \"kubernetes.io/projected/5e072c5e-0f44-4d24-bccc-b14bf61fa192-kube-api-access-9l2hb\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:45 crc kubenswrapper[4903]: E0320 08:49:45.288244 4903 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Mar 20 08:49:45 crc kubenswrapper[4903]: E0320 08:49:45.288362 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/69bc9139-cd82-4fa2-847f-b831c080d163-operator-scripts podName:69bc9139-cd82-4fa2-847f-b831c080d163 nodeName:}" failed. No retries permitted until 2026-03-20 08:49:47.288334177 +0000 UTC m=+1612.505234482 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/69bc9139-cd82-4fa2-847f-b831c080d163-operator-scripts") pod "keystone-7068-account-create-update-hqkgt" (UID: "69bc9139-cd82-4fa2-847f-b831c080d163") : configmap "openstack-scripts" not found Mar 20 08:49:45 crc kubenswrapper[4903]: I0320 08:49:45.288684 4903 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e072c5e-0f44-4d24-bccc-b14bf61fa192-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:45 crc kubenswrapper[4903]: E0320 08:49:45.288782 4903 projected.go:288] Couldn't get configMap openstack/swift-storage-config-data: configmap "swift-storage-config-data" not found Mar 20 08:49:45 crc kubenswrapper[4903]: E0320 08:49:45.288797 4903 projected.go:263] Couldn't get secret openstack/swift-conf: secret "swift-conf" not found Mar 20 08:49:45 crc kubenswrapper[4903]: E0320 08:49:45.288809 4903 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 20 08:49:45 crc kubenswrapper[4903]: E0320 08:49:45.288821 4903 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: [configmap "swift-storage-config-data" not found, secret "swift-conf" not found, configmap "swift-ring-files" not found] Mar 20 08:49:45 crc kubenswrapper[4903]: I0320 08:49:45.288834 4903 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e072c5e-0f44-4d24-bccc-b14bf61fa192-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:45 crc kubenswrapper[4903]: E0320 08:49:45.288869 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ccedd84e-d0d0-40b8-812c-3a57b41aee98-etc-swift podName:ccedd84e-d0d0-40b8-812c-3a57b41aee98 nodeName:}" failed. No retries permitted until 2026-03-20 08:49:53.288858899 +0000 UTC m=+1618.505759394 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/ccedd84e-d0d0-40b8-812c-3a57b41aee98-etc-swift") pod "swift-storage-0" (UID: "ccedd84e-d0d0-40b8-812c-3a57b41aee98") : [configmap "swift-storage-config-data" not found, secret "swift-conf" not found, configmap "swift-ring-files" not found] Mar 20 08:49:45 crc kubenswrapper[4903]: I0320 08:49:45.288894 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jcqnr\" (UniqueName: \"kubernetes.io/projected/9c2cbf77-9511-47a0-9bc7-dd5eba7d8d8a-kube-api-access-jcqnr\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:45 crc kubenswrapper[4903]: I0320 08:49:45.288907 4903 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9c2cbf77-9511-47a0-9bc7-dd5eba7d8d8a-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:45 crc kubenswrapper[4903]: I0320 08:49:45.288919 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2fdft\" (UniqueName: \"kubernetes.io/projected/dc127483-5a42-4eea-8b8c-8a1382dced05-kube-api-access-2fdft\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:45 crc kubenswrapper[4903]: I0320 08:49:45.288929 4903 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9c2cbf77-9511-47a0-9bc7-dd5eba7d8d8a-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:45 crc kubenswrapper[4903]: I0320 08:49:45.288939 4903 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc127483-5a42-4eea-8b8c-8a1382dced05-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:45 crc kubenswrapper[4903]: E0320 08:49:45.293334 4903 projected.go:194] Error preparing data for projected volume kube-api-access-xm42s for pod openstack/keystone-7068-account-create-update-hqkgt: failed to fetch token: serviceaccounts "galera-openstack" not found Mar 20 08:49:45 crc kubenswrapper[4903]: E0320 08:49:45.293436 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/69bc9139-cd82-4fa2-847f-b831c080d163-kube-api-access-xm42s podName:69bc9139-cd82-4fa2-847f-b831c080d163 nodeName:}" failed. No retries permitted until 2026-03-20 08:49:47.293409021 +0000 UTC m=+1612.510309336 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-xm42s" (UniqueName: "kubernetes.io/projected/69bc9139-cd82-4fa2-847f-b831c080d163-kube-api-access-xm42s") pod "keystone-7068-account-create-update-hqkgt" (UID: "69bc9139-cd82-4fa2-847f-b831c080d163") : failed to fetch token: serviceaccounts "galera-openstack" not found Mar 20 08:49:45 crc kubenswrapper[4903]: I0320 08:49:45.339733 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c2cbf77-9511-47a0-9bc7-dd5eba7d8d8a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9c2cbf77-9511-47a0-9bc7-dd5eba7d8d8a" (UID: "9c2cbf77-9511-47a0-9bc7-dd5eba7d8d8a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:49:45 crc kubenswrapper[4903]: I0320 08:49:45.342236 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c2cbf77-9511-47a0-9bc7-dd5eba7d8d8a-config-data" (OuterVolumeSpecName: "config-data") pod "9c2cbf77-9511-47a0-9bc7-dd5eba7d8d8a" (UID: "9c2cbf77-9511-47a0-9bc7-dd5eba7d8d8a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:49:45 crc kubenswrapper[4903]: I0320 08:49:45.372907 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_06e43a58-7c5f-49ac-a1f4-f2eddfd28c6b/ovn-northd/0.log" Mar 20 08:49:45 crc kubenswrapper[4903]: I0320 08:49:45.372972 4903 generic.go:334] "Generic (PLEG): container finished" podID="06e43a58-7c5f-49ac-a1f4-f2eddfd28c6b" containerID="5704e6c6ea3db74005a8e1d1aeb869f6812338abc7af8b7e741fc45d5338477c" exitCode=139 Mar 20 08:49:45 crc kubenswrapper[4903]: I0320 08:49:45.373066 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"06e43a58-7c5f-49ac-a1f4-f2eddfd28c6b","Type":"ContainerDied","Data":"5704e6c6ea3db74005a8e1d1aeb869f6812338abc7af8b7e741fc45d5338477c"} Mar 20 08:49:45 crc kubenswrapper[4903]: I0320 08:49:45.382226 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 20 08:49:45 crc kubenswrapper[4903]: I0320 08:49:45.384635 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"f4e1dbd8-6ecd-4cd7-910b-910cf6a45679","Type":"ContainerDied","Data":"e8f4896be311871f3ac0108063c58ec2c400c0620e5af12b5e7ff951a80e4061"} Mar 20 08:49:45 crc kubenswrapper[4903]: I0320 08:49:45.384728 4903 scope.go:117] "RemoveContainer" containerID="8c4db124274a0f1bc8c0540b96509dad9dbb16ca409f3c709b225f29f322a39d" Mar 20 08:49:45 crc kubenswrapper[4903]: I0320 08:49:45.390300 4903 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c2cbf77-9511-47a0-9bc7-dd5eba7d8d8a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:45 crc kubenswrapper[4903]: I0320 08:49:45.390339 4903 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9c2cbf77-9511-47a0-9bc7-dd5eba7d8d8a-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:45 crc kubenswrapper[4903]: I0320 08:49:45.416431 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5895dcfdfd-4gs9b" event={"ID":"bae081ad-c434-4d1b-ae9f-cc8c5b6e2f33","Type":"ContainerDied","Data":"d4cdfb61d9e1894db3ded5c41d5370a86645d505f69eb543c4925c75d51df735"} Mar 20 08:49:45 crc kubenswrapper[4903]: I0320 08:49:45.416635 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-5895dcfdfd-4gs9b" Mar 20 08:49:45 crc kubenswrapper[4903]: I0320 08:49:45.428740 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 08:49:45 crc kubenswrapper[4903]: I0320 08:49:45.428800 4903 scope.go:117] "RemoveContainer" containerID="e9d3a5c1b4f80a807d17559887982f63b2f87da65ce30784f6d98d67b9f43363" Mar 20 08:49:45 crc kubenswrapper[4903]: I0320 08:49:45.434824 4903 generic.go:334] "Generic (PLEG): container finished" podID="dc127483-5a42-4eea-8b8c-8a1382dced05" containerID="086baa0efbede9405a9d07461836e115d7cb27a7069334c37e8698d90f636ed7" exitCode=0 Mar 20 08:49:45 crc kubenswrapper[4903]: I0320 08:49:45.434892 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"dc127483-5a42-4eea-8b8c-8a1382dced05","Type":"ContainerDied","Data":"086baa0efbede9405a9d07461836e115d7cb27a7069334c37e8698d90f636ed7"} Mar 20 08:49:45 crc kubenswrapper[4903]: I0320 08:49:45.434917 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"dc127483-5a42-4eea-8b8c-8a1382dced05","Type":"ContainerDied","Data":"f20195f10697104fa9e427b643b441c1dac8cbabe31b792bac4d224fd500ede0"} Mar 20 08:49:45 crc kubenswrapper[4903]: I0320 08:49:45.434954 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 20 08:49:45 crc kubenswrapper[4903]: I0320 08:49:45.442438 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"9c2cbf77-9511-47a0-9bc7-dd5eba7d8d8a","Type":"ContainerDied","Data":"63c548ca09d2a2f21c2d5520463bb11e088324de770769e3a18e10fd91b04979"} Mar 20 08:49:45 crc kubenswrapper[4903]: I0320 08:49:45.442663 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 20 08:49:45 crc kubenswrapper[4903]: I0320 08:49:45.455001 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"34de9984-0547-4ba1-ae7d-5f8cc9196c26","Type":"ContainerDied","Data":"2c4393d7e307ba72b53c3157d0aaef61008e16bbb978c185a6b2752dec453422"} Mar 20 08:49:45 crc kubenswrapper[4903]: I0320 08:49:45.455166 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Mar 20 08:49:45 crc kubenswrapper[4903]: I0320 08:49:45.457366 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 20 08:49:45 crc kubenswrapper[4903]: I0320 08:49:45.479280 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 08:49:45 crc kubenswrapper[4903]: I0320 08:49:45.480754 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"5e072c5e-0f44-4d24-bccc-b14bf61fa192","Type":"ContainerDied","Data":"6a80a96d7215a6220aaaf7ae272d2f3f0136c00db333d81e2c40d6243b21fbee"} Mar 20 08:49:45 crc kubenswrapper[4903]: I0320 08:49:45.480796 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 20 08:49:45 crc kubenswrapper[4903]: I0320 08:49:45.489891 4903 scope.go:117] "RemoveContainer" containerID="399861c0aed674ac17caf4a76e74022608a67747c7b5e6718fd6d0bf4376c5d8" Mar 20 08:49:45 crc kubenswrapper[4903]: I0320 08:49:45.505107 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-7068-account-create-update-hqkgt" Mar 20 08:49:45 crc kubenswrapper[4903]: I0320 08:49:45.505295 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-669bcbb856-w87fq" Mar 20 08:49:45 crc kubenswrapper[4903]: I0320 08:49:45.505977 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="032ee1ce-8cf1-4acd-a741-dc32f104065f" path="/var/lib/kubelet/pods/032ee1ce-8cf1-4acd-a741-dc32f104065f/volumes" Mar 20 08:49:45 crc kubenswrapper[4903]: I0320 08:49:45.506576 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 20 08:49:45 crc kubenswrapper[4903]: I0320 08:49:45.506683 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2c33b2cd-e705-41cd-9e59-3dcbb0a55829" path="/var/lib/kubelet/pods/2c33b2cd-e705-41cd-9e59-3dcbb0a55829/volumes" Mar 20 08:49:45 crc kubenswrapper[4903]: I0320 08:49:45.507391 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2dc6a13a-9844-4e28-93e6-45025f1385a9" path="/var/lib/kubelet/pods/2dc6a13a-9844-4e28-93e6-45025f1385a9/volumes" Mar 20 08:49:45 crc kubenswrapper[4903]: I0320 08:49:45.507798 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="54015026-e605-4781-a02e-57b47c61284e" path="/var/lib/kubelet/pods/54015026-e605-4781-a02e-57b47c61284e/volumes" Mar 20 08:49:45 crc kubenswrapper[4903]: I0320 08:49:45.509874 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bbbd0a7-f915-4197-bde8-4f96590c454f" path="/var/lib/kubelet/pods/7bbbd0a7-f915-4197-bde8-4f96590c454f/volumes" Mar 20 08:49:45 crc kubenswrapper[4903]: I0320 08:49:45.510586 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7f5f160c-29e2-43d0-bb55-6969904b3a4e" path="/var/lib/kubelet/pods/7f5f160c-29e2-43d0-bb55-6969904b3a4e/volumes" Mar 20 08:49:45 crc kubenswrapper[4903]: I0320 08:49:45.511971 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="877f943b-808c-435e-a5cf-bda8ea0a5d15" path="/var/lib/kubelet/pods/877f943b-808c-435e-a5cf-bda8ea0a5d15/volumes" Mar 20 08:49:45 crc kubenswrapper[4903]: I0320 08:49:45.512685 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4e1dbd8-6ecd-4cd7-910b-910cf6a45679" path="/var/lib/kubelet/pods/f4e1dbd8-6ecd-4cd7-910b-910cf6a45679/volumes" Mar 20 08:49:45 crc kubenswrapper[4903]: I0320 08:49:45.513302 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f8c76743-cd0d-48d8-940e-a5e750bd1fcc" path="/var/lib/kubelet/pods/f8c76743-cd0d-48d8-940e-a5e750bd1fcc/volumes" Mar 20 08:49:45 crc kubenswrapper[4903]: I0320 08:49:45.513701 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="feeda47f-bf82-4e99-a704-b405a817bb1d" path="/var/lib/kubelet/pods/feeda47f-bf82-4e99-a704-b405a817bb1d/volumes" Mar 20 08:49:45 crc kubenswrapper[4903]: I0320 08:49:45.518248 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"50b5adcb-aed8-4cff-b3ec-02721df3937d","Type":"ContainerDied","Data":"8b2af6fd3e4c816e215b1b167a8260cf0e8e49daa50f275b197c42f365dee587"} Mar 20 08:49:45 crc kubenswrapper[4903]: I0320 08:49:45.518303 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-5895dcfdfd-4gs9b"] Mar 20 08:49:45 crc kubenswrapper[4903]: I0320 08:49:45.535685 4903 scope.go:117] "RemoveContainer" 
containerID="93597afa34681cad8c7e33fa9d9d2b8edff2db1b4f63723973a392f7d94f6d4d" Mar 20 08:49:45 crc kubenswrapper[4903]: I0320 08:49:45.557387 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-5895dcfdfd-4gs9b"] Mar 20 08:49:45 crc kubenswrapper[4903]: I0320 08:49:45.582141 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 20 08:49:45 crc kubenswrapper[4903]: I0320 08:49:45.591798 4903 scope.go:117] "RemoveContainer" containerID="086baa0efbede9405a9d07461836e115d7cb27a7069334c37e8698d90f636ed7" Mar 20 08:49:45 crc kubenswrapper[4903]: I0320 08:49:45.609622 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 20 08:49:45 crc kubenswrapper[4903]: I0320 08:49:45.628586 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/memcached-0"] Mar 20 08:49:45 crc kubenswrapper[4903]: I0320 08:49:45.635433 4903 scope.go:117] "RemoveContainer" containerID="086baa0efbede9405a9d07461836e115d7cb27a7069334c37e8698d90f636ed7" Mar 20 08:49:45 crc kubenswrapper[4903]: E0320 08:49:45.635975 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"086baa0efbede9405a9d07461836e115d7cb27a7069334c37e8698d90f636ed7\": container with ID starting with 086baa0efbede9405a9d07461836e115d7cb27a7069334c37e8698d90f636ed7 not found: ID does not exist" containerID="086baa0efbede9405a9d07461836e115d7cb27a7069334c37e8698d90f636ed7" Mar 20 08:49:45 crc kubenswrapper[4903]: I0320 08:49:45.636061 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"086baa0efbede9405a9d07461836e115d7cb27a7069334c37e8698d90f636ed7"} err="failed to get container status \"086baa0efbede9405a9d07461836e115d7cb27a7069334c37e8698d90f636ed7\": rpc error: code = NotFound desc = could not find container \"086baa0efbede9405a9d07461836e115d7cb27a7069334c37e8698d90f636ed7\": container with ID starting with 086baa0efbede9405a9d07461836e115d7cb27a7069334c37e8698d90f636ed7 not found: ID does not exist" Mar 20 08:49:45 crc kubenswrapper[4903]: I0320 08:49:45.636107 4903 scope.go:117] "RemoveContainer" containerID="4cb95c6e5180c8a94f8474533f104c8ca5edf99bbf830ed9f71c73b944a44ab1" Mar 20 08:49:45 crc kubenswrapper[4903]: I0320 08:49:45.654145 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/memcached-0"] Mar 20 08:49:45 crc kubenswrapper[4903]: I0320 08:49:45.671798 4903 scope.go:117] "RemoveContainer" containerID="cb7bb133830dc4bdac2cbea0a778366365297bc4c7bc623ae8334776f5496711" Mar 20 08:49:45 crc kubenswrapper[4903]: I0320 08:49:45.671989 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 08:49:45 crc kubenswrapper[4903]: I0320 08:49:45.682979 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 08:49:45 crc kubenswrapper[4903]: I0320 08:49:45.700992 4903 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="888a3fd9-01f8-47b3-b1bb-f2b8b6b96509" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.102:5671: connect: connection refused" Mar 20 08:49:45 crc kubenswrapper[4903]: I0320 08:49:45.718328 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 20 08:49:45 crc kubenswrapper[4903]: E0320 08:49:45.725662 4903 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = 
command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="bdd5b2050c318bfa21071aa9a58547dc85552f9ed34b3d557a8244e9e4292bce" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 20 08:49:45 crc kubenswrapper[4903]: I0320 08:49:45.726011 4903 scope.go:117] "RemoveContainer" containerID="b0e941c6eb837ac5cb4a6c3fdea1d2d35578b2c2d7f9749cd5b2c8e0e7710ff8" Mar 20 08:49:45 crc kubenswrapper[4903]: I0320 08:49:45.726160 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 20 08:49:45 crc kubenswrapper[4903]: E0320 08:49:45.728402 4903 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="bdd5b2050c318bfa21071aa9a58547dc85552f9ed34b3d557a8244e9e4292bce" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 20 08:49:45 crc kubenswrapper[4903]: E0320 08:49:45.733861 4903 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="bdd5b2050c318bfa21071aa9a58547dc85552f9ed34b3d557a8244e9e4292bce" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 20 08:49:45 crc kubenswrapper[4903]: E0320 08:49:45.733910 4903 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="d570ab6f-6c5f-4255-b2ae-1966da262a0d" containerName="nova-cell0-conductor-conductor" Mar 20 08:49:45 crc kubenswrapper[4903]: I0320 08:49:45.741321 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-7068-account-create-update-hqkgt"] Mar 20 08:49:45 crc kubenswrapper[4903]: I0320 08:49:45.795177 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-7068-account-create-update-hqkgt"] Mar 20 08:49:45 crc kubenswrapper[4903]: I0320 08:49:45.802607 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_06e43a58-7c5f-49ac-a1f4-f2eddfd28c6b/ovn-northd/0.log" Mar 20 08:49:45 crc kubenswrapper[4903]: I0320 08:49:45.802784 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Mar 20 08:49:45 crc kubenswrapper[4903]: I0320 08:49:45.832159 4903 scope.go:117] "RemoveContainer" containerID="9148a8a0458280fb77a8371f10cf7fabff0ae90ccd681be1905e0e67a3249152" Mar 20 08:49:45 crc kubenswrapper[4903]: I0320 08:49:45.921732 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06e43a58-7c5f-49ac-a1f4-f2eddfd28c6b-combined-ca-bundle\") pod \"06e43a58-7c5f-49ac-a1f4-f2eddfd28c6b\" (UID: \"06e43a58-7c5f-49ac-a1f4-f2eddfd28c6b\") " Mar 20 08:49:45 crc kubenswrapper[4903]: I0320 08:49:45.921799 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zg4vt\" (UniqueName: \"kubernetes.io/projected/06e43a58-7c5f-49ac-a1f4-f2eddfd28c6b-kube-api-access-zg4vt\") pod \"06e43a58-7c5f-49ac-a1f4-f2eddfd28c6b\" (UID: \"06e43a58-7c5f-49ac-a1f4-f2eddfd28c6b\") " Mar 20 08:49:45 crc kubenswrapper[4903]: I0320 08:49:45.921944 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/06e43a58-7c5f-49ac-a1f4-f2eddfd28c6b-config\") pod \"06e43a58-7c5f-49ac-a1f4-f2eddfd28c6b\" (UID: \"06e43a58-7c5f-49ac-a1f4-f2eddfd28c6b\") " Mar 20 08:49:45 crc kubenswrapper[4903]: I0320 08:49:45.921965 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/06e43a58-7c5f-49ac-a1f4-f2eddfd28c6b-ovn-northd-tls-certs\") pod \"06e43a58-7c5f-49ac-a1f4-f2eddfd28c6b\" (UID: \"06e43a58-7c5f-49ac-a1f4-f2eddfd28c6b\") " Mar 20 08:49:45 crc kubenswrapper[4903]: I0320 08:49:45.921989 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/06e43a58-7c5f-49ac-a1f4-f2eddfd28c6b-metrics-certs-tls-certs\") pod \"06e43a58-7c5f-49ac-a1f4-f2eddfd28c6b\" (UID: \"06e43a58-7c5f-49ac-a1f4-f2eddfd28c6b\") " Mar 20 08:49:45 crc kubenswrapper[4903]: I0320 08:49:45.922013 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/06e43a58-7c5f-49ac-a1f4-f2eddfd28c6b-ovn-rundir\") pod \"06e43a58-7c5f-49ac-a1f4-f2eddfd28c6b\" (UID: \"06e43a58-7c5f-49ac-a1f4-f2eddfd28c6b\") " Mar 20 08:49:45 crc kubenswrapper[4903]: I0320 08:49:45.922067 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/06e43a58-7c5f-49ac-a1f4-f2eddfd28c6b-scripts\") pod \"06e43a58-7c5f-49ac-a1f4-f2eddfd28c6b\" (UID: \"06e43a58-7c5f-49ac-a1f4-f2eddfd28c6b\") " Mar 20 08:49:45 crc kubenswrapper[4903]: I0320 08:49:45.922467 4903 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/69bc9139-cd82-4fa2-847f-b831c080d163-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:45 crc kubenswrapper[4903]: I0320 08:49:45.922485 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xm42s\" (UniqueName: \"kubernetes.io/projected/69bc9139-cd82-4fa2-847f-b831c080d163-kube-api-access-xm42s\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:45 crc kubenswrapper[4903]: I0320 08:49:45.923751 4903 scope.go:117] "RemoveContainer" containerID="dc39193e0b3efc7d58828ef8c691abe141cb7d978ac576bd15dc6776a511b5fe" Mar 20 08:49:45 crc kubenswrapper[4903]: I0320 08:49:45.923947 4903 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openstack/placement-669bcbb856-w87fq"] Mar 20 08:49:45 crc kubenswrapper[4903]: I0320 08:49:45.935991 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/06e43a58-7c5f-49ac-a1f4-f2eddfd28c6b-ovn-rundir" (OuterVolumeSpecName: "ovn-rundir") pod "06e43a58-7c5f-49ac-a1f4-f2eddfd28c6b" (UID: "06e43a58-7c5f-49ac-a1f4-f2eddfd28c6b"). InnerVolumeSpecName "ovn-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 08:49:45 crc kubenswrapper[4903]: I0320 08:49:45.936196 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-669bcbb856-w87fq"] Mar 20 08:49:45 crc kubenswrapper[4903]: I0320 08:49:45.938900 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/06e43a58-7c5f-49ac-a1f4-f2eddfd28c6b-config" (OuterVolumeSpecName: "config") pod "06e43a58-7c5f-49ac-a1f4-f2eddfd28c6b" (UID: "06e43a58-7c5f-49ac-a1f4-f2eddfd28c6b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:49:45 crc kubenswrapper[4903]: I0320 08:49:45.940607 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/06e43a58-7c5f-49ac-a1f4-f2eddfd28c6b-scripts" (OuterVolumeSpecName: "scripts") pod "06e43a58-7c5f-49ac-a1f4-f2eddfd28c6b" (UID: "06e43a58-7c5f-49ac-a1f4-f2eddfd28c6b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:49:45 crc kubenswrapper[4903]: I0320 08:49:45.951982 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/06e43a58-7c5f-49ac-a1f4-f2eddfd28c6b-kube-api-access-zg4vt" (OuterVolumeSpecName: "kube-api-access-zg4vt") pod "06e43a58-7c5f-49ac-a1f4-f2eddfd28c6b" (UID: "06e43a58-7c5f-49ac-a1f4-f2eddfd28c6b"). InnerVolumeSpecName "kube-api-access-zg4vt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:49:45 crc kubenswrapper[4903]: I0320 08:49:45.955297 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 20 08:49:45 crc kubenswrapper[4903]: I0320 08:49:45.971517 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 20 08:49:46 crc kubenswrapper[4903]: I0320 08:49:46.003585 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 20 08:49:46 crc kubenswrapper[4903]: I0320 08:49:46.004297 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06e43a58-7c5f-49ac-a1f4-f2eddfd28c6b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "06e43a58-7c5f-49ac-a1f4-f2eddfd28c6b" (UID: "06e43a58-7c5f-49ac-a1f4-f2eddfd28c6b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:49:46 crc kubenswrapper[4903]: I0320 08:49:46.015958 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Mar 20 08:49:46 crc kubenswrapper[4903]: I0320 08:49:46.024034 4903 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/06e43a58-7c5f-49ac-a1f4-f2eddfd28c6b-config\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:46 crc kubenswrapper[4903]: I0320 08:49:46.024080 4903 reconciler_common.go:293] "Volume detached for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/06e43a58-7c5f-49ac-a1f4-f2eddfd28c6b-ovn-rundir\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:46 crc kubenswrapper[4903]: I0320 08:49:46.024091 4903 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/06e43a58-7c5f-49ac-a1f4-f2eddfd28c6b-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:46 crc kubenswrapper[4903]: I0320 08:49:46.024100 4903 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06e43a58-7c5f-49ac-a1f4-f2eddfd28c6b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:46 crc kubenswrapper[4903]: I0320 08:49:46.024115 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zg4vt\" (UniqueName: \"kubernetes.io/projected/06e43a58-7c5f-49ac-a1f4-f2eddfd28c6b-kube-api-access-zg4vt\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:46 crc kubenswrapper[4903]: E0320 08:49:46.024184 4903 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Mar 20 08:49:46 crc kubenswrapper[4903]: E0320 08:49:46.024220 4903 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Mar 20 08:49:46 crc kubenswrapper[4903]: E0320 08:49:46.024257 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/df937948-08c4-447c-9450-07221ce76552-config-data podName:df937948-08c4-447c-9450-07221ce76552 nodeName:}" failed. No retries permitted until 2026-03-20 08:49:54.024239197 +0000 UTC m=+1619.241139512 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/df937948-08c4-447c-9450-07221ce76552-config-data") pod "rabbitmq-cell1-server-0" (UID: "df937948-08c4-447c-9450-07221ce76552") : configmap "rabbitmq-cell1-config-data" not found Mar 20 08:49:46 crc kubenswrapper[4903]: E0320 08:49:46.024276 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/91bf045e-8b49-48df-b43e-9040bb6b2ca5-operator-scripts podName:91bf045e-8b49-48df-b43e-9040bb6b2ca5 nodeName:}" failed. No retries permitted until 2026-03-20 08:49:48.024269478 +0000 UTC m=+1613.241169793 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/91bf045e-8b49-48df-b43e-9040bb6b2ca5-operator-scripts") pod "root-account-create-update-56bn4" (UID: "91bf045e-8b49-48df-b43e-9040bb6b2ca5") : configmap "openstack-scripts" not found Mar 20 08:49:46 crc kubenswrapper[4903]: I0320 08:49:46.080934 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06e43a58-7c5f-49ac-a1f4-f2eddfd28c6b-ovn-northd-tls-certs" (OuterVolumeSpecName: "ovn-northd-tls-certs") pod "06e43a58-7c5f-49ac-a1f4-f2eddfd28c6b" (UID: "06e43a58-7c5f-49ac-a1f4-f2eddfd28c6b"). InnerVolumeSpecName "ovn-northd-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:49:46 crc kubenswrapper[4903]: I0320 08:49:46.085390 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06e43a58-7c5f-49ac-a1f4-f2eddfd28c6b-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "06e43a58-7c5f-49ac-a1f4-f2eddfd28c6b" (UID: "06e43a58-7c5f-49ac-a1f4-f2eddfd28c6b"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:49:46 crc kubenswrapper[4903]: I0320 08:49:46.125617 4903 reconciler_common.go:293] "Volume detached for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/06e43a58-7c5f-49ac-a1f4-f2eddfd28c6b-ovn-northd-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:46 crc kubenswrapper[4903]: I0320 08:49:46.125984 4903 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/06e43a58-7c5f-49ac-a1f4-f2eddfd28c6b-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:46 crc kubenswrapper[4903]: I0320 08:49:46.225008 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-56bn4" Mar 20 08:49:46 crc kubenswrapper[4903]: I0320 08:49:46.267208 4903 scope.go:117] "RemoveContainer" containerID="bbc28588129fed5e832d9cf2c208bd4c746332410777ee79ad509494e640c235" Mar 20 08:49:46 crc kubenswrapper[4903]: I0320 08:49:46.330264 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/91bf045e-8b49-48df-b43e-9040bb6b2ca5-operator-scripts\") pod \"91bf045e-8b49-48df-b43e-9040bb6b2ca5\" (UID: \"91bf045e-8b49-48df-b43e-9040bb6b2ca5\") " Mar 20 08:49:46 crc kubenswrapper[4903]: I0320 08:49:46.330387 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9t2th\" (UniqueName: \"kubernetes.io/projected/91bf045e-8b49-48df-b43e-9040bb6b2ca5-kube-api-access-9t2th\") pod \"91bf045e-8b49-48df-b43e-9040bb6b2ca5\" (UID: \"91bf045e-8b49-48df-b43e-9040bb6b2ca5\") " Mar 20 08:49:46 crc kubenswrapper[4903]: I0320 08:49:46.331713 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/91bf045e-8b49-48df-b43e-9040bb6b2ca5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "91bf045e-8b49-48df-b43e-9040bb6b2ca5" (UID: "91bf045e-8b49-48df-b43e-9040bb6b2ca5"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:49:46 crc kubenswrapper[4903]: I0320 08:49:46.332403 4903 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/91bf045e-8b49-48df-b43e-9040bb6b2ca5-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:46 crc kubenswrapper[4903]: I0320 08:49:46.334239 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/91bf045e-8b49-48df-b43e-9040bb6b2ca5-kube-api-access-9t2th" (OuterVolumeSpecName: "kube-api-access-9t2th") pod "91bf045e-8b49-48df-b43e-9040bb6b2ca5" (UID: "91bf045e-8b49-48df-b43e-9040bb6b2ca5"). InnerVolumeSpecName "kube-api-access-9t2th". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:49:46 crc kubenswrapper[4903]: I0320 08:49:46.433466 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9t2th\" (UniqueName: \"kubernetes.io/projected/91bf045e-8b49-48df-b43e-9040bb6b2ca5-kube-api-access-9t2th\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:46 crc kubenswrapper[4903]: I0320 08:49:46.440931 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Mar 20 08:49:46 crc kubenswrapper[4903]: I0320 08:49:46.476749 4903 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="df937948-08c4-447c-9450-07221ce76552" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.103:5671: connect: connection refused" Mar 20 08:49:46 crc kubenswrapper[4903]: I0320 08:49:46.524362 4903 generic.go:334] "Generic (PLEG): container finished" podID="96a68183-d440-4f89-887d-d2441d00c8e4" containerID="b1d064dc3009f3f7fdd7eea64f3901b6af862ba55d21798e211d4e04f25facd9" exitCode=0 Mar 20 08:49:46 crc kubenswrapper[4903]: I0320 08:49:46.524470 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"96a68183-d440-4f89-887d-d2441d00c8e4","Type":"ContainerDied","Data":"b1d064dc3009f3f7fdd7eea64f3901b6af862ba55d21798e211d4e04f25facd9"} Mar 20 08:49:46 crc kubenswrapper[4903]: I0320 08:49:46.524513 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"96a68183-d440-4f89-887d-d2441d00c8e4","Type":"ContainerDied","Data":"b564313f743e3cc52d8be5ffa4e55cfc64ecfd415feda7073b827864c2307ce8"} Mar 20 08:49:46 crc kubenswrapper[4903]: I0320 08:49:46.524539 4903 scope.go:117] "RemoveContainer" containerID="b1d064dc3009f3f7fdd7eea64f3901b6af862ba55d21798e211d4e04f25facd9" Mar 20 08:49:46 crc kubenswrapper[4903]: I0320 08:49:46.524714 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Mar 20 08:49:46 crc kubenswrapper[4903]: I0320 08:49:46.530562 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-56bn4" Mar 20 08:49:46 crc kubenswrapper[4903]: I0320 08:49:46.530567 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-56bn4" event={"ID":"91bf045e-8b49-48df-b43e-9040bb6b2ca5","Type":"ContainerDied","Data":"340cf727ff698591f1edf8375d21ef430e1c0d6d29845adf1c391d9354f99a37"} Mar 20 08:49:46 crc kubenswrapper[4903]: I0320 08:49:46.533960 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_06e43a58-7c5f-49ac-a1f4-f2eddfd28c6b/ovn-northd/0.log" Mar 20 08:49:46 crc kubenswrapper[4903]: I0320 08:49:46.534101 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"06e43a58-7c5f-49ac-a1f4-f2eddfd28c6b","Type":"ContainerDied","Data":"482479ce04d5f833f81d72dc75f610bb8c7a4314fd274b583ad244727ea260f6"} Mar 20 08:49:46 crc kubenswrapper[4903]: I0320 08:49:46.534197 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Mar 20 08:49:46 crc kubenswrapper[4903]: I0320 08:49:46.535168 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/96a68183-d440-4f89-887d-d2441d00c8e4-galera-tls-certs\") pod \"96a68183-d440-4f89-887d-d2441d00c8e4\" (UID: \"96a68183-d440-4f89-887d-d2441d00c8e4\") " Mar 20 08:49:46 crc kubenswrapper[4903]: I0320 08:49:46.535190 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96a68183-d440-4f89-887d-d2441d00c8e4-combined-ca-bundle\") pod \"96a68183-d440-4f89-887d-d2441d00c8e4\" (UID: \"96a68183-d440-4f89-887d-d2441d00c8e4\") " Mar 20 08:49:46 crc kubenswrapper[4903]: I0320 08:49:46.535251 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/96a68183-d440-4f89-887d-d2441d00c8e4-operator-scripts\") pod \"96a68183-d440-4f89-887d-d2441d00c8e4\" (UID: \"96a68183-d440-4f89-887d-d2441d00c8e4\") " Mar 20 08:49:46 crc kubenswrapper[4903]: I0320 08:49:46.535286 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/96a68183-d440-4f89-887d-d2441d00c8e4-config-data-default\") pod \"96a68183-d440-4f89-887d-d2441d00c8e4\" (UID: \"96a68183-d440-4f89-887d-d2441d00c8e4\") " Mar 20 08:49:46 crc kubenswrapper[4903]: I0320 08:49:46.535318 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ptgpx\" (UniqueName: \"kubernetes.io/projected/96a68183-d440-4f89-887d-d2441d00c8e4-kube-api-access-ptgpx\") pod \"96a68183-d440-4f89-887d-d2441d00c8e4\" (UID: \"96a68183-d440-4f89-887d-d2441d00c8e4\") " Mar 20 08:49:46 crc kubenswrapper[4903]: I0320 08:49:46.535339 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/96a68183-d440-4f89-887d-d2441d00c8e4-kolla-config\") pod \"96a68183-d440-4f89-887d-d2441d00c8e4\" (UID: \"96a68183-d440-4f89-887d-d2441d00c8e4\") " Mar 20 08:49:46 crc kubenswrapper[4903]: I0320 08:49:46.535390 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"96a68183-d440-4f89-887d-d2441d00c8e4\" (UID: \"96a68183-d440-4f89-887d-d2441d00c8e4\") " Mar 20 08:49:46 
crc kubenswrapper[4903]: I0320 08:49:46.535408 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/96a68183-d440-4f89-887d-d2441d00c8e4-config-data-generated\") pod \"96a68183-d440-4f89-887d-d2441d00c8e4\" (UID: \"96a68183-d440-4f89-887d-d2441d00c8e4\") " Mar 20 08:49:46 crc kubenswrapper[4903]: I0320 08:49:46.536556 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/96a68183-d440-4f89-887d-d2441d00c8e4-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "96a68183-d440-4f89-887d-d2441d00c8e4" (UID: "96a68183-d440-4f89-887d-d2441d00c8e4"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 08:49:46 crc kubenswrapper[4903]: I0320 08:49:46.537917 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/96a68183-d440-4f89-887d-d2441d00c8e4-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "96a68183-d440-4f89-887d-d2441d00c8e4" (UID: "96a68183-d440-4f89-887d-d2441d00c8e4"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:49:46 crc kubenswrapper[4903]: I0320 08:49:46.538627 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/96a68183-d440-4f89-887d-d2441d00c8e4-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "96a68183-d440-4f89-887d-d2441d00c8e4" (UID: "96a68183-d440-4f89-887d-d2441d00c8e4"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:49:46 crc kubenswrapper[4903]: I0320 08:49:46.541083 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/96a68183-d440-4f89-887d-d2441d00c8e4-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "96a68183-d440-4f89-887d-d2441d00c8e4" (UID: "96a68183-d440-4f89-887d-d2441d00c8e4"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:49:46 crc kubenswrapper[4903]: I0320 08:49:46.544980 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96a68183-d440-4f89-887d-d2441d00c8e4-kube-api-access-ptgpx" (OuterVolumeSpecName: "kube-api-access-ptgpx") pod "96a68183-d440-4f89-887d-d2441d00c8e4" (UID: "96a68183-d440-4f89-887d-d2441d00c8e4"). InnerVolumeSpecName "kube-api-access-ptgpx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:49:46 crc kubenswrapper[4903]: I0320 08:49:46.548765 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "mysql-db") pod "96a68183-d440-4f89-887d-d2441d00c8e4" (UID: "96a68183-d440-4f89-887d-d2441d00c8e4"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 20 08:49:46 crc kubenswrapper[4903]: I0320 08:49:46.564145 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96a68183-d440-4f89-887d-d2441d00c8e4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "96a68183-d440-4f89-887d-d2441d00c8e4" (UID: "96a68183-d440-4f89-887d-d2441d00c8e4"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:49:46 crc kubenswrapper[4903]: I0320 08:49:46.590799 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96a68183-d440-4f89-887d-d2441d00c8e4-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "96a68183-d440-4f89-887d-d2441d00c8e4" (UID: "96a68183-d440-4f89-887d-d2441d00c8e4"). InnerVolumeSpecName "galera-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:49:46 crc kubenswrapper[4903]: I0320 08:49:46.619494 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-northd-0"] Mar 20 08:49:46 crc kubenswrapper[4903]: I0320 08:49:46.619697 4903 scope.go:117] "RemoveContainer" containerID="e871ff9680c97fedf425745e453d486d14a0134e064de88d4aabcbf21bd800fd" Mar 20 08:49:46 crc kubenswrapper[4903]: I0320 08:49:46.638922 4903 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96a68183-d440-4f89-887d-d2441d00c8e4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:46 crc kubenswrapper[4903]: I0320 08:49:46.638960 4903 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/96a68183-d440-4f89-887d-d2441d00c8e4-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:46 crc kubenswrapper[4903]: I0320 08:49:46.638974 4903 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/96a68183-d440-4f89-887d-d2441d00c8e4-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:46 crc kubenswrapper[4903]: I0320 08:49:46.638983 4903 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/96a68183-d440-4f89-887d-d2441d00c8e4-config-data-default\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:46 crc kubenswrapper[4903]: I0320 08:49:46.638993 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ptgpx\" (UniqueName: \"kubernetes.io/projected/96a68183-d440-4f89-887d-d2441d00c8e4-kube-api-access-ptgpx\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:46 crc kubenswrapper[4903]: I0320 08:49:46.639010 4903 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/96a68183-d440-4f89-887d-d2441d00c8e4-kolla-config\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:46 crc kubenswrapper[4903]: I0320 08:49:46.639051 4903 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Mar 20 08:49:46 crc kubenswrapper[4903]: I0320 08:49:46.639066 4903 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/96a68183-d440-4f89-887d-d2441d00c8e4-config-data-generated\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:46 crc kubenswrapper[4903]: I0320 08:49:46.642578 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-northd-0"] Mar 20 08:49:46 crc kubenswrapper[4903]: I0320 08:49:46.666365 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-56bn4"] Mar 20 08:49:46 crc kubenswrapper[4903]: I0320 08:49:46.675396 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-56bn4"] Mar 20 08:49:46 crc kubenswrapper[4903]: I0320 08:49:46.683248 4903 
scope.go:117] "RemoveContainer" containerID="b1d064dc3009f3f7fdd7eea64f3901b6af862ba55d21798e211d4e04f25facd9" Mar 20 08:49:46 crc kubenswrapper[4903]: E0320 08:49:46.684612 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b1d064dc3009f3f7fdd7eea64f3901b6af862ba55d21798e211d4e04f25facd9\": container with ID starting with b1d064dc3009f3f7fdd7eea64f3901b6af862ba55d21798e211d4e04f25facd9 not found: ID does not exist" containerID="b1d064dc3009f3f7fdd7eea64f3901b6af862ba55d21798e211d4e04f25facd9" Mar 20 08:49:46 crc kubenswrapper[4903]: I0320 08:49:46.684649 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b1d064dc3009f3f7fdd7eea64f3901b6af862ba55d21798e211d4e04f25facd9"} err="failed to get container status \"b1d064dc3009f3f7fdd7eea64f3901b6af862ba55d21798e211d4e04f25facd9\": rpc error: code = NotFound desc = could not find container \"b1d064dc3009f3f7fdd7eea64f3901b6af862ba55d21798e211d4e04f25facd9\": container with ID starting with b1d064dc3009f3f7fdd7eea64f3901b6af862ba55d21798e211d4e04f25facd9 not found: ID does not exist" Mar 20 08:49:46 crc kubenswrapper[4903]: I0320 08:49:46.684669 4903 scope.go:117] "RemoveContainer" containerID="e871ff9680c97fedf425745e453d486d14a0134e064de88d4aabcbf21bd800fd" Mar 20 08:49:46 crc kubenswrapper[4903]: E0320 08:49:46.685066 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e871ff9680c97fedf425745e453d486d14a0134e064de88d4aabcbf21bd800fd\": container with ID starting with e871ff9680c97fedf425745e453d486d14a0134e064de88d4aabcbf21bd800fd not found: ID does not exist" containerID="e871ff9680c97fedf425745e453d486d14a0134e064de88d4aabcbf21bd800fd" Mar 20 08:49:46 crc kubenswrapper[4903]: I0320 08:49:46.685118 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e871ff9680c97fedf425745e453d486d14a0134e064de88d4aabcbf21bd800fd"} err="failed to get container status \"e871ff9680c97fedf425745e453d486d14a0134e064de88d4aabcbf21bd800fd\": rpc error: code = NotFound desc = could not find container \"e871ff9680c97fedf425745e453d486d14a0134e064de88d4aabcbf21bd800fd\": container with ID starting with e871ff9680c97fedf425745e453d486d14a0134e064de88d4aabcbf21bd800fd not found: ID does not exist" Mar 20 08:49:46 crc kubenswrapper[4903]: I0320 08:49:46.685146 4903 scope.go:117] "RemoveContainer" containerID="7125bc754a8c0e626fcd2fe281119d9040b278b420956ad513958b106967fd43" Mar 20 08:49:46 crc kubenswrapper[4903]: I0320 08:49:46.697689 4903 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Mar 20 08:49:46 crc kubenswrapper[4903]: I0320 08:49:46.704265 4903 scope.go:117] "RemoveContainer" containerID="5704e6c6ea3db74005a8e1d1aeb869f6812338abc7af8b7e741fc45d5338477c" Mar 20 08:49:46 crc kubenswrapper[4903]: I0320 08:49:46.740607 4903 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:46 crc kubenswrapper[4903]: I0320 08:49:46.900353 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-galera-0"] Mar 20 08:49:46 crc kubenswrapper[4903]: I0320 08:49:46.905018 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstack-galera-0"] Mar 20 
08:49:46 crc kubenswrapper[4903]: E0320 08:49:46.944590 4903 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Mar 20 08:49:46 crc kubenswrapper[4903]: E0320 08:49:46.944679 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/888a3fd9-01f8-47b3-b1bb-f2b8b6b96509-config-data podName:888a3fd9-01f8-47b3-b1bb-f2b8b6b96509 nodeName:}" failed. No retries permitted until 2026-03-20 08:49:54.944655571 +0000 UTC m=+1620.161555886 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/888a3fd9-01f8-47b3-b1bb-f2b8b6b96509-config-data") pod "rabbitmq-server-0" (UID: "888a3fd9-01f8-47b3-b1bb-f2b8b6b96509") : configmap "rabbitmq-config-data" not found Mar 20 08:49:47 crc kubenswrapper[4903]: I0320 08:49:47.056379 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 20 08:49:47 crc kubenswrapper[4903]: E0320 08:49:47.078743 4903 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ec70b5d9df40c275990e6b10b611cf7497dcf53a4d72ca11805f6bf11c5bc677 is running failed: container process not found" containerID="ec70b5d9df40c275990e6b10b611cf7497dcf53a4d72ca11805f6bf11c5bc677" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 20 08:49:47 crc kubenswrapper[4903]: E0320 08:49:47.083173 4903 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ec70b5d9df40c275990e6b10b611cf7497dcf53a4d72ca11805f6bf11c5bc677 is running failed: container process not found" containerID="ec70b5d9df40c275990e6b10b611cf7497dcf53a4d72ca11805f6bf11c5bc677" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 20 08:49:47 crc kubenswrapper[4903]: E0320 08:49:47.091277 4903 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ec70b5d9df40c275990e6b10b611cf7497dcf53a4d72ca11805f6bf11c5bc677 is running failed: container process not found" containerID="ec70b5d9df40c275990e6b10b611cf7497dcf53a4d72ca11805f6bf11c5bc677" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 20 08:49:47 crc kubenswrapper[4903]: E0320 08:49:47.091345 4903 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ec70b5d9df40c275990e6b10b611cf7497dcf53a4d72ca11805f6bf11c5bc677 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-chrhv" podUID="d69915e4-0df8-4d83-b096-962eadc1883f" containerName="ovsdb-server" Mar 20 08:49:47 crc kubenswrapper[4903]: E0320 08:49:47.092112 4903 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="cbe32b8386815ecd924f4abbb568339471c7efea68305c548d14baa5ac7f2324" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 20 08:49:47 crc kubenswrapper[4903]: E0320 08:49:47.094439 4903 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" 
containerID="cbe32b8386815ecd924f4abbb568339471c7efea68305c548d14baa5ac7f2324" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 20 08:49:47 crc kubenswrapper[4903]: E0320 08:49:47.097706 4903 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="cbe32b8386815ecd924f4abbb568339471c7efea68305c548d14baa5ac7f2324" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 20 08:49:47 crc kubenswrapper[4903]: E0320 08:49:47.097741 4903 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-chrhv" podUID="d69915e4-0df8-4d83-b096-962eadc1883f" containerName="ovs-vswitchd" Mar 20 08:49:47 crc kubenswrapper[4903]: I0320 08:49:47.133558 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 20 08:49:47 crc kubenswrapper[4903]: I0320 08:49:47.248832 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/888a3fd9-01f8-47b3-b1bb-f2b8b6b96509-pod-info\") pod \"888a3fd9-01f8-47b3-b1bb-f2b8b6b96509\" (UID: \"888a3fd9-01f8-47b3-b1bb-f2b8b6b96509\") " Mar 20 08:49:47 crc kubenswrapper[4903]: I0320 08:49:47.248886 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/888a3fd9-01f8-47b3-b1bb-f2b8b6b96509-server-conf\") pod \"888a3fd9-01f8-47b3-b1bb-f2b8b6b96509\" (UID: \"888a3fd9-01f8-47b3-b1bb-f2b8b6b96509\") " Mar 20 08:49:47 crc kubenswrapper[4903]: I0320 08:49:47.248906 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/df937948-08c4-447c-9450-07221ce76552-plugins-conf\") pod \"df937948-08c4-447c-9450-07221ce76552\" (UID: \"df937948-08c4-447c-9450-07221ce76552\") " Mar 20 08:49:47 crc kubenswrapper[4903]: I0320 08:49:47.248932 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/888a3fd9-01f8-47b3-b1bb-f2b8b6b96509-rabbitmq-erlang-cookie\") pod \"888a3fd9-01f8-47b3-b1bb-f2b8b6b96509\" (UID: \"888a3fd9-01f8-47b3-b1bb-f2b8b6b96509\") " Mar 20 08:49:47 crc kubenswrapper[4903]: I0320 08:49:47.248957 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/df937948-08c4-447c-9450-07221ce76552-erlang-cookie-secret\") pod \"df937948-08c4-447c-9450-07221ce76552\" (UID: \"df937948-08c4-447c-9450-07221ce76552\") " Mar 20 08:49:47 crc kubenswrapper[4903]: I0320 08:49:47.248976 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/df937948-08c4-447c-9450-07221ce76552-server-conf\") pod \"df937948-08c4-447c-9450-07221ce76552\" (UID: \"df937948-08c4-447c-9450-07221ce76552\") " Mar 20 08:49:47 crc kubenswrapper[4903]: I0320 08:49:47.248995 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c54mp\" (UniqueName: \"kubernetes.io/projected/888a3fd9-01f8-47b3-b1bb-f2b8b6b96509-kube-api-access-c54mp\") pod \"888a3fd9-01f8-47b3-b1bb-f2b8b6b96509\" (UID: 
\"888a3fd9-01f8-47b3-b1bb-f2b8b6b96509\") " Mar 20 08:49:47 crc kubenswrapper[4903]: I0320 08:49:47.249032 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/df937948-08c4-447c-9450-07221ce76552-rabbitmq-erlang-cookie\") pod \"df937948-08c4-447c-9450-07221ce76552\" (UID: \"df937948-08c4-447c-9450-07221ce76552\") " Mar 20 08:49:47 crc kubenswrapper[4903]: I0320 08:49:47.249066 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"888a3fd9-01f8-47b3-b1bb-f2b8b6b96509\" (UID: \"888a3fd9-01f8-47b3-b1bb-f2b8b6b96509\") " Mar 20 08:49:47 crc kubenswrapper[4903]: I0320 08:49:47.249109 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/888a3fd9-01f8-47b3-b1bb-f2b8b6b96509-config-data\") pod \"888a3fd9-01f8-47b3-b1bb-f2b8b6b96509\" (UID: \"888a3fd9-01f8-47b3-b1bb-f2b8b6b96509\") " Mar 20 08:49:47 crc kubenswrapper[4903]: I0320 08:49:47.249140 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/888a3fd9-01f8-47b3-b1bb-f2b8b6b96509-erlang-cookie-secret\") pod \"888a3fd9-01f8-47b3-b1bb-f2b8b6b96509\" (UID: \"888a3fd9-01f8-47b3-b1bb-f2b8b6b96509\") " Mar 20 08:49:47 crc kubenswrapper[4903]: I0320 08:49:47.249170 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/df937948-08c4-447c-9450-07221ce76552-rabbitmq-confd\") pod \"df937948-08c4-447c-9450-07221ce76552\" (UID: \"df937948-08c4-447c-9450-07221ce76552\") " Mar 20 08:49:47 crc kubenswrapper[4903]: I0320 08:49:47.249209 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/888a3fd9-01f8-47b3-b1bb-f2b8b6b96509-rabbitmq-plugins\") pod \"888a3fd9-01f8-47b3-b1bb-f2b8b6b96509\" (UID: \"888a3fd9-01f8-47b3-b1bb-f2b8b6b96509\") " Mar 20 08:49:47 crc kubenswrapper[4903]: I0320 08:49:47.249238 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/df937948-08c4-447c-9450-07221ce76552-rabbitmq-tls\") pod \"df937948-08c4-447c-9450-07221ce76552\" (UID: \"df937948-08c4-447c-9450-07221ce76552\") " Mar 20 08:49:47 crc kubenswrapper[4903]: I0320 08:49:47.249257 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/888a3fd9-01f8-47b3-b1bb-f2b8b6b96509-rabbitmq-confd\") pod \"888a3fd9-01f8-47b3-b1bb-f2b8b6b96509\" (UID: \"888a3fd9-01f8-47b3-b1bb-f2b8b6b96509\") " Mar 20 08:49:47 crc kubenswrapper[4903]: I0320 08:49:47.249308 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/df937948-08c4-447c-9450-07221ce76552-rabbitmq-plugins\") pod \"df937948-08c4-447c-9450-07221ce76552\" (UID: \"df937948-08c4-447c-9450-07221ce76552\") " Mar 20 08:49:47 crc kubenswrapper[4903]: I0320 08:49:47.249343 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/df937948-08c4-447c-9450-07221ce76552-pod-info\") pod \"df937948-08c4-447c-9450-07221ce76552\" (UID: 
\"df937948-08c4-447c-9450-07221ce76552\") " Mar 20 08:49:47 crc kubenswrapper[4903]: I0320 08:49:47.249388 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/888a3fd9-01f8-47b3-b1bb-f2b8b6b96509-rabbitmq-tls\") pod \"888a3fd9-01f8-47b3-b1bb-f2b8b6b96509\" (UID: \"888a3fd9-01f8-47b3-b1bb-f2b8b6b96509\") " Mar 20 08:49:47 crc kubenswrapper[4903]: I0320 08:49:47.249427 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/df937948-08c4-447c-9450-07221ce76552-config-data\") pod \"df937948-08c4-447c-9450-07221ce76552\" (UID: \"df937948-08c4-447c-9450-07221ce76552\") " Mar 20 08:49:47 crc kubenswrapper[4903]: I0320 08:49:47.249448 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-djs78\" (UniqueName: \"kubernetes.io/projected/df937948-08c4-447c-9450-07221ce76552-kube-api-access-djs78\") pod \"df937948-08c4-447c-9450-07221ce76552\" (UID: \"df937948-08c4-447c-9450-07221ce76552\") " Mar 20 08:49:47 crc kubenswrapper[4903]: I0320 08:49:47.249467 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"df937948-08c4-447c-9450-07221ce76552\" (UID: \"df937948-08c4-447c-9450-07221ce76552\") " Mar 20 08:49:47 crc kubenswrapper[4903]: I0320 08:49:47.249499 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/888a3fd9-01f8-47b3-b1bb-f2b8b6b96509-plugins-conf\") pod \"888a3fd9-01f8-47b3-b1bb-f2b8b6b96509\" (UID: \"888a3fd9-01f8-47b3-b1bb-f2b8b6b96509\") " Mar 20 08:49:47 crc kubenswrapper[4903]: I0320 08:49:47.249742 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/888a3fd9-01f8-47b3-b1bb-f2b8b6b96509-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "888a3fd9-01f8-47b3-b1bb-f2b8b6b96509" (UID: "888a3fd9-01f8-47b3-b1bb-f2b8b6b96509"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 08:49:47 crc kubenswrapper[4903]: I0320 08:49:47.250320 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/888a3fd9-01f8-47b3-b1bb-f2b8b6b96509-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "888a3fd9-01f8-47b3-b1bb-f2b8b6b96509" (UID: "888a3fd9-01f8-47b3-b1bb-f2b8b6b96509"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:49:47 crc kubenswrapper[4903]: I0320 08:49:47.251117 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/888a3fd9-01f8-47b3-b1bb-f2b8b6b96509-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "888a3fd9-01f8-47b3-b1bb-f2b8b6b96509" (UID: "888a3fd9-01f8-47b3-b1bb-f2b8b6b96509"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 08:49:47 crc kubenswrapper[4903]: I0320 08:49:47.256254 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df937948-08c4-447c-9450-07221ce76552-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "df937948-08c4-447c-9450-07221ce76552" (UID: "df937948-08c4-447c-9450-07221ce76552"). InnerVolumeSpecName "erlang-cookie-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:49:47 crc kubenswrapper[4903]: I0320 08:49:47.260653 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df937948-08c4-447c-9450-07221ce76552-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "df937948-08c4-447c-9450-07221ce76552" (UID: "df937948-08c4-447c-9450-07221ce76552"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:49:47 crc kubenswrapper[4903]: I0320 08:49:47.266527 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/df937948-08c4-447c-9450-07221ce76552-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "df937948-08c4-447c-9450-07221ce76552" (UID: "df937948-08c4-447c-9450-07221ce76552"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 08:49:47 crc kubenswrapper[4903]: I0320 08:49:47.266673 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/df937948-08c4-447c-9450-07221ce76552-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "df937948-08c4-447c-9450-07221ce76552" (UID: "df937948-08c4-447c-9450-07221ce76552"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 08:49:47 crc kubenswrapper[4903]: I0320 08:49:47.267362 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df937948-08c4-447c-9450-07221ce76552-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "df937948-08c4-447c-9450-07221ce76552" (UID: "df937948-08c4-447c-9450-07221ce76552"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:49:47 crc kubenswrapper[4903]: I0320 08:49:47.272365 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/888a3fd9-01f8-47b3-b1bb-f2b8b6b96509-kube-api-access-c54mp" (OuterVolumeSpecName: "kube-api-access-c54mp") pod "888a3fd9-01f8-47b3-b1bb-f2b8b6b96509" (UID: "888a3fd9-01f8-47b3-b1bb-f2b8b6b96509"). InnerVolumeSpecName "kube-api-access-c54mp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:49:47 crc kubenswrapper[4903]: I0320 08:49:47.272618 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "persistence") pod "888a3fd9-01f8-47b3-b1bb-f2b8b6b96509" (UID: "888a3fd9-01f8-47b3-b1bb-f2b8b6b96509"). InnerVolumeSpecName "local-storage11-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 20 08:49:47 crc kubenswrapper[4903]: I0320 08:49:47.272635 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/888a3fd9-01f8-47b3-b1bb-f2b8b6b96509-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "888a3fd9-01f8-47b3-b1bb-f2b8b6b96509" (UID: "888a3fd9-01f8-47b3-b1bb-f2b8b6b96509"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:49:47 crc kubenswrapper[4903]: I0320 08:49:47.277817 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/888a3fd9-01f8-47b3-b1bb-f2b8b6b96509-pod-info" (OuterVolumeSpecName: "pod-info") pod "888a3fd9-01f8-47b3-b1bb-f2b8b6b96509" (UID: "888a3fd9-01f8-47b3-b1bb-f2b8b6b96509"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 20 08:49:47 crc kubenswrapper[4903]: I0320 08:49:47.279492 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "persistence") pod "df937948-08c4-447c-9450-07221ce76552" (UID: "df937948-08c4-447c-9450-07221ce76552"). InnerVolumeSpecName "local-storage07-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 20 08:49:47 crc kubenswrapper[4903]: I0320 08:49:47.289281 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/888a3fd9-01f8-47b3-b1bb-f2b8b6b96509-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "888a3fd9-01f8-47b3-b1bb-f2b8b6b96509" (UID: "888a3fd9-01f8-47b3-b1bb-f2b8b6b96509"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:49:47 crc kubenswrapper[4903]: I0320 08:49:47.291461 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df937948-08c4-447c-9450-07221ce76552-kube-api-access-djs78" (OuterVolumeSpecName: "kube-api-access-djs78") pod "df937948-08c4-447c-9450-07221ce76552" (UID: "df937948-08c4-447c-9450-07221ce76552"). InnerVolumeSpecName "kube-api-access-djs78". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:49:47 crc kubenswrapper[4903]: I0320 08:49:47.298182 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/df937948-08c4-447c-9450-07221ce76552-pod-info" (OuterVolumeSpecName: "pod-info") pod "df937948-08c4-447c-9450-07221ce76552" (UID: "df937948-08c4-447c-9450-07221ce76552"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 20 08:49:47 crc kubenswrapper[4903]: I0320 08:49:47.312669 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/888a3fd9-01f8-47b3-b1bb-f2b8b6b96509-config-data" (OuterVolumeSpecName: "config-data") pod "888a3fd9-01f8-47b3-b1bb-f2b8b6b96509" (UID: "888a3fd9-01f8-47b3-b1bb-f2b8b6b96509"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:49:47 crc kubenswrapper[4903]: I0320 08:49:47.313780 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df937948-08c4-447c-9450-07221ce76552-config-data" (OuterVolumeSpecName: "config-data") pod "df937948-08c4-447c-9450-07221ce76552" (UID: "df937948-08c4-447c-9450-07221ce76552"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:49:47 crc kubenswrapper[4903]: I0320 08:49:47.344970 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df937948-08c4-447c-9450-07221ce76552-server-conf" (OuterVolumeSpecName: "server-conf") pod "df937948-08c4-447c-9450-07221ce76552" (UID: "df937948-08c4-447c-9450-07221ce76552"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:49:47 crc kubenswrapper[4903]: I0320 08:49:47.347785 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/888a3fd9-01f8-47b3-b1bb-f2b8b6b96509-server-conf" (OuterVolumeSpecName: "server-conf") pod "888a3fd9-01f8-47b3-b1bb-f2b8b6b96509" (UID: "888a3fd9-01f8-47b3-b1bb-f2b8b6b96509"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:49:47 crc kubenswrapper[4903]: I0320 08:49:47.351951 4903 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/888a3fd9-01f8-47b3-b1bb-f2b8b6b96509-server-conf\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:47 crc kubenswrapper[4903]: I0320 08:49:47.352058 4903 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/df937948-08c4-447c-9450-07221ce76552-plugins-conf\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:47 crc kubenswrapper[4903]: I0320 08:49:47.352075 4903 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/888a3fd9-01f8-47b3-b1bb-f2b8b6b96509-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:47 crc kubenswrapper[4903]: I0320 08:49:47.352087 4903 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/df937948-08c4-447c-9450-07221ce76552-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:47 crc kubenswrapper[4903]: I0320 08:49:47.352097 4903 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/df937948-08c4-447c-9450-07221ce76552-server-conf\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:47 crc kubenswrapper[4903]: I0320 08:49:47.352109 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c54mp\" (UniqueName: \"kubernetes.io/projected/888a3fd9-01f8-47b3-b1bb-f2b8b6b96509-kube-api-access-c54mp\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:47 crc kubenswrapper[4903]: I0320 08:49:47.352119 4903 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/df937948-08c4-447c-9450-07221ce76552-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:47 crc kubenswrapper[4903]: I0320 08:49:47.352251 4903 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Mar 20 08:49:47 crc kubenswrapper[4903]: I0320 08:49:47.352270 4903 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/888a3fd9-01f8-47b3-b1bb-f2b8b6b96509-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:47 crc kubenswrapper[4903]: I0320 08:49:47.352282 4903 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/888a3fd9-01f8-47b3-b1bb-f2b8b6b96509-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:47 crc kubenswrapper[4903]: I0320 08:49:47.352292 4903 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/888a3fd9-01f8-47b3-b1bb-f2b8b6b96509-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:47 crc kubenswrapper[4903]: I0320 08:49:47.352301 4903 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/df937948-08c4-447c-9450-07221ce76552-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:47 crc kubenswrapper[4903]: I0320 08:49:47.352310 4903 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/df937948-08c4-447c-9450-07221ce76552-rabbitmq-plugins\") on node \"crc\" 
DevicePath \"\"" Mar 20 08:49:47 crc kubenswrapper[4903]: I0320 08:49:47.352319 4903 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/df937948-08c4-447c-9450-07221ce76552-pod-info\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:47 crc kubenswrapper[4903]: I0320 08:49:47.352328 4903 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/888a3fd9-01f8-47b3-b1bb-f2b8b6b96509-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:47 crc kubenswrapper[4903]: I0320 08:49:47.352339 4903 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/df937948-08c4-447c-9450-07221ce76552-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:47 crc kubenswrapper[4903]: I0320 08:49:47.352348 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-djs78\" (UniqueName: \"kubernetes.io/projected/df937948-08c4-447c-9450-07221ce76552-kube-api-access-djs78\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:47 crc kubenswrapper[4903]: I0320 08:49:47.352366 4903 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Mar 20 08:49:47 crc kubenswrapper[4903]: I0320 08:49:47.352374 4903 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/888a3fd9-01f8-47b3-b1bb-f2b8b6b96509-plugins-conf\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:47 crc kubenswrapper[4903]: I0320 08:49:47.352385 4903 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/888a3fd9-01f8-47b3-b1bb-f2b8b6b96509-pod-info\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:47 crc kubenswrapper[4903]: I0320 08:49:47.378741 4903 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Mar 20 08:49:47 crc kubenswrapper[4903]: I0320 08:49:47.378951 4903 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Mar 20 08:49:47 crc kubenswrapper[4903]: I0320 08:49:47.384565 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-7f784d4489-rxkmk" Mar 20 08:49:47 crc kubenswrapper[4903]: I0320 08:49:47.402495 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df937948-08c4-447c-9450-07221ce76552-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "df937948-08c4-447c-9450-07221ce76552" (UID: "df937948-08c4-447c-9450-07221ce76552"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:49:47 crc kubenswrapper[4903]: I0320 08:49:47.404637 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/888a3fd9-01f8-47b3-b1bb-f2b8b6b96509-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "888a3fd9-01f8-47b3-b1bb-f2b8b6b96509" (UID: "888a3fd9-01f8-47b3-b1bb-f2b8b6b96509"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:49:47 crc kubenswrapper[4903]: I0320 08:49:47.461578 4903 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:47 crc kubenswrapper[4903]: I0320 08:49:47.461616 4903 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/df937948-08c4-447c-9450-07221ce76552-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:47 crc kubenswrapper[4903]: I0320 08:49:47.461627 4903 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/888a3fd9-01f8-47b3-b1bb-f2b8b6b96509-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:47 crc kubenswrapper[4903]: I0320 08:49:47.461640 4903 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:47 crc kubenswrapper[4903]: I0320 08:49:47.512645 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="06e43a58-7c5f-49ac-a1f4-f2eddfd28c6b" path="/var/lib/kubelet/pods/06e43a58-7c5f-49ac-a1f4-f2eddfd28c6b/volumes" Mar 20 08:49:47 crc kubenswrapper[4903]: I0320 08:49:47.513455 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1dcd96a1-71bb-480c-8387-0fca4d17bf33" path="/var/lib/kubelet/pods/1dcd96a1-71bb-480c-8387-0fca4d17bf33/volumes" Mar 20 08:49:47 crc kubenswrapper[4903]: I0320 08:49:47.514109 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="34de9984-0547-4ba1-ae7d-5f8cc9196c26" path="/var/lib/kubelet/pods/34de9984-0547-4ba1-ae7d-5f8cc9196c26/volumes" Mar 20 08:49:47 crc kubenswrapper[4903]: I0320 08:49:47.527623 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="50b5adcb-aed8-4cff-b3ec-02721df3937d" path="/var/lib/kubelet/pods/50b5adcb-aed8-4cff-b3ec-02721df3937d/volumes" Mar 20 08:49:47 crc kubenswrapper[4903]: I0320 08:49:47.551430 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5e072c5e-0f44-4d24-bccc-b14bf61fa192" path="/var/lib/kubelet/pods/5e072c5e-0f44-4d24-bccc-b14bf61fa192/volumes" Mar 20 08:49:47 crc kubenswrapper[4903]: I0320 08:49:47.551989 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="69bc9139-cd82-4fa2-847f-b831c080d163" path="/var/lib/kubelet/pods/69bc9139-cd82-4fa2-847f-b831c080d163/volumes" Mar 20 08:49:47 crc kubenswrapper[4903]: I0320 08:49:47.552395 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="91bf045e-8b49-48df-b43e-9040bb6b2ca5" path="/var/lib/kubelet/pods/91bf045e-8b49-48df-b43e-9040bb6b2ca5/volumes" Mar 20 08:49:47 crc kubenswrapper[4903]: I0320 08:49:47.552975 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96a68183-d440-4f89-887d-d2441d00c8e4" path="/var/lib/kubelet/pods/96a68183-d440-4f89-887d-d2441d00c8e4/volumes" Mar 20 08:49:47 crc kubenswrapper[4903]: I0320 08:49:47.563302 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c94a513f-1b70-4705-af6c-3f71cb0e4272-credential-keys\") pod \"c94a513f-1b70-4705-af6c-3f71cb0e4272\" (UID: \"c94a513f-1b70-4705-af6c-3f71cb0e4272\") " Mar 20 08:49:47 crc kubenswrapper[4903]: I0320 08:49:47.563510 4903 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c94a513f-1b70-4705-af6c-3f71cb0e4272-public-tls-certs\") pod \"c94a513f-1b70-4705-af6c-3f71cb0e4272\" (UID: \"c94a513f-1b70-4705-af6c-3f71cb0e4272\") " Mar 20 08:49:47 crc kubenswrapper[4903]: I0320 08:49:47.563639 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c94a513f-1b70-4705-af6c-3f71cb0e4272-combined-ca-bundle\") pod \"c94a513f-1b70-4705-af6c-3f71cb0e4272\" (UID: \"c94a513f-1b70-4705-af6c-3f71cb0e4272\") " Mar 20 08:49:47 crc kubenswrapper[4903]: I0320 08:49:47.563755 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c94a513f-1b70-4705-af6c-3f71cb0e4272-scripts\") pod \"c94a513f-1b70-4705-af6c-3f71cb0e4272\" (UID: \"c94a513f-1b70-4705-af6c-3f71cb0e4272\") " Mar 20 08:49:47 crc kubenswrapper[4903]: I0320 08:49:47.563842 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c94a513f-1b70-4705-af6c-3f71cb0e4272-internal-tls-certs\") pod \"c94a513f-1b70-4705-af6c-3f71cb0e4272\" (UID: \"c94a513f-1b70-4705-af6c-3f71cb0e4272\") " Mar 20 08:49:47 crc kubenswrapper[4903]: I0320 08:49:47.563952 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c94a513f-1b70-4705-af6c-3f71cb0e4272-fernet-keys\") pod \"c94a513f-1b70-4705-af6c-3f71cb0e4272\" (UID: \"c94a513f-1b70-4705-af6c-3f71cb0e4272\") " Mar 20 08:49:47 crc kubenswrapper[4903]: I0320 08:49:47.564104 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-52v8h\" (UniqueName: \"kubernetes.io/projected/c94a513f-1b70-4705-af6c-3f71cb0e4272-kube-api-access-52v8h\") pod \"c94a513f-1b70-4705-af6c-3f71cb0e4272\" (UID: \"c94a513f-1b70-4705-af6c-3f71cb0e4272\") " Mar 20 08:49:47 crc kubenswrapper[4903]: I0320 08:49:47.564191 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c94a513f-1b70-4705-af6c-3f71cb0e4272-config-data\") pod \"c94a513f-1b70-4705-af6c-3f71cb0e4272\" (UID: \"c94a513f-1b70-4705-af6c-3f71cb0e4272\") " Mar 20 08:49:47 crc kubenswrapper[4903]: I0320 08:49:47.565908 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9c2cbf77-9511-47a0-9bc7-dd5eba7d8d8a" path="/var/lib/kubelet/pods/9c2cbf77-9511-47a0-9bc7-dd5eba7d8d8a/volumes" Mar 20 08:49:47 crc kubenswrapper[4903]: I0320 08:49:47.567024 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bae081ad-c434-4d1b-ae9f-cc8c5b6e2f33" path="/var/lib/kubelet/pods/bae081ad-c434-4d1b-ae9f-cc8c5b6e2f33/volumes" Mar 20 08:49:47 crc kubenswrapper[4903]: I0320 08:49:47.567866 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dc127483-5a42-4eea-8b8c-8a1382dced05" path="/var/lib/kubelet/pods/dc127483-5a42-4eea-8b8c-8a1382dced05/volumes" Mar 20 08:49:47 crc kubenswrapper[4903]: I0320 08:49:47.576575 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f8bb60e5-f963-44ed-9e5e-76ca6da5c723" path="/var/lib/kubelet/pods/f8bb60e5-f963-44ed-9e5e-76ca6da5c723/volumes" Mar 20 08:49:47 crc kubenswrapper[4903]: I0320 08:49:47.589246 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/c94a513f-1b70-4705-af6c-3f71cb0e4272-scripts" (OuterVolumeSpecName: "scripts") pod "c94a513f-1b70-4705-af6c-3f71cb0e4272" (UID: "c94a513f-1b70-4705-af6c-3f71cb0e4272"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:49:47 crc kubenswrapper[4903]: I0320 08:49:47.591425 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c94a513f-1b70-4705-af6c-3f71cb0e4272-kube-api-access-52v8h" (OuterVolumeSpecName: "kube-api-access-52v8h") pod "c94a513f-1b70-4705-af6c-3f71cb0e4272" (UID: "c94a513f-1b70-4705-af6c-3f71cb0e4272"). InnerVolumeSpecName "kube-api-access-52v8h". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:49:47 crc kubenswrapper[4903]: I0320 08:49:47.592584 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c94a513f-1b70-4705-af6c-3f71cb0e4272-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "c94a513f-1b70-4705-af6c-3f71cb0e4272" (UID: "c94a513f-1b70-4705-af6c-3f71cb0e4272"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:49:47 crc kubenswrapper[4903]: I0320 08:49:47.596242 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c94a513f-1b70-4705-af6c-3f71cb0e4272-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "c94a513f-1b70-4705-af6c-3f71cb0e4272" (UID: "c94a513f-1b70-4705-af6c-3f71cb0e4272"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:49:47 crc kubenswrapper[4903]: I0320 08:49:47.612912 4903 generic.go:334] "Generic (PLEG): container finished" podID="df937948-08c4-447c-9450-07221ce76552" containerID="80ffdb9ab414e2251c93f568f37534af22de836fa31c4fbc8bf8f3ec7bf93804" exitCode=0 Mar 20 08:49:47 crc kubenswrapper[4903]: I0320 08:49:47.613202 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"df937948-08c4-447c-9450-07221ce76552","Type":"ContainerDied","Data":"80ffdb9ab414e2251c93f568f37534af22de836fa31c4fbc8bf8f3ec7bf93804"} Mar 20 08:49:47 crc kubenswrapper[4903]: I0320 08:49:47.613286 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"df937948-08c4-447c-9450-07221ce76552","Type":"ContainerDied","Data":"91715135ba864eae23740ffcb37edde4b941086e9d6b0fd4623a13a3b8ec6cf0"} Mar 20 08:49:47 crc kubenswrapper[4903]: I0320 08:49:47.613483 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 20 08:49:47 crc kubenswrapper[4903]: I0320 08:49:47.613758 4903 scope.go:117] "RemoveContainer" containerID="80ffdb9ab414e2251c93f568f37534af22de836fa31c4fbc8bf8f3ec7bf93804" Mar 20 08:49:47 crc kubenswrapper[4903]: I0320 08:49:47.615198 4903 generic.go:334] "Generic (PLEG): container finished" podID="c94a513f-1b70-4705-af6c-3f71cb0e4272" containerID="9ecef6613b51007ecf1f1f800f64c787552051192488f5ff207b2c73e1a6013d" exitCode=0 Mar 20 08:49:47 crc kubenswrapper[4903]: I0320 08:49:47.615279 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-7f784d4489-rxkmk" event={"ID":"c94a513f-1b70-4705-af6c-3f71cb0e4272","Type":"ContainerDied","Data":"9ecef6613b51007ecf1f1f800f64c787552051192488f5ff207b2c73e1a6013d"} Mar 20 08:49:47 crc kubenswrapper[4903]: I0320 08:49:47.615332 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-7f784d4489-rxkmk" event={"ID":"c94a513f-1b70-4705-af6c-3f71cb0e4272","Type":"ContainerDied","Data":"f7353b6ff942ff4f204ab5c2d464ab1ef4b78615acf0084c110ec9caf9ca2f33"} Mar 20 08:49:47 crc kubenswrapper[4903]: I0320 08:49:47.615405 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-7f784d4489-rxkmk" Mar 20 08:49:47 crc kubenswrapper[4903]: I0320 08:49:47.674459 4903 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c94a513f-1b70-4705-af6c-3f71cb0e4272-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:47 crc kubenswrapper[4903]: I0320 08:49:47.674510 4903 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c94a513f-1b70-4705-af6c-3f71cb0e4272-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:47 crc kubenswrapper[4903]: I0320 08:49:47.674522 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-52v8h\" (UniqueName: \"kubernetes.io/projected/c94a513f-1b70-4705-af6c-3f71cb0e4272-kube-api-access-52v8h\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:47 crc kubenswrapper[4903]: I0320 08:49:47.674534 4903 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c94a513f-1b70-4705-af6c-3f71cb0e4272-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:47 crc kubenswrapper[4903]: I0320 08:49:47.697305 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c94a513f-1b70-4705-af6c-3f71cb0e4272-config-data" (OuterVolumeSpecName: "config-data") pod "c94a513f-1b70-4705-af6c-3f71cb0e4272" (UID: "c94a513f-1b70-4705-af6c-3f71cb0e4272"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:49:47 crc kubenswrapper[4903]: I0320 08:49:47.700770 4903 generic.go:334] "Generic (PLEG): container finished" podID="888a3fd9-01f8-47b3-b1bb-f2b8b6b96509" containerID="bd2b54448e402fa4a59c36c7f69e6069fbf3f50b83543508e66b689857b68d04" exitCode=0 Mar 20 08:49:47 crc kubenswrapper[4903]: I0320 08:49:47.700847 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"888a3fd9-01f8-47b3-b1bb-f2b8b6b96509","Type":"ContainerDied","Data":"bd2b54448e402fa4a59c36c7f69e6069fbf3f50b83543508e66b689857b68d04"} Mar 20 08:49:47 crc kubenswrapper[4903]: I0320 08:49:47.700886 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"888a3fd9-01f8-47b3-b1bb-f2b8b6b96509","Type":"ContainerDied","Data":"50cdcb1ab3eaee1006c1555da57117c90c23c9e1de732b5014ad14a0c2ce59cb"} Mar 20 08:49:47 crc kubenswrapper[4903]: I0320 08:49:47.701061 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 20 08:49:47 crc kubenswrapper[4903]: I0320 08:49:47.718227 4903 scope.go:117] "RemoveContainer" containerID="fe57b0018bbffb7366eaf34a9f7b2d185d56311f1f577d783cba8ee7a58367b9" Mar 20 08:49:47 crc kubenswrapper[4903]: I0320 08:49:47.721314 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 20 08:49:47 crc kubenswrapper[4903]: I0320 08:49:47.723781 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 20 08:49:47 crc kubenswrapper[4903]: I0320 08:49:47.730223 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c94a513f-1b70-4705-af6c-3f71cb0e4272-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c94a513f-1b70-4705-af6c-3f71cb0e4272" (UID: "c94a513f-1b70-4705-af6c-3f71cb0e4272"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:49:47 crc kubenswrapper[4903]: I0320 08:49:47.741676 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c94a513f-1b70-4705-af6c-3f71cb0e4272-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "c94a513f-1b70-4705-af6c-3f71cb0e4272" (UID: "c94a513f-1b70-4705-af6c-3f71cb0e4272"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:49:47 crc kubenswrapper[4903]: I0320 08:49:47.755135 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c94a513f-1b70-4705-af6c-3f71cb0e4272-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "c94a513f-1b70-4705-af6c-3f71cb0e4272" (UID: "c94a513f-1b70-4705-af6c-3f71cb0e4272"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:49:47 crc kubenswrapper[4903]: I0320 08:49:47.765265 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 20 08:49:47 crc kubenswrapper[4903]: I0320 08:49:47.772451 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 20 08:49:47 crc kubenswrapper[4903]: I0320 08:49:47.776813 4903 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c94a513f-1b70-4705-af6c-3f71cb0e4272-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:47 crc kubenswrapper[4903]: I0320 08:49:47.776871 4903 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c94a513f-1b70-4705-af6c-3f71cb0e4272-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:47 crc kubenswrapper[4903]: I0320 08:49:47.776882 4903 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c94a513f-1b70-4705-af6c-3f71cb0e4272-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:47 crc kubenswrapper[4903]: I0320 08:49:47.776892 4903 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c94a513f-1b70-4705-af6c-3f71cb0e4272-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:47 crc kubenswrapper[4903]: I0320 08:49:47.783308 4903 scope.go:117] "RemoveContainer" containerID="80ffdb9ab414e2251c93f568f37534af22de836fa31c4fbc8bf8f3ec7bf93804" Mar 20 08:49:47 crc kubenswrapper[4903]: E0320 08:49:47.784948 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"80ffdb9ab414e2251c93f568f37534af22de836fa31c4fbc8bf8f3ec7bf93804\": container with ID starting with 80ffdb9ab414e2251c93f568f37534af22de836fa31c4fbc8bf8f3ec7bf93804 not found: ID does not exist" containerID="80ffdb9ab414e2251c93f568f37534af22de836fa31c4fbc8bf8f3ec7bf93804" Mar 20 08:49:47 crc kubenswrapper[4903]: I0320 08:49:47.785081 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"80ffdb9ab414e2251c93f568f37534af22de836fa31c4fbc8bf8f3ec7bf93804"} err="failed to get container status \"80ffdb9ab414e2251c93f568f37534af22de836fa31c4fbc8bf8f3ec7bf93804\": rpc error: code = NotFound desc = could not find container \"80ffdb9ab414e2251c93f568f37534af22de836fa31c4fbc8bf8f3ec7bf93804\": container with ID starting with 80ffdb9ab414e2251c93f568f37534af22de836fa31c4fbc8bf8f3ec7bf93804 not found: ID does not exist" Mar 20 08:49:47 crc kubenswrapper[4903]: I0320 08:49:47.785167 4903 scope.go:117] "RemoveContainer" containerID="fe57b0018bbffb7366eaf34a9f7b2d185d56311f1f577d783cba8ee7a58367b9" Mar 20 08:49:47 crc kubenswrapper[4903]: E0320 08:49:47.785768 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fe57b0018bbffb7366eaf34a9f7b2d185d56311f1f577d783cba8ee7a58367b9\": container with ID starting with fe57b0018bbffb7366eaf34a9f7b2d185d56311f1f577d783cba8ee7a58367b9 not found: ID does not exist" containerID="fe57b0018bbffb7366eaf34a9f7b2d185d56311f1f577d783cba8ee7a58367b9" Mar 20 08:49:47 crc kubenswrapper[4903]: I0320 08:49:47.785820 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe57b0018bbffb7366eaf34a9f7b2d185d56311f1f577d783cba8ee7a58367b9"} 
err="failed to get container status \"fe57b0018bbffb7366eaf34a9f7b2d185d56311f1f577d783cba8ee7a58367b9\": rpc error: code = NotFound desc = could not find container \"fe57b0018bbffb7366eaf34a9f7b2d185d56311f1f577d783cba8ee7a58367b9\": container with ID starting with fe57b0018bbffb7366eaf34a9f7b2d185d56311f1f577d783cba8ee7a58367b9 not found: ID does not exist" Mar 20 08:49:47 crc kubenswrapper[4903]: I0320 08:49:47.785857 4903 scope.go:117] "RemoveContainer" containerID="9ecef6613b51007ecf1f1f800f64c787552051192488f5ff207b2c73e1a6013d" Mar 20 08:49:47 crc kubenswrapper[4903]: I0320 08:49:47.816165 4903 scope.go:117] "RemoveContainer" containerID="9ecef6613b51007ecf1f1f800f64c787552051192488f5ff207b2c73e1a6013d" Mar 20 08:49:47 crc kubenswrapper[4903]: E0320 08:49:47.816719 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9ecef6613b51007ecf1f1f800f64c787552051192488f5ff207b2c73e1a6013d\": container with ID starting with 9ecef6613b51007ecf1f1f800f64c787552051192488f5ff207b2c73e1a6013d not found: ID does not exist" containerID="9ecef6613b51007ecf1f1f800f64c787552051192488f5ff207b2c73e1a6013d" Mar 20 08:49:47 crc kubenswrapper[4903]: I0320 08:49:47.816751 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ecef6613b51007ecf1f1f800f64c787552051192488f5ff207b2c73e1a6013d"} err="failed to get container status \"9ecef6613b51007ecf1f1f800f64c787552051192488f5ff207b2c73e1a6013d\": rpc error: code = NotFound desc = could not find container \"9ecef6613b51007ecf1f1f800f64c787552051192488f5ff207b2c73e1a6013d\": container with ID starting with 9ecef6613b51007ecf1f1f800f64c787552051192488f5ff207b2c73e1a6013d not found: ID does not exist" Mar 20 08:49:47 crc kubenswrapper[4903]: I0320 08:49:47.816787 4903 scope.go:117] "RemoveContainer" containerID="bd2b54448e402fa4a59c36c7f69e6069fbf3f50b83543508e66b689857b68d04" Mar 20 08:49:47 crc kubenswrapper[4903]: I0320 08:49:47.848103 4903 scope.go:117] "RemoveContainer" containerID="79119e778548ae72a178bf5caebc6d4cbad9e6b178a28bc331be14a3707c0b31" Mar 20 08:49:47 crc kubenswrapper[4903]: I0320 08:49:47.925359 4903 scope.go:117] "RemoveContainer" containerID="bd2b54448e402fa4a59c36c7f69e6069fbf3f50b83543508e66b689857b68d04" Mar 20 08:49:47 crc kubenswrapper[4903]: E0320 08:49:47.925946 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bd2b54448e402fa4a59c36c7f69e6069fbf3f50b83543508e66b689857b68d04\": container with ID starting with bd2b54448e402fa4a59c36c7f69e6069fbf3f50b83543508e66b689857b68d04 not found: ID does not exist" containerID="bd2b54448e402fa4a59c36c7f69e6069fbf3f50b83543508e66b689857b68d04" Mar 20 08:49:47 crc kubenswrapper[4903]: I0320 08:49:47.926472 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd2b54448e402fa4a59c36c7f69e6069fbf3f50b83543508e66b689857b68d04"} err="failed to get container status \"bd2b54448e402fa4a59c36c7f69e6069fbf3f50b83543508e66b689857b68d04\": rpc error: code = NotFound desc = could not find container \"bd2b54448e402fa4a59c36c7f69e6069fbf3f50b83543508e66b689857b68d04\": container with ID starting with bd2b54448e402fa4a59c36c7f69e6069fbf3f50b83543508e66b689857b68d04 not found: ID does not exist" Mar 20 08:49:47 crc kubenswrapper[4903]: I0320 08:49:47.926511 4903 scope.go:117] "RemoveContainer" containerID="79119e778548ae72a178bf5caebc6d4cbad9e6b178a28bc331be14a3707c0b31" 
Mar 20 08:49:47 crc kubenswrapper[4903]: E0320 08:49:47.927122 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"79119e778548ae72a178bf5caebc6d4cbad9e6b178a28bc331be14a3707c0b31\": container with ID starting with 79119e778548ae72a178bf5caebc6d4cbad9e6b178a28bc331be14a3707c0b31 not found: ID does not exist" containerID="79119e778548ae72a178bf5caebc6d4cbad9e6b178a28bc331be14a3707c0b31" Mar 20 08:49:47 crc kubenswrapper[4903]: I0320 08:49:47.927143 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"79119e778548ae72a178bf5caebc6d4cbad9e6b178a28bc331be14a3707c0b31"} err="failed to get container status \"79119e778548ae72a178bf5caebc6d4cbad9e6b178a28bc331be14a3707c0b31\": rpc error: code = NotFound desc = could not find container \"79119e778548ae72a178bf5caebc6d4cbad9e6b178a28bc331be14a3707c0b31\": container with ID starting with 79119e778548ae72a178bf5caebc6d4cbad9e6b178a28bc331be14a3707c0b31 not found: ID does not exist" Mar 20 08:49:47 crc kubenswrapper[4903]: I0320 08:49:47.973352 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-7f784d4489-rxkmk"] Mar 20 08:49:47 crc kubenswrapper[4903]: I0320 08:49:47.987785 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-7f784d4489-rxkmk"] Mar 20 08:49:48 crc kubenswrapper[4903]: I0320 08:49:48.502684 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-6d9bbc5dbb-w66cc" Mar 20 08:49:48 crc kubenswrapper[4903]: I0320 08:49:48.581943 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 20 08:49:48 crc kubenswrapper[4903]: I0320 08:49:48.693407 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fhddf\" (UniqueName: \"kubernetes.io/projected/31664b72-a142-4656-88e8-84dd0cf18647-kube-api-access-fhddf\") pod \"31664b72-a142-4656-88e8-84dd0cf18647\" (UID: \"31664b72-a142-4656-88e8-84dd0cf18647\") " Mar 20 08:49:48 crc kubenswrapper[4903]: I0320 08:49:48.693482 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/31664b72-a142-4656-88e8-84dd0cf18647-config-data-custom\") pod \"31664b72-a142-4656-88e8-84dd0cf18647\" (UID: \"31664b72-a142-4656-88e8-84dd0cf18647\") " Mar 20 08:49:48 crc kubenswrapper[4903]: I0320 08:49:48.693510 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/31664b72-a142-4656-88e8-84dd0cf18647-logs\") pod \"31664b72-a142-4656-88e8-84dd0cf18647\" (UID: \"31664b72-a142-4656-88e8-84dd0cf18647\") " Mar 20 08:49:48 crc kubenswrapper[4903]: I0320 08:49:48.693529 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d570ab6f-6c5f-4255-b2ae-1966da262a0d-combined-ca-bundle\") pod \"d570ab6f-6c5f-4255-b2ae-1966da262a0d\" (UID: \"d570ab6f-6c5f-4255-b2ae-1966da262a0d\") " Mar 20 08:49:48 crc kubenswrapper[4903]: I0320 08:49:48.693568 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31664b72-a142-4656-88e8-84dd0cf18647-combined-ca-bundle\") pod \"31664b72-a142-4656-88e8-84dd0cf18647\" (UID: \"31664b72-a142-4656-88e8-84dd0cf18647\") " Mar 20 
08:49:48 crc kubenswrapper[4903]: I0320 08:49:48.693602 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ntdfg\" (UniqueName: \"kubernetes.io/projected/d570ab6f-6c5f-4255-b2ae-1966da262a0d-kube-api-access-ntdfg\") pod \"d570ab6f-6c5f-4255-b2ae-1966da262a0d\" (UID: \"d570ab6f-6c5f-4255-b2ae-1966da262a0d\") " Mar 20 08:49:48 crc kubenswrapper[4903]: I0320 08:49:48.693622 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d570ab6f-6c5f-4255-b2ae-1966da262a0d-config-data\") pod \"d570ab6f-6c5f-4255-b2ae-1966da262a0d\" (UID: \"d570ab6f-6c5f-4255-b2ae-1966da262a0d\") " Mar 20 08:49:48 crc kubenswrapper[4903]: I0320 08:49:48.693654 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31664b72-a142-4656-88e8-84dd0cf18647-config-data\") pod \"31664b72-a142-4656-88e8-84dd0cf18647\" (UID: \"31664b72-a142-4656-88e8-84dd0cf18647\") " Mar 20 08:49:48 crc kubenswrapper[4903]: I0320 08:49:48.695219 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/31664b72-a142-4656-88e8-84dd0cf18647-logs" (OuterVolumeSpecName: "logs") pod "31664b72-a142-4656-88e8-84dd0cf18647" (UID: "31664b72-a142-4656-88e8-84dd0cf18647"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 08:49:48 crc kubenswrapper[4903]: I0320 08:49:48.698854 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31664b72-a142-4656-88e8-84dd0cf18647-kube-api-access-fhddf" (OuterVolumeSpecName: "kube-api-access-fhddf") pod "31664b72-a142-4656-88e8-84dd0cf18647" (UID: "31664b72-a142-4656-88e8-84dd0cf18647"). InnerVolumeSpecName "kube-api-access-fhddf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:49:48 crc kubenswrapper[4903]: I0320 08:49:48.704086 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d570ab6f-6c5f-4255-b2ae-1966da262a0d-kube-api-access-ntdfg" (OuterVolumeSpecName: "kube-api-access-ntdfg") pod "d570ab6f-6c5f-4255-b2ae-1966da262a0d" (UID: "d570ab6f-6c5f-4255-b2ae-1966da262a0d"). InnerVolumeSpecName "kube-api-access-ntdfg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:49:48 crc kubenswrapper[4903]: I0320 08:49:48.706014 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31664b72-a142-4656-88e8-84dd0cf18647-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "31664b72-a142-4656-88e8-84dd0cf18647" (UID: "31664b72-a142-4656-88e8-84dd0cf18647"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:49:48 crc kubenswrapper[4903]: I0320 08:49:48.739448 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d570ab6f-6c5f-4255-b2ae-1966da262a0d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d570ab6f-6c5f-4255-b2ae-1966da262a0d" (UID: "d570ab6f-6c5f-4255-b2ae-1966da262a0d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:49:48 crc kubenswrapper[4903]: I0320 08:49:48.742535 4903 generic.go:334] "Generic (PLEG): container finished" podID="8005d467-6a20-4e68-b62f-65ad97a31812" containerID="d273844594da563a28c607f77733e2141f72daa05b9f296500d2582410c90a0f" exitCode=0 Mar 20 08:49:48 crc kubenswrapper[4903]: I0320 08:49:48.742603 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8005d467-6a20-4e68-b62f-65ad97a31812","Type":"ContainerDied","Data":"d273844594da563a28c607f77733e2141f72daa05b9f296500d2582410c90a0f"} Mar 20 08:49:48 crc kubenswrapper[4903]: I0320 08:49:48.743543 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d570ab6f-6c5f-4255-b2ae-1966da262a0d-config-data" (OuterVolumeSpecName: "config-data") pod "d570ab6f-6c5f-4255-b2ae-1966da262a0d" (UID: "d570ab6f-6c5f-4255-b2ae-1966da262a0d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:49:48 crc kubenswrapper[4903]: I0320 08:49:48.744843 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31664b72-a142-4656-88e8-84dd0cf18647-config-data" (OuterVolumeSpecName: "config-data") pod "31664b72-a142-4656-88e8-84dd0cf18647" (UID: "31664b72-a142-4656-88e8-84dd0cf18647"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:49:48 crc kubenswrapper[4903]: I0320 08:49:48.747736 4903 generic.go:334] "Generic (PLEG): container finished" podID="d570ab6f-6c5f-4255-b2ae-1966da262a0d" containerID="bdd5b2050c318bfa21071aa9a58547dc85552f9ed34b3d557a8244e9e4292bce" exitCode=0 Mar 20 08:49:48 crc kubenswrapper[4903]: I0320 08:49:48.747910 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"d570ab6f-6c5f-4255-b2ae-1966da262a0d","Type":"ContainerDied","Data":"bdd5b2050c318bfa21071aa9a58547dc85552f9ed34b3d557a8244e9e4292bce"} Mar 20 08:49:48 crc kubenswrapper[4903]: I0320 08:49:48.747989 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"d570ab6f-6c5f-4255-b2ae-1966da262a0d","Type":"ContainerDied","Data":"3b32e8e9be55850eb9afeb0b7dffe3030dac267dd2f1642dc8a6c6337513227e"} Mar 20 08:49:48 crc kubenswrapper[4903]: I0320 08:49:48.748014 4903 scope.go:117] "RemoveContainer" containerID="bdd5b2050c318bfa21071aa9a58547dc85552f9ed34b3d557a8244e9e4292bce" Mar 20 08:49:48 crc kubenswrapper[4903]: I0320 08:49:48.748287 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 20 08:49:48 crc kubenswrapper[4903]: I0320 08:49:48.749691 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31664b72-a142-4656-88e8-84dd0cf18647-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "31664b72-a142-4656-88e8-84dd0cf18647" (UID: "31664b72-a142-4656-88e8-84dd0cf18647"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:49:48 crc kubenswrapper[4903]: I0320 08:49:48.770964 4903 generic.go:334] "Generic (PLEG): container finished" podID="31664b72-a142-4656-88e8-84dd0cf18647" containerID="2ede8df05970b0155e78a7003a750f87a7d86f9769520e754016db5a2253cdb5" exitCode=0 Mar 20 08:49:48 crc kubenswrapper[4903]: I0320 08:49:48.771070 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6d9bbc5dbb-w66cc" event={"ID":"31664b72-a142-4656-88e8-84dd0cf18647","Type":"ContainerDied","Data":"2ede8df05970b0155e78a7003a750f87a7d86f9769520e754016db5a2253cdb5"} Mar 20 08:49:48 crc kubenswrapper[4903]: I0320 08:49:48.771100 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6d9bbc5dbb-w66cc" event={"ID":"31664b72-a142-4656-88e8-84dd0cf18647","Type":"ContainerDied","Data":"dcbf502e2e9ffd196348392f163aea90c5b00bdbbb5efbcca5b67eb7642dc7c8"} Mar 20 08:49:48 crc kubenswrapper[4903]: I0320 08:49:48.771173 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-6d9bbc5dbb-w66cc" Mar 20 08:49:48 crc kubenswrapper[4903]: I0320 08:49:48.796277 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ntdfg\" (UniqueName: \"kubernetes.io/projected/d570ab6f-6c5f-4255-b2ae-1966da262a0d-kube-api-access-ntdfg\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:48 crc kubenswrapper[4903]: I0320 08:49:48.796439 4903 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d570ab6f-6c5f-4255-b2ae-1966da262a0d-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:48 crc kubenswrapper[4903]: I0320 08:49:48.796491 4903 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31664b72-a142-4656-88e8-84dd0cf18647-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:48 crc kubenswrapper[4903]: I0320 08:49:48.796503 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fhddf\" (UniqueName: \"kubernetes.io/projected/31664b72-a142-4656-88e8-84dd0cf18647-kube-api-access-fhddf\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:48 crc kubenswrapper[4903]: I0320 08:49:48.796511 4903 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/31664b72-a142-4656-88e8-84dd0cf18647-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:48 crc kubenswrapper[4903]: I0320 08:49:48.796519 4903 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/31664b72-a142-4656-88e8-84dd0cf18647-logs\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:48 crc kubenswrapper[4903]: I0320 08:49:48.796527 4903 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d570ab6f-6c5f-4255-b2ae-1966da262a0d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:48 crc kubenswrapper[4903]: I0320 08:49:48.796535 4903 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31664b72-a142-4656-88e8-84dd0cf18647-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:48 crc kubenswrapper[4903]: I0320 08:49:48.821201 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 20 08:49:48 crc kubenswrapper[4903]: I0320 08:49:48.821746 4903 
scope.go:117] "RemoveContainer" containerID="bdd5b2050c318bfa21071aa9a58547dc85552f9ed34b3d557a8244e9e4292bce" Mar 20 08:49:48 crc kubenswrapper[4903]: E0320 08:49:48.822307 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bdd5b2050c318bfa21071aa9a58547dc85552f9ed34b3d557a8244e9e4292bce\": container with ID starting with bdd5b2050c318bfa21071aa9a58547dc85552f9ed34b3d557a8244e9e4292bce not found: ID does not exist" containerID="bdd5b2050c318bfa21071aa9a58547dc85552f9ed34b3d557a8244e9e4292bce" Mar 20 08:49:48 crc kubenswrapper[4903]: I0320 08:49:48.822349 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bdd5b2050c318bfa21071aa9a58547dc85552f9ed34b3d557a8244e9e4292bce"} err="failed to get container status \"bdd5b2050c318bfa21071aa9a58547dc85552f9ed34b3d557a8244e9e4292bce\": rpc error: code = NotFound desc = could not find container \"bdd5b2050c318bfa21071aa9a58547dc85552f9ed34b3d557a8244e9e4292bce\": container with ID starting with bdd5b2050c318bfa21071aa9a58547dc85552f9ed34b3d557a8244e9e4292bce not found: ID does not exist" Mar 20 08:49:48 crc kubenswrapper[4903]: I0320 08:49:48.822367 4903 scope.go:117] "RemoveContainer" containerID="2ede8df05970b0155e78a7003a750f87a7d86f9769520e754016db5a2253cdb5" Mar 20 08:49:48 crc kubenswrapper[4903]: I0320 08:49:48.828415 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 20 08:49:48 crc kubenswrapper[4903]: I0320 08:49:48.848475 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-6d9bbc5dbb-w66cc"] Mar 20 08:49:48 crc kubenswrapper[4903]: I0320 08:49:48.863704 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-keystone-listener-6d9bbc5dbb-w66cc"] Mar 20 08:49:48 crc kubenswrapper[4903]: I0320 08:49:48.871271 4903 scope.go:117] "RemoveContainer" containerID="34d3b31d46d789c0904ffebfdeb9f9aa758a26a4d97333874c5afa3adf9f72b8" Mar 20 08:49:48 crc kubenswrapper[4903]: I0320 08:49:48.912382 4903 scope.go:117] "RemoveContainer" containerID="2ede8df05970b0155e78a7003a750f87a7d86f9769520e754016db5a2253cdb5" Mar 20 08:49:48 crc kubenswrapper[4903]: E0320 08:49:48.913106 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2ede8df05970b0155e78a7003a750f87a7d86f9769520e754016db5a2253cdb5\": container with ID starting with 2ede8df05970b0155e78a7003a750f87a7d86f9769520e754016db5a2253cdb5 not found: ID does not exist" containerID="2ede8df05970b0155e78a7003a750f87a7d86f9769520e754016db5a2253cdb5" Mar 20 08:49:48 crc kubenswrapper[4903]: I0320 08:49:48.913145 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ede8df05970b0155e78a7003a750f87a7d86f9769520e754016db5a2253cdb5"} err="failed to get container status \"2ede8df05970b0155e78a7003a750f87a7d86f9769520e754016db5a2253cdb5\": rpc error: code = NotFound desc = could not find container \"2ede8df05970b0155e78a7003a750f87a7d86f9769520e754016db5a2253cdb5\": container with ID starting with 2ede8df05970b0155e78a7003a750f87a7d86f9769520e754016db5a2253cdb5 not found: ID does not exist" Mar 20 08:49:48 crc kubenswrapper[4903]: I0320 08:49:48.913173 4903 scope.go:117] "RemoveContainer" containerID="34d3b31d46d789c0904ffebfdeb9f9aa758a26a4d97333874c5afa3adf9f72b8" Mar 20 08:49:48 crc kubenswrapper[4903]: E0320 08:49:48.917109 4903 log.go:32] "ContainerStatus 
from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"34d3b31d46d789c0904ffebfdeb9f9aa758a26a4d97333874c5afa3adf9f72b8\": container with ID starting with 34d3b31d46d789c0904ffebfdeb9f9aa758a26a4d97333874c5afa3adf9f72b8 not found: ID does not exist" containerID="34d3b31d46d789c0904ffebfdeb9f9aa758a26a4d97333874c5afa3adf9f72b8" Mar 20 08:49:48 crc kubenswrapper[4903]: I0320 08:49:48.917171 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"34d3b31d46d789c0904ffebfdeb9f9aa758a26a4d97333874c5afa3adf9f72b8"} err="failed to get container status \"34d3b31d46d789c0904ffebfdeb9f9aa758a26a4d97333874c5afa3adf9f72b8\": rpc error: code = NotFound desc = could not find container \"34d3b31d46d789c0904ffebfdeb9f9aa758a26a4d97333874c5afa3adf9f72b8\": container with ID starting with 34d3b31d46d789c0904ffebfdeb9f9aa758a26a4d97333874c5afa3adf9f72b8 not found: ID does not exist" Mar 20 08:49:49 crc kubenswrapper[4903]: I0320 08:49:49.140386 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 08:49:49 crc kubenswrapper[4903]: I0320 08:49:49.308751 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8005d467-6a20-4e68-b62f-65ad97a31812-log-httpd\") pod \"8005d467-6a20-4e68-b62f-65ad97a31812\" (UID: \"8005d467-6a20-4e68-b62f-65ad97a31812\") " Mar 20 08:49:49 crc kubenswrapper[4903]: I0320 08:49:49.308809 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8005d467-6a20-4e68-b62f-65ad97a31812-sg-core-conf-yaml\") pod \"8005d467-6a20-4e68-b62f-65ad97a31812\" (UID: \"8005d467-6a20-4e68-b62f-65ad97a31812\") " Mar 20 08:49:49 crc kubenswrapper[4903]: I0320 08:49:49.308881 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hcvkq\" (UniqueName: \"kubernetes.io/projected/8005d467-6a20-4e68-b62f-65ad97a31812-kube-api-access-hcvkq\") pod \"8005d467-6a20-4e68-b62f-65ad97a31812\" (UID: \"8005d467-6a20-4e68-b62f-65ad97a31812\") " Mar 20 08:49:49 crc kubenswrapper[4903]: I0320 08:49:49.308932 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/8005d467-6a20-4e68-b62f-65ad97a31812-ceilometer-tls-certs\") pod \"8005d467-6a20-4e68-b62f-65ad97a31812\" (UID: \"8005d467-6a20-4e68-b62f-65ad97a31812\") " Mar 20 08:49:49 crc kubenswrapper[4903]: I0320 08:49:49.308955 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8005d467-6a20-4e68-b62f-65ad97a31812-combined-ca-bundle\") pod \"8005d467-6a20-4e68-b62f-65ad97a31812\" (UID: \"8005d467-6a20-4e68-b62f-65ad97a31812\") " Mar 20 08:49:49 crc kubenswrapper[4903]: I0320 08:49:49.309015 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8005d467-6a20-4e68-b62f-65ad97a31812-run-httpd\") pod \"8005d467-6a20-4e68-b62f-65ad97a31812\" (UID: \"8005d467-6a20-4e68-b62f-65ad97a31812\") " Mar 20 08:49:49 crc kubenswrapper[4903]: I0320 08:49:49.309109 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8005d467-6a20-4e68-b62f-65ad97a31812-scripts\") pod 
\"8005d467-6a20-4e68-b62f-65ad97a31812\" (UID: \"8005d467-6a20-4e68-b62f-65ad97a31812\") " Mar 20 08:49:49 crc kubenswrapper[4903]: I0320 08:49:49.309132 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8005d467-6a20-4e68-b62f-65ad97a31812-config-data\") pod \"8005d467-6a20-4e68-b62f-65ad97a31812\" (UID: \"8005d467-6a20-4e68-b62f-65ad97a31812\") " Mar 20 08:49:49 crc kubenswrapper[4903]: I0320 08:49:49.310396 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8005d467-6a20-4e68-b62f-65ad97a31812-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "8005d467-6a20-4e68-b62f-65ad97a31812" (UID: "8005d467-6a20-4e68-b62f-65ad97a31812"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 08:49:49 crc kubenswrapper[4903]: I0320 08:49:49.310435 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8005d467-6a20-4e68-b62f-65ad97a31812-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "8005d467-6a20-4e68-b62f-65ad97a31812" (UID: "8005d467-6a20-4e68-b62f-65ad97a31812"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 08:49:49 crc kubenswrapper[4903]: I0320 08:49:49.316573 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8005d467-6a20-4e68-b62f-65ad97a31812-kube-api-access-hcvkq" (OuterVolumeSpecName: "kube-api-access-hcvkq") pod "8005d467-6a20-4e68-b62f-65ad97a31812" (UID: "8005d467-6a20-4e68-b62f-65ad97a31812"). InnerVolumeSpecName "kube-api-access-hcvkq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:49:49 crc kubenswrapper[4903]: I0320 08:49:49.316750 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8005d467-6a20-4e68-b62f-65ad97a31812-scripts" (OuterVolumeSpecName: "scripts") pod "8005d467-6a20-4e68-b62f-65ad97a31812" (UID: "8005d467-6a20-4e68-b62f-65ad97a31812"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:49:49 crc kubenswrapper[4903]: I0320 08:49:49.347529 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8005d467-6a20-4e68-b62f-65ad97a31812-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "8005d467-6a20-4e68-b62f-65ad97a31812" (UID: "8005d467-6a20-4e68-b62f-65ad97a31812"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:49:49 crc kubenswrapper[4903]: I0320 08:49:49.385828 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8005d467-6a20-4e68-b62f-65ad97a31812-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8005d467-6a20-4e68-b62f-65ad97a31812" (UID: "8005d467-6a20-4e68-b62f-65ad97a31812"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:49:49 crc kubenswrapper[4903]: I0320 08:49:49.386863 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8005d467-6a20-4e68-b62f-65ad97a31812-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "8005d467-6a20-4e68-b62f-65ad97a31812" (UID: "8005d467-6a20-4e68-b62f-65ad97a31812"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:49:49 crc kubenswrapper[4903]: I0320 08:49:49.411050 4903 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/8005d467-6a20-4e68-b62f-65ad97a31812-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:49 crc kubenswrapper[4903]: I0320 08:49:49.411092 4903 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8005d467-6a20-4e68-b62f-65ad97a31812-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:49 crc kubenswrapper[4903]: I0320 08:49:49.411103 4903 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8005d467-6a20-4e68-b62f-65ad97a31812-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:49 crc kubenswrapper[4903]: I0320 08:49:49.411113 4903 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8005d467-6a20-4e68-b62f-65ad97a31812-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:49 crc kubenswrapper[4903]: I0320 08:49:49.411128 4903 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8005d467-6a20-4e68-b62f-65ad97a31812-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:49 crc kubenswrapper[4903]: I0320 08:49:49.411141 4903 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8005d467-6a20-4e68-b62f-65ad97a31812-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:49 crc kubenswrapper[4903]: I0320 08:49:49.411155 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hcvkq\" (UniqueName: \"kubernetes.io/projected/8005d467-6a20-4e68-b62f-65ad97a31812-kube-api-access-hcvkq\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:49 crc kubenswrapper[4903]: I0320 08:49:49.424065 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8005d467-6a20-4e68-b62f-65ad97a31812-config-data" (OuterVolumeSpecName: "config-data") pod "8005d467-6a20-4e68-b62f-65ad97a31812" (UID: "8005d467-6a20-4e68-b62f-65ad97a31812"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:49:49 crc kubenswrapper[4903]: I0320 08:49:49.501910 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31664b72-a142-4656-88e8-84dd0cf18647" path="/var/lib/kubelet/pods/31664b72-a142-4656-88e8-84dd0cf18647/volumes" Mar 20 08:49:49 crc kubenswrapper[4903]: I0320 08:49:49.503011 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="888a3fd9-01f8-47b3-b1bb-f2b8b6b96509" path="/var/lib/kubelet/pods/888a3fd9-01f8-47b3-b1bb-f2b8b6b96509/volumes" Mar 20 08:49:49 crc kubenswrapper[4903]: I0320 08:49:49.503910 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c94a513f-1b70-4705-af6c-3f71cb0e4272" path="/var/lib/kubelet/pods/c94a513f-1b70-4705-af6c-3f71cb0e4272/volumes" Mar 20 08:49:49 crc kubenswrapper[4903]: I0320 08:49:49.505198 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d570ab6f-6c5f-4255-b2ae-1966da262a0d" path="/var/lib/kubelet/pods/d570ab6f-6c5f-4255-b2ae-1966da262a0d/volumes" Mar 20 08:49:49 crc kubenswrapper[4903]: I0320 08:49:49.506138 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="df937948-08c4-447c-9450-07221ce76552" path="/var/lib/kubelet/pods/df937948-08c4-447c-9450-07221ce76552/volumes" Mar 20 08:49:49 crc kubenswrapper[4903]: I0320 08:49:49.513683 4903 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8005d467-6a20-4e68-b62f-65ad97a31812-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:49 crc kubenswrapper[4903]: I0320 08:49:49.608233 4903 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/memcached-0" podUID="34de9984-0547-4ba1-ae7d-5f8cc9196c26" containerName="memcached" probeResult="failure" output="dial tcp 10.217.0.106:11211: i/o timeout" Mar 20 08:49:49 crc kubenswrapper[4903]: I0320 08:49:49.788919 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8005d467-6a20-4e68-b62f-65ad97a31812","Type":"ContainerDied","Data":"8fff4d5535d1d940f4f4c451c78082ae32c553a62eb876a001727b4ad53184ed"} Mar 20 08:49:49 crc kubenswrapper[4903]: I0320 08:49:49.789006 4903 scope.go:117] "RemoveContainer" containerID="fe129b49b118ce4ae8579680e5a1ff682ff67dcbd9c4f54a13ce05fb9eced8cd" Mar 20 08:49:49 crc kubenswrapper[4903]: I0320 08:49:49.790354 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 08:49:49 crc kubenswrapper[4903]: I0320 08:49:49.828098 4903 scope.go:117] "RemoveContainer" containerID="3eb344be279e7505b3a1ee366d11602b2711ebf94ecb9c7d600238fd0ec65c58" Mar 20 08:49:49 crc kubenswrapper[4903]: I0320 08:49:49.851185 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 08:49:49 crc kubenswrapper[4903]: I0320 08:49:49.857335 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 20 08:49:49 crc kubenswrapper[4903]: I0320 08:49:49.886855 4903 scope.go:117] "RemoveContainer" containerID="d273844594da563a28c607f77733e2141f72daa05b9f296500d2582410c90a0f" Mar 20 08:49:49 crc kubenswrapper[4903]: I0320 08:49:49.913319 4903 scope.go:117] "RemoveContainer" containerID="a8e86b614ac7f5338fb12a670fd38985e9a2c2f7e4f4a111a46686a3317b5790" Mar 20 08:49:51 crc kubenswrapper[4903]: I0320 08:49:51.500845 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8005d467-6a20-4e68-b62f-65ad97a31812" path="/var/lib/kubelet/pods/8005d467-6a20-4e68-b62f-65ad97a31812/volumes" Mar 20 08:49:52 crc kubenswrapper[4903]: E0320 08:49:52.074407 4903 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ec70b5d9df40c275990e6b10b611cf7497dcf53a4d72ca11805f6bf11c5bc677 is running failed: container process not found" containerID="ec70b5d9df40c275990e6b10b611cf7497dcf53a4d72ca11805f6bf11c5bc677" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 20 08:49:52 crc kubenswrapper[4903]: E0320 08:49:52.075263 4903 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="cbe32b8386815ecd924f4abbb568339471c7efea68305c548d14baa5ac7f2324" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 20 08:49:52 crc kubenswrapper[4903]: E0320 08:49:52.075459 4903 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ec70b5d9df40c275990e6b10b611cf7497dcf53a4d72ca11805f6bf11c5bc677 is running failed: container process not found" containerID="ec70b5d9df40c275990e6b10b611cf7497dcf53a4d72ca11805f6bf11c5bc677" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 20 08:49:52 crc kubenswrapper[4903]: E0320 08:49:52.076183 4903 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ec70b5d9df40c275990e6b10b611cf7497dcf53a4d72ca11805f6bf11c5bc677 is running failed: container process not found" containerID="ec70b5d9df40c275990e6b10b611cf7497dcf53a4d72ca11805f6bf11c5bc677" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 20 08:49:52 crc kubenswrapper[4903]: E0320 08:49:52.076218 4903 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ec70b5d9df40c275990e6b10b611cf7497dcf53a4d72ca11805f6bf11c5bc677 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-chrhv" podUID="d69915e4-0df8-4d83-b096-962eadc1883f" containerName="ovsdb-server" Mar 20 08:49:52 crc kubenswrapper[4903]: E0320 08:49:52.079402 4903 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: 
code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="cbe32b8386815ecd924f4abbb568339471c7efea68305c548d14baa5ac7f2324" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 20 08:49:52 crc kubenswrapper[4903]: E0320 08:49:52.086174 4903 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="cbe32b8386815ecd924f4abbb568339471c7efea68305c548d14baa5ac7f2324" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 20 08:49:52 crc kubenswrapper[4903]: E0320 08:49:52.086225 4903 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-chrhv" podUID="d69915e4-0df8-4d83-b096-962eadc1883f" containerName="ovs-vswitchd" Mar 20 08:49:53 crc kubenswrapper[4903]: E0320 08:49:53.292472 4903 projected.go:288] Couldn't get configMap openstack/swift-storage-config-data: configmap "swift-storage-config-data" not found Mar 20 08:49:53 crc kubenswrapper[4903]: E0320 08:49:53.292518 4903 projected.go:263] Couldn't get secret openstack/swift-conf: secret "swift-conf" not found Mar 20 08:49:53 crc kubenswrapper[4903]: E0320 08:49:53.292528 4903 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 20 08:49:53 crc kubenswrapper[4903]: E0320 08:49:53.292545 4903 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: [configmap "swift-storage-config-data" not found, secret "swift-conf" not found, configmap "swift-ring-files" not found] Mar 20 08:49:53 crc kubenswrapper[4903]: E0320 08:49:53.292624 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ccedd84e-d0d0-40b8-812c-3a57b41aee98-etc-swift podName:ccedd84e-d0d0-40b8-812c-3a57b41aee98 nodeName:}" failed. No retries permitted until 2026-03-20 08:50:09.292601102 +0000 UTC m=+1634.509501417 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/ccedd84e-d0d0-40b8-812c-3a57b41aee98-etc-swift") pod "swift-storage-0" (UID: "ccedd84e-d0d0-40b8-812c-3a57b41aee98") : [configmap "swift-storage-config-data" not found, secret "swift-conf" not found, configmap "swift-ring-files" not found] Mar 20 08:49:54 crc kubenswrapper[4903]: I0320 08:49:54.849105 4903 generic.go:334] "Generic (PLEG): container finished" podID="0790ef46-b8b6-4d5e-98a8-06319c232264" containerID="492d0160f992341b5c4f630fecea542dfc91408228e35e56c2fb187fb62dd5de" exitCode=0 Mar 20 08:49:54 crc kubenswrapper[4903]: I0320 08:49:54.849292 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-fbdb5-dn8w8" event={"ID":"0790ef46-b8b6-4d5e-98a8-06319c232264","Type":"ContainerDied","Data":"492d0160f992341b5c4f630fecea542dfc91408228e35e56c2fb187fb62dd5de"} Mar 20 08:49:55 crc kubenswrapper[4903]: I0320 08:49:55.310319 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-fbdb5-dn8w8" Mar 20 08:49:55 crc kubenswrapper[4903]: I0320 08:49:55.434359 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0790ef46-b8b6-4d5e-98a8-06319c232264-ovndb-tls-certs\") pod \"0790ef46-b8b6-4d5e-98a8-06319c232264\" (UID: \"0790ef46-b8b6-4d5e-98a8-06319c232264\") " Mar 20 08:49:55 crc kubenswrapper[4903]: I0320 08:49:55.435206 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0790ef46-b8b6-4d5e-98a8-06319c232264-internal-tls-certs\") pod \"0790ef46-b8b6-4d5e-98a8-06319c232264\" (UID: \"0790ef46-b8b6-4d5e-98a8-06319c232264\") " Mar 20 08:49:55 crc kubenswrapper[4903]: I0320 08:49:55.435266 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-smrlj\" (UniqueName: \"kubernetes.io/projected/0790ef46-b8b6-4d5e-98a8-06319c232264-kube-api-access-smrlj\") pod \"0790ef46-b8b6-4d5e-98a8-06319c232264\" (UID: \"0790ef46-b8b6-4d5e-98a8-06319c232264\") " Mar 20 08:49:55 crc kubenswrapper[4903]: I0320 08:49:55.435302 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0790ef46-b8b6-4d5e-98a8-06319c232264-combined-ca-bundle\") pod \"0790ef46-b8b6-4d5e-98a8-06319c232264\" (UID: \"0790ef46-b8b6-4d5e-98a8-06319c232264\") " Mar 20 08:49:55 crc kubenswrapper[4903]: I0320 08:49:55.435437 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/0790ef46-b8b6-4d5e-98a8-06319c232264-config\") pod \"0790ef46-b8b6-4d5e-98a8-06319c232264\" (UID: \"0790ef46-b8b6-4d5e-98a8-06319c232264\") " Mar 20 08:49:55 crc kubenswrapper[4903]: I0320 08:49:55.435457 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0790ef46-b8b6-4d5e-98a8-06319c232264-public-tls-certs\") pod \"0790ef46-b8b6-4d5e-98a8-06319c232264\" (UID: \"0790ef46-b8b6-4d5e-98a8-06319c232264\") " Mar 20 08:49:55 crc kubenswrapper[4903]: I0320 08:49:55.435532 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/0790ef46-b8b6-4d5e-98a8-06319c232264-httpd-config\") pod \"0790ef46-b8b6-4d5e-98a8-06319c232264\" (UID: \"0790ef46-b8b6-4d5e-98a8-06319c232264\") " Mar 20 08:49:55 crc kubenswrapper[4903]: I0320 08:49:55.442210 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0790ef46-b8b6-4d5e-98a8-06319c232264-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "0790ef46-b8b6-4d5e-98a8-06319c232264" (UID: "0790ef46-b8b6-4d5e-98a8-06319c232264"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:49:55 crc kubenswrapper[4903]: I0320 08:49:55.448277 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0790ef46-b8b6-4d5e-98a8-06319c232264-kube-api-access-smrlj" (OuterVolumeSpecName: "kube-api-access-smrlj") pod "0790ef46-b8b6-4d5e-98a8-06319c232264" (UID: "0790ef46-b8b6-4d5e-98a8-06319c232264"). InnerVolumeSpecName "kube-api-access-smrlj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:49:55 crc kubenswrapper[4903]: I0320 08:49:55.480261 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0790ef46-b8b6-4d5e-98a8-06319c232264-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "0790ef46-b8b6-4d5e-98a8-06319c232264" (UID: "0790ef46-b8b6-4d5e-98a8-06319c232264"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:49:55 crc kubenswrapper[4903]: I0320 08:49:55.483507 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0790ef46-b8b6-4d5e-98a8-06319c232264-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0790ef46-b8b6-4d5e-98a8-06319c232264" (UID: "0790ef46-b8b6-4d5e-98a8-06319c232264"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:49:55 crc kubenswrapper[4903]: I0320 08:49:55.496336 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0790ef46-b8b6-4d5e-98a8-06319c232264-config" (OuterVolumeSpecName: "config") pod "0790ef46-b8b6-4d5e-98a8-06319c232264" (UID: "0790ef46-b8b6-4d5e-98a8-06319c232264"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:49:55 crc kubenswrapper[4903]: I0320 08:49:55.500867 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0790ef46-b8b6-4d5e-98a8-06319c232264-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "0790ef46-b8b6-4d5e-98a8-06319c232264" (UID: "0790ef46-b8b6-4d5e-98a8-06319c232264"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:49:55 crc kubenswrapper[4903]: I0320 08:49:55.518726 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0790ef46-b8b6-4d5e-98a8-06319c232264-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "0790ef46-b8b6-4d5e-98a8-06319c232264" (UID: "0790ef46-b8b6-4d5e-98a8-06319c232264"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:49:55 crc kubenswrapper[4903]: I0320 08:49:55.538347 4903 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/0790ef46-b8b6-4d5e-98a8-06319c232264-config\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:55 crc kubenswrapper[4903]: I0320 08:49:55.538387 4903 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0790ef46-b8b6-4d5e-98a8-06319c232264-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:55 crc kubenswrapper[4903]: I0320 08:49:55.538399 4903 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/0790ef46-b8b6-4d5e-98a8-06319c232264-httpd-config\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:55 crc kubenswrapper[4903]: I0320 08:49:55.538410 4903 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0790ef46-b8b6-4d5e-98a8-06319c232264-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:55 crc kubenswrapper[4903]: I0320 08:49:55.538419 4903 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0790ef46-b8b6-4d5e-98a8-06319c232264-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:55 crc kubenswrapper[4903]: I0320 08:49:55.538429 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-smrlj\" (UniqueName: \"kubernetes.io/projected/0790ef46-b8b6-4d5e-98a8-06319c232264-kube-api-access-smrlj\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:55 crc kubenswrapper[4903]: I0320 08:49:55.538439 4903 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0790ef46-b8b6-4d5e-98a8-06319c232264-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 08:49:55 crc kubenswrapper[4903]: I0320 08:49:55.862131 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-fbdb5-dn8w8" event={"ID":"0790ef46-b8b6-4d5e-98a8-06319c232264","Type":"ContainerDied","Data":"8e4f58e3b4585e11586634c29cf539f6d18961de4f85aa15be6d736846a8f499"} Mar 20 08:49:55 crc kubenswrapper[4903]: I0320 08:49:55.862219 4903 scope.go:117] "RemoveContainer" containerID="e720dfe8b4aae682033533f903e6ba534df6bbb629a2136de2d974617f1cbb66" Mar 20 08:49:55 crc kubenswrapper[4903]: I0320 08:49:55.862288 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-fbdb5-dn8w8" Mar 20 08:49:55 crc kubenswrapper[4903]: I0320 08:49:55.899652 4903 scope.go:117] "RemoveContainer" containerID="492d0160f992341b5c4f630fecea542dfc91408228e35e56c2fb187fb62dd5de" Mar 20 08:49:55 crc kubenswrapper[4903]: I0320 08:49:55.919353 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-fbdb5-dn8w8"] Mar 20 08:49:55 crc kubenswrapper[4903]: I0320 08:49:55.930986 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-fbdb5-dn8w8"] Mar 20 08:49:57 crc kubenswrapper[4903]: E0320 08:49:57.074009 4903 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ec70b5d9df40c275990e6b10b611cf7497dcf53a4d72ca11805f6bf11c5bc677 is running failed: container process not found" containerID="ec70b5d9df40c275990e6b10b611cf7497dcf53a4d72ca11805f6bf11c5bc677" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 20 08:49:57 crc kubenswrapper[4903]: E0320 08:49:57.075277 4903 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ec70b5d9df40c275990e6b10b611cf7497dcf53a4d72ca11805f6bf11c5bc677 is running failed: container process not found" containerID="ec70b5d9df40c275990e6b10b611cf7497dcf53a4d72ca11805f6bf11c5bc677" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 20 08:49:57 crc kubenswrapper[4903]: E0320 08:49:57.075821 4903 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ec70b5d9df40c275990e6b10b611cf7497dcf53a4d72ca11805f6bf11c5bc677 is running failed: container process not found" containerID="ec70b5d9df40c275990e6b10b611cf7497dcf53a4d72ca11805f6bf11c5bc677" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 20 08:49:57 crc kubenswrapper[4903]: E0320 08:49:57.075923 4903 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ec70b5d9df40c275990e6b10b611cf7497dcf53a4d72ca11805f6bf11c5bc677 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-chrhv" podUID="d69915e4-0df8-4d83-b096-962eadc1883f" containerName="ovsdb-server" Mar 20 08:49:57 crc kubenswrapper[4903]: E0320 08:49:57.076263 4903 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="cbe32b8386815ecd924f4abbb568339471c7efea68305c548d14baa5ac7f2324" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 20 08:49:57 crc kubenswrapper[4903]: E0320 08:49:57.077994 4903 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="cbe32b8386815ecd924f4abbb568339471c7efea68305c548d14baa5ac7f2324" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 20 08:49:57 crc kubenswrapper[4903]: E0320 08:49:57.080616 4903 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" 
containerID="cbe32b8386815ecd924f4abbb568339471c7efea68305c548d14baa5ac7f2324" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 20 08:49:57 crc kubenswrapper[4903]: E0320 08:49:57.080670 4903 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-chrhv" podUID="d69915e4-0df8-4d83-b096-962eadc1883f" containerName="ovs-vswitchd" Mar 20 08:49:57 crc kubenswrapper[4903]: I0320 08:49:57.508472 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0790ef46-b8b6-4d5e-98a8-06319c232264" path="/var/lib/kubelet/pods/0790ef46-b8b6-4d5e-98a8-06319c232264/volumes" Mar 20 08:50:00 crc kubenswrapper[4903]: I0320 08:50:00.166240 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566610-lhr5n"] Mar 20 08:50:00 crc kubenswrapper[4903]: E0320 08:50:00.167335 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06e43a58-7c5f-49ac-a1f4-f2eddfd28c6b" containerName="openstack-network-exporter" Mar 20 08:50:00 crc kubenswrapper[4903]: I0320 08:50:00.167364 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="06e43a58-7c5f-49ac-a1f4-f2eddfd28c6b" containerName="openstack-network-exporter" Mar 20 08:50:00 crc kubenswrapper[4903]: E0320 08:50:00.167391 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="888a3fd9-01f8-47b3-b1bb-f2b8b6b96509" containerName="rabbitmq" Mar 20 08:50:00 crc kubenswrapper[4903]: I0320 08:50:00.167404 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="888a3fd9-01f8-47b3-b1bb-f2b8b6b96509" containerName="rabbitmq" Mar 20 08:50:00 crc kubenswrapper[4903]: E0320 08:50:00.167422 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31664b72-a142-4656-88e8-84dd0cf18647" containerName="barbican-keystone-listener-log" Mar 20 08:50:00 crc kubenswrapper[4903]: I0320 08:50:00.167436 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="31664b72-a142-4656-88e8-84dd0cf18647" containerName="barbican-keystone-listener-log" Mar 20 08:50:00 crc kubenswrapper[4903]: E0320 08:50:00.167450 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1dcd96a1-71bb-480c-8387-0fca4d17bf33" containerName="placement-log" Mar 20 08:50:00 crc kubenswrapper[4903]: I0320 08:50:00.167463 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="1dcd96a1-71bb-480c-8387-0fca4d17bf33" containerName="placement-log" Mar 20 08:50:00 crc kubenswrapper[4903]: E0320 08:50:00.167487 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50b5adcb-aed8-4cff-b3ec-02721df3937d" containerName="cinder-api-log" Mar 20 08:50:00 crc kubenswrapper[4903]: I0320 08:50:00.167499 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="50b5adcb-aed8-4cff-b3ec-02721df3937d" containerName="cinder-api-log" Mar 20 08:50:00 crc kubenswrapper[4903]: E0320 08:50:00.167513 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4e1dbd8-6ecd-4cd7-910b-910cf6a45679" containerName="glance-log" Mar 20 08:50:00 crc kubenswrapper[4903]: I0320 08:50:00.167526 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4e1dbd8-6ecd-4cd7-910b-910cf6a45679" containerName="glance-log" Mar 20 08:50:00 crc kubenswrapper[4903]: E0320 08:50:00.167540 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="877f943b-808c-435e-a5cf-bda8ea0a5d15" containerName="nova-api-api" Mar 20 
08:50:00 crc kubenswrapper[4903]: I0320 08:50:00.167552 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="877f943b-808c-435e-a5cf-bda8ea0a5d15" containerName="nova-api-api" Mar 20 08:50:00 crc kubenswrapper[4903]: E0320 08:50:00.167566 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df937948-08c4-447c-9450-07221ce76552" containerName="setup-container" Mar 20 08:50:00 crc kubenswrapper[4903]: I0320 08:50:00.167578 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="df937948-08c4-447c-9450-07221ce76552" containerName="setup-container" Mar 20 08:50:00 crc kubenswrapper[4903]: E0320 08:50:00.167594 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50b5adcb-aed8-4cff-b3ec-02721df3937d" containerName="cinder-api" Mar 20 08:50:00 crc kubenswrapper[4903]: I0320 08:50:00.167606 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="50b5adcb-aed8-4cff-b3ec-02721df3937d" containerName="cinder-api" Mar 20 08:50:00 crc kubenswrapper[4903]: E0320 08:50:00.167628 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8005d467-6a20-4e68-b62f-65ad97a31812" containerName="proxy-httpd" Mar 20 08:50:00 crc kubenswrapper[4903]: I0320 08:50:00.167640 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="8005d467-6a20-4e68-b62f-65ad97a31812" containerName="proxy-httpd" Mar 20 08:50:00 crc kubenswrapper[4903]: E0320 08:50:00.167667 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8005d467-6a20-4e68-b62f-65ad97a31812" containerName="sg-core" Mar 20 08:50:00 crc kubenswrapper[4903]: I0320 08:50:00.167681 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="8005d467-6a20-4e68-b62f-65ad97a31812" containerName="sg-core" Mar 20 08:50:00 crc kubenswrapper[4903]: E0320 08:50:00.167697 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34de9984-0547-4ba1-ae7d-5f8cc9196c26" containerName="memcached" Mar 20 08:50:00 crc kubenswrapper[4903]: I0320 08:50:00.167709 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="34de9984-0547-4ba1-ae7d-5f8cc9196c26" containerName="memcached" Mar 20 08:50:00 crc kubenswrapper[4903]: E0320 08:50:00.167731 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c2cbf77-9511-47a0-9bc7-dd5eba7d8d8a" containerName="cinder-scheduler" Mar 20 08:50:00 crc kubenswrapper[4903]: I0320 08:50:00.167743 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c2cbf77-9511-47a0-9bc7-dd5eba7d8d8a" containerName="cinder-scheduler" Mar 20 08:50:00 crc kubenswrapper[4903]: E0320 08:50:00.167766 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06e43a58-7c5f-49ac-a1f4-f2eddfd28c6b" containerName="ovn-northd" Mar 20 08:50:00 crc kubenswrapper[4903]: I0320 08:50:00.167804 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="06e43a58-7c5f-49ac-a1f4-f2eddfd28c6b" containerName="ovn-northd" Mar 20 08:50:00 crc kubenswrapper[4903]: E0320 08:50:00.167826 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d570ab6f-6c5f-4255-b2ae-1966da262a0d" containerName="nova-cell0-conductor-conductor" Mar 20 08:50:00 crc kubenswrapper[4903]: I0320 08:50:00.167841 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="d570ab6f-6c5f-4255-b2ae-1966da262a0d" containerName="nova-cell0-conductor-conductor" Mar 20 08:50:00 crc kubenswrapper[4903]: E0320 08:50:00.167858 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f5f160c-29e2-43d0-bb55-6969904b3a4e" containerName="nova-metadata-log" Mar 20 
08:50:00 crc kubenswrapper[4903]: I0320 08:50:00.167870 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f5f160c-29e2-43d0-bb55-6969904b3a4e" containerName="nova-metadata-log" Mar 20 08:50:00 crc kubenswrapper[4903]: E0320 08:50:00.167886 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8005d467-6a20-4e68-b62f-65ad97a31812" containerName="ceilometer-notification-agent" Mar 20 08:50:00 crc kubenswrapper[4903]: I0320 08:50:00.167900 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="8005d467-6a20-4e68-b62f-65ad97a31812" containerName="ceilometer-notification-agent" Mar 20 08:50:00 crc kubenswrapper[4903]: E0320 08:50:00.167917 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f5f160c-29e2-43d0-bb55-6969904b3a4e" containerName="nova-metadata-metadata" Mar 20 08:50:00 crc kubenswrapper[4903]: I0320 08:50:00.167929 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f5f160c-29e2-43d0-bb55-6969904b3a4e" containerName="nova-metadata-metadata" Mar 20 08:50:00 crc kubenswrapper[4903]: E0320 08:50:00.167946 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8005d467-6a20-4e68-b62f-65ad97a31812" containerName="ceilometer-central-agent" Mar 20 08:50:00 crc kubenswrapper[4903]: I0320 08:50:00.167958 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="8005d467-6a20-4e68-b62f-65ad97a31812" containerName="ceilometer-central-agent" Mar 20 08:50:00 crc kubenswrapper[4903]: E0320 08:50:00.167974 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8bb60e5-f963-44ed-9e5e-76ca6da5c723" containerName="kube-state-metrics" Mar 20 08:50:00 crc kubenswrapper[4903]: I0320 08:50:00.167986 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8bb60e5-f963-44ed-9e5e-76ca6da5c723" containerName="kube-state-metrics" Mar 20 08:50:00 crc kubenswrapper[4903]: E0320 08:50:00.168010 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96a68183-d440-4f89-887d-d2441d00c8e4" containerName="mysql-bootstrap" Mar 20 08:50:00 crc kubenswrapper[4903]: I0320 08:50:00.168024 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="96a68183-d440-4f89-887d-d2441d00c8e4" containerName="mysql-bootstrap" Mar 20 08:50:00 crc kubenswrapper[4903]: E0320 08:50:00.168073 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="877f943b-808c-435e-a5cf-bda8ea0a5d15" containerName="nova-api-log" Mar 20 08:50:00 crc kubenswrapper[4903]: I0320 08:50:00.168085 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="877f943b-808c-435e-a5cf-bda8ea0a5d15" containerName="nova-api-log" Mar 20 08:50:00 crc kubenswrapper[4903]: E0320 08:50:00.168111 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c33b2cd-e705-41cd-9e59-3dcbb0a55829" containerName="glance-httpd" Mar 20 08:50:00 crc kubenswrapper[4903]: I0320 08:50:00.168123 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c33b2cd-e705-41cd-9e59-3dcbb0a55829" containerName="glance-httpd" Mar 20 08:50:00 crc kubenswrapper[4903]: E0320 08:50:00.168167 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc127483-5a42-4eea-8b8c-8a1382dced05" containerName="nova-scheduler-scheduler" Mar 20 08:50:00 crc kubenswrapper[4903]: I0320 08:50:00.168180 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc127483-5a42-4eea-8b8c-8a1382dced05" containerName="nova-scheduler-scheduler" Mar 20 08:50:00 crc kubenswrapper[4903]: E0320 08:50:00.168199 4903 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="bae081ad-c434-4d1b-ae9f-cc8c5b6e2f33" containerName="barbican-api" Mar 20 08:50:00 crc kubenswrapper[4903]: I0320 08:50:00.168211 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="bae081ad-c434-4d1b-ae9f-cc8c5b6e2f33" containerName="barbican-api" Mar 20 08:50:00 crc kubenswrapper[4903]: E0320 08:50:00.168228 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c2cbf77-9511-47a0-9bc7-dd5eba7d8d8a" containerName="probe" Mar 20 08:50:00 crc kubenswrapper[4903]: I0320 08:50:00.168240 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c2cbf77-9511-47a0-9bc7-dd5eba7d8d8a" containerName="probe" Mar 20 08:50:00 crc kubenswrapper[4903]: E0320 08:50:00.168258 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96a68183-d440-4f89-887d-d2441d00c8e4" containerName="galera" Mar 20 08:50:00 crc kubenswrapper[4903]: I0320 08:50:00.168271 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="96a68183-d440-4f89-887d-d2441d00c8e4" containerName="galera" Mar 20 08:50:00 crc kubenswrapper[4903]: E0320 08:50:00.168287 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df937948-08c4-447c-9450-07221ce76552" containerName="rabbitmq" Mar 20 08:50:00 crc kubenswrapper[4903]: I0320 08:50:00.168300 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="df937948-08c4-447c-9450-07221ce76552" containerName="rabbitmq" Mar 20 08:50:00 crc kubenswrapper[4903]: E0320 08:50:00.168316 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e072c5e-0f44-4d24-bccc-b14bf61fa192" containerName="nova-cell1-conductor-conductor" Mar 20 08:50:00 crc kubenswrapper[4903]: I0320 08:50:00.168330 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e072c5e-0f44-4d24-bccc-b14bf61fa192" containerName="nova-cell1-conductor-conductor" Mar 20 08:50:00 crc kubenswrapper[4903]: E0320 08:50:00.168344 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4e1dbd8-6ecd-4cd7-910b-910cf6a45679" containerName="glance-httpd" Mar 20 08:50:00 crc kubenswrapper[4903]: I0320 08:50:00.168356 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4e1dbd8-6ecd-4cd7-910b-910cf6a45679" containerName="glance-httpd" Mar 20 08:50:00 crc kubenswrapper[4903]: E0320 08:50:00.168371 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31664b72-a142-4656-88e8-84dd0cf18647" containerName="barbican-keystone-listener" Mar 20 08:50:00 crc kubenswrapper[4903]: I0320 08:50:00.168383 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="31664b72-a142-4656-88e8-84dd0cf18647" containerName="barbican-keystone-listener" Mar 20 08:50:00 crc kubenswrapper[4903]: E0320 08:50:00.168400 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1dcd96a1-71bb-480c-8387-0fca4d17bf33" containerName="placement-api" Mar 20 08:50:00 crc kubenswrapper[4903]: I0320 08:50:00.168412 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="1dcd96a1-71bb-480c-8387-0fca4d17bf33" containerName="placement-api" Mar 20 08:50:00 crc kubenswrapper[4903]: E0320 08:50:00.168434 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0790ef46-b8b6-4d5e-98a8-06319c232264" containerName="neutron-api" Mar 20 08:50:00 crc kubenswrapper[4903]: I0320 08:50:00.168446 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="0790ef46-b8b6-4d5e-98a8-06319c232264" containerName="neutron-api" Mar 20 08:50:00 crc kubenswrapper[4903]: E0320 08:50:00.168462 4903 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="0790ef46-b8b6-4d5e-98a8-06319c232264" containerName="neutron-httpd" Mar 20 08:50:00 crc kubenswrapper[4903]: I0320 08:50:00.168474 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="0790ef46-b8b6-4d5e-98a8-06319c232264" containerName="neutron-httpd" Mar 20 08:50:00 crc kubenswrapper[4903]: E0320 08:50:00.168490 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="888a3fd9-01f8-47b3-b1bb-f2b8b6b96509" containerName="setup-container" Mar 20 08:50:00 crc kubenswrapper[4903]: I0320 08:50:00.168503 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="888a3fd9-01f8-47b3-b1bb-f2b8b6b96509" containerName="setup-container" Mar 20 08:50:00 crc kubenswrapper[4903]: E0320 08:50:00.168524 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c33b2cd-e705-41cd-9e59-3dcbb0a55829" containerName="glance-log" Mar 20 08:50:00 crc kubenswrapper[4903]: I0320 08:50:00.168537 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c33b2cd-e705-41cd-9e59-3dcbb0a55829" containerName="glance-log" Mar 20 08:50:00 crc kubenswrapper[4903]: E0320 08:50:00.168552 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c94a513f-1b70-4705-af6c-3f71cb0e4272" containerName="keystone-api" Mar 20 08:50:00 crc kubenswrapper[4903]: I0320 08:50:00.168564 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="c94a513f-1b70-4705-af6c-3f71cb0e4272" containerName="keystone-api" Mar 20 08:50:00 crc kubenswrapper[4903]: E0320 08:50:00.168582 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bae081ad-c434-4d1b-ae9f-cc8c5b6e2f33" containerName="barbican-api-log" Mar 20 08:50:00 crc kubenswrapper[4903]: I0320 08:50:00.168595 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="bae081ad-c434-4d1b-ae9f-cc8c5b6e2f33" containerName="barbican-api-log" Mar 20 08:50:00 crc kubenswrapper[4903]: I0320 08:50:00.168846 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f5f160c-29e2-43d0-bb55-6969904b3a4e" containerName="nova-metadata-log" Mar 20 08:50:00 crc kubenswrapper[4903]: I0320 08:50:00.168941 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="0790ef46-b8b6-4d5e-98a8-06319c232264" containerName="neutron-api" Mar 20 08:50:00 crc kubenswrapper[4903]: I0320 08:50:00.168955 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="50b5adcb-aed8-4cff-b3ec-02721df3937d" containerName="cinder-api-log" Mar 20 08:50:00 crc kubenswrapper[4903]: I0320 08:50:00.168976 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f5f160c-29e2-43d0-bb55-6969904b3a4e" containerName="nova-metadata-metadata" Mar 20 08:50:00 crc kubenswrapper[4903]: I0320 08:50:00.168992 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="bae081ad-c434-4d1b-ae9f-cc8c5b6e2f33" containerName="barbican-api" Mar 20 08:50:00 crc kubenswrapper[4903]: I0320 08:50:00.169014 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="888a3fd9-01f8-47b3-b1bb-f2b8b6b96509" containerName="rabbitmq" Mar 20 08:50:00 crc kubenswrapper[4903]: I0320 08:50:00.169051 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c2cbf77-9511-47a0-9bc7-dd5eba7d8d8a" containerName="cinder-scheduler" Mar 20 08:50:00 crc kubenswrapper[4903]: I0320 08:50:00.169075 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="06e43a58-7c5f-49ac-a1f4-f2eddfd28c6b" containerName="ovn-northd" Mar 20 08:50:00 crc kubenswrapper[4903]: I0320 08:50:00.169096 4903 
memory_manager.go:354] "RemoveStaleState removing state" podUID="2c33b2cd-e705-41cd-9e59-3dcbb0a55829" containerName="glance-log" Mar 20 08:50:00 crc kubenswrapper[4903]: I0320 08:50:00.169114 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="df937948-08c4-447c-9450-07221ce76552" containerName="rabbitmq" Mar 20 08:50:00 crc kubenswrapper[4903]: I0320 08:50:00.169127 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="8005d467-6a20-4e68-b62f-65ad97a31812" containerName="proxy-httpd" Mar 20 08:50:00 crc kubenswrapper[4903]: I0320 08:50:00.169155 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="34de9984-0547-4ba1-ae7d-5f8cc9196c26" containerName="memcached" Mar 20 08:50:00 crc kubenswrapper[4903]: I0320 08:50:00.169172 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="c94a513f-1b70-4705-af6c-3f71cb0e4272" containerName="keystone-api" Mar 20 08:50:00 crc kubenswrapper[4903]: I0320 08:50:00.169193 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc127483-5a42-4eea-8b8c-8a1382dced05" containerName="nova-scheduler-scheduler" Mar 20 08:50:00 crc kubenswrapper[4903]: I0320 08:50:00.169216 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="96a68183-d440-4f89-887d-d2441d00c8e4" containerName="galera" Mar 20 08:50:00 crc kubenswrapper[4903]: I0320 08:50:00.169236 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="31664b72-a142-4656-88e8-84dd0cf18647" containerName="barbican-keystone-listener-log" Mar 20 08:50:00 crc kubenswrapper[4903]: I0320 08:50:00.169253 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="1dcd96a1-71bb-480c-8387-0fca4d17bf33" containerName="placement-log" Mar 20 08:50:00 crc kubenswrapper[4903]: I0320 08:50:00.169367 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8bb60e5-f963-44ed-9e5e-76ca6da5c723" containerName="kube-state-metrics" Mar 20 08:50:00 crc kubenswrapper[4903]: I0320 08:50:00.169386 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="0790ef46-b8b6-4d5e-98a8-06319c232264" containerName="neutron-httpd" Mar 20 08:50:00 crc kubenswrapper[4903]: I0320 08:50:00.169408 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="877f943b-808c-435e-a5cf-bda8ea0a5d15" containerName="nova-api-api" Mar 20 08:50:00 crc kubenswrapper[4903]: I0320 08:50:00.169431 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="8005d467-6a20-4e68-b62f-65ad97a31812" containerName="ceilometer-central-agent" Mar 20 08:50:00 crc kubenswrapper[4903]: I0320 08:50:00.169447 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c33b2cd-e705-41cd-9e59-3dcbb0a55829" containerName="glance-httpd" Mar 20 08:50:00 crc kubenswrapper[4903]: I0320 08:50:00.169461 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4e1dbd8-6ecd-4cd7-910b-910cf6a45679" containerName="glance-httpd" Mar 20 08:50:00 crc kubenswrapper[4903]: I0320 08:50:00.169482 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="d570ab6f-6c5f-4255-b2ae-1966da262a0d" containerName="nova-cell0-conductor-conductor" Mar 20 08:50:00 crc kubenswrapper[4903]: I0320 08:50:00.169497 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="877f943b-808c-435e-a5cf-bda8ea0a5d15" containerName="nova-api-log" Mar 20 08:50:00 crc kubenswrapper[4903]: I0320 08:50:00.169517 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="1dcd96a1-71bb-480c-8387-0fca4d17bf33" 
containerName="placement-api" Mar 20 08:50:00 crc kubenswrapper[4903]: I0320 08:50:00.169533 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="06e43a58-7c5f-49ac-a1f4-f2eddfd28c6b" containerName="openstack-network-exporter" Mar 20 08:50:00 crc kubenswrapper[4903]: I0320 08:50:00.169552 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="50b5adcb-aed8-4cff-b3ec-02721df3937d" containerName="cinder-api" Mar 20 08:50:00 crc kubenswrapper[4903]: I0320 08:50:00.169567 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="8005d467-6a20-4e68-b62f-65ad97a31812" containerName="ceilometer-notification-agent" Mar 20 08:50:00 crc kubenswrapper[4903]: I0320 08:50:00.169587 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c2cbf77-9511-47a0-9bc7-dd5eba7d8d8a" containerName="probe" Mar 20 08:50:00 crc kubenswrapper[4903]: I0320 08:50:00.169609 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="8005d467-6a20-4e68-b62f-65ad97a31812" containerName="sg-core" Mar 20 08:50:00 crc kubenswrapper[4903]: I0320 08:50:00.169630 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4e1dbd8-6ecd-4cd7-910b-910cf6a45679" containerName="glance-log" Mar 20 08:50:00 crc kubenswrapper[4903]: I0320 08:50:00.169642 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="31664b72-a142-4656-88e8-84dd0cf18647" containerName="barbican-keystone-listener" Mar 20 08:50:00 crc kubenswrapper[4903]: I0320 08:50:00.169657 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="bae081ad-c434-4d1b-ae9f-cc8c5b6e2f33" containerName="barbican-api-log" Mar 20 08:50:00 crc kubenswrapper[4903]: I0320 08:50:00.169676 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e072c5e-0f44-4d24-bccc-b14bf61fa192" containerName="nova-cell1-conductor-conductor" Mar 20 08:50:00 crc kubenswrapper[4903]: I0320 08:50:00.170460 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566610-lhr5n" Mar 20 08:50:00 crc kubenswrapper[4903]: I0320 08:50:00.174891 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-smc2q" Mar 20 08:50:00 crc kubenswrapper[4903]: I0320 08:50:00.175501 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 08:50:00 crc kubenswrapper[4903]: I0320 08:50:00.176694 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 08:50:00 crc kubenswrapper[4903]: I0320 08:50:00.186087 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566610-lhr5n"] Mar 20 08:50:00 crc kubenswrapper[4903]: I0320 08:50:00.324442 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lpdxl\" (UniqueName: \"kubernetes.io/projected/46d6c81a-c1cb-48a7-95a1-a957c6a0fbea-kube-api-access-lpdxl\") pod \"auto-csr-approver-29566610-lhr5n\" (UID: \"46d6c81a-c1cb-48a7-95a1-a957c6a0fbea\") " pod="openshift-infra/auto-csr-approver-29566610-lhr5n" Mar 20 08:50:00 crc kubenswrapper[4903]: I0320 08:50:00.426109 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lpdxl\" (UniqueName: \"kubernetes.io/projected/46d6c81a-c1cb-48a7-95a1-a957c6a0fbea-kube-api-access-lpdxl\") pod \"auto-csr-approver-29566610-lhr5n\" (UID: \"46d6c81a-c1cb-48a7-95a1-a957c6a0fbea\") " pod="openshift-infra/auto-csr-approver-29566610-lhr5n" Mar 20 08:50:00 crc kubenswrapper[4903]: I0320 08:50:00.454954 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lpdxl\" (UniqueName: \"kubernetes.io/projected/46d6c81a-c1cb-48a7-95a1-a957c6a0fbea-kube-api-access-lpdxl\") pod \"auto-csr-approver-29566610-lhr5n\" (UID: \"46d6c81a-c1cb-48a7-95a1-a957c6a0fbea\") " pod="openshift-infra/auto-csr-approver-29566610-lhr5n" Mar 20 08:50:00 crc kubenswrapper[4903]: I0320 08:50:00.509846 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566610-lhr5n" Mar 20 08:50:01 crc kubenswrapper[4903]: I0320 08:50:01.153514 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566610-lhr5n"] Mar 20 08:50:01 crc kubenswrapper[4903]: I0320 08:50:01.338958 4903 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-api-0" podUID="877f943b-808c-435e-a5cf-bda8ea0a5d15" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.212:8774/\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 20 08:50:01 crc kubenswrapper[4903]: I0320 08:50:01.340184 4903 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-api-0" podUID="877f943b-808c-435e-a5cf-bda8ea0a5d15" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.212:8774/\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 20 08:50:01 crc kubenswrapper[4903]: I0320 08:50:01.936341 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566610-lhr5n" event={"ID":"46d6c81a-c1cb-48a7-95a1-a957c6a0fbea","Type":"ContainerStarted","Data":"de9a09ae8fad365f9cca78706502d9d805ea80efdc3a37ba381a2a054e57e0f6"} Mar 20 08:50:02 crc kubenswrapper[4903]: E0320 08:50:02.073279 4903 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ec70b5d9df40c275990e6b10b611cf7497dcf53a4d72ca11805f6bf11c5bc677 is running failed: container process not found" containerID="ec70b5d9df40c275990e6b10b611cf7497dcf53a4d72ca11805f6bf11c5bc677" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 20 08:50:02 crc kubenswrapper[4903]: E0320 08:50:02.074137 4903 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ec70b5d9df40c275990e6b10b611cf7497dcf53a4d72ca11805f6bf11c5bc677 is running failed: container process not found" containerID="ec70b5d9df40c275990e6b10b611cf7497dcf53a4d72ca11805f6bf11c5bc677" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 20 08:50:02 crc kubenswrapper[4903]: E0320 08:50:02.074753 4903 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ec70b5d9df40c275990e6b10b611cf7497dcf53a4d72ca11805f6bf11c5bc677 is running failed: container process not found" containerID="ec70b5d9df40c275990e6b10b611cf7497dcf53a4d72ca11805f6bf11c5bc677" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 20 08:50:02 crc kubenswrapper[4903]: E0320 08:50:02.074839 4903 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ec70b5d9df40c275990e6b10b611cf7497dcf53a4d72ca11805f6bf11c5bc677 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-chrhv" podUID="d69915e4-0df8-4d83-b096-962eadc1883f" containerName="ovsdb-server" Mar 20 08:50:02 crc kubenswrapper[4903]: E0320 08:50:02.077580 4903 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" 
containerID="cbe32b8386815ecd924f4abbb568339471c7efea68305c548d14baa5ac7f2324" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 20 08:50:02 crc kubenswrapper[4903]: E0320 08:50:02.079879 4903 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="cbe32b8386815ecd924f4abbb568339471c7efea68305c548d14baa5ac7f2324" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 20 08:50:02 crc kubenswrapper[4903]: E0320 08:50:02.082553 4903 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="cbe32b8386815ecd924f4abbb568339471c7efea68305c548d14baa5ac7f2324" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 20 08:50:02 crc kubenswrapper[4903]: E0320 08:50:02.082669 4903 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-chrhv" podUID="d69915e4-0df8-4d83-b096-962eadc1883f" containerName="ovs-vswitchd" Mar 20 08:50:02 crc kubenswrapper[4903]: I0320 08:50:02.944920 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566610-lhr5n" event={"ID":"46d6c81a-c1cb-48a7-95a1-a957c6a0fbea","Type":"ContainerStarted","Data":"4c59dac9b62c965032f39e120b558d55d2f4e5a101d1aaa889af3eacbae8630e"} Mar 20 08:50:03 crc kubenswrapper[4903]: I0320 08:50:03.960908 4903 generic.go:334] "Generic (PLEG): container finished" podID="46d6c81a-c1cb-48a7-95a1-a957c6a0fbea" containerID="4c59dac9b62c965032f39e120b558d55d2f4e5a101d1aaa889af3eacbae8630e" exitCode=0 Mar 20 08:50:03 crc kubenswrapper[4903]: I0320 08:50:03.960953 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566610-lhr5n" event={"ID":"46d6c81a-c1cb-48a7-95a1-a957c6a0fbea","Type":"ContainerDied","Data":"4c59dac9b62c965032f39e120b558d55d2f4e5a101d1aaa889af3eacbae8630e"} Mar 20 08:50:05 crc kubenswrapper[4903]: I0320 08:50:05.368345 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566610-lhr5n" Mar 20 08:50:05 crc kubenswrapper[4903]: I0320 08:50:05.418345 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lpdxl\" (UniqueName: \"kubernetes.io/projected/46d6c81a-c1cb-48a7-95a1-a957c6a0fbea-kube-api-access-lpdxl\") pod \"46d6c81a-c1cb-48a7-95a1-a957c6a0fbea\" (UID: \"46d6c81a-c1cb-48a7-95a1-a957c6a0fbea\") " Mar 20 08:50:05 crc kubenswrapper[4903]: I0320 08:50:05.427159 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/46d6c81a-c1cb-48a7-95a1-a957c6a0fbea-kube-api-access-lpdxl" (OuterVolumeSpecName: "kube-api-access-lpdxl") pod "46d6c81a-c1cb-48a7-95a1-a957c6a0fbea" (UID: "46d6c81a-c1cb-48a7-95a1-a957c6a0fbea"). InnerVolumeSpecName "kube-api-access-lpdxl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:50:05 crc kubenswrapper[4903]: I0320 08:50:05.529863 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lpdxl\" (UniqueName: \"kubernetes.io/projected/46d6c81a-c1cb-48a7-95a1-a957c6a0fbea-kube-api-access-lpdxl\") on node \"crc\" DevicePath \"\"" Mar 20 08:50:05 crc kubenswrapper[4903]: I0320 08:50:05.980300 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566610-lhr5n" event={"ID":"46d6c81a-c1cb-48a7-95a1-a957c6a0fbea","Type":"ContainerDied","Data":"de9a09ae8fad365f9cca78706502d9d805ea80efdc3a37ba381a2a054e57e0f6"} Mar 20 08:50:05 crc kubenswrapper[4903]: I0320 08:50:05.980344 4903 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="de9a09ae8fad365f9cca78706502d9d805ea80efdc3a37ba381a2a054e57e0f6" Mar 20 08:50:05 crc kubenswrapper[4903]: I0320 08:50:05.980352 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566610-lhr5n" Mar 20 08:50:06 crc kubenswrapper[4903]: I0320 08:50:06.035823 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566604-q559d"] Mar 20 08:50:06 crc kubenswrapper[4903]: I0320 08:50:06.041624 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566604-q559d"] Mar 20 08:50:07 crc kubenswrapper[4903]: E0320 08:50:07.073317 4903 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ec70b5d9df40c275990e6b10b611cf7497dcf53a4d72ca11805f6bf11c5bc677 is running failed: container process not found" containerID="ec70b5d9df40c275990e6b10b611cf7497dcf53a4d72ca11805f6bf11c5bc677" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 20 08:50:07 crc kubenswrapper[4903]: E0320 08:50:07.074475 4903 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ec70b5d9df40c275990e6b10b611cf7497dcf53a4d72ca11805f6bf11c5bc677 is running failed: container process not found" containerID="ec70b5d9df40c275990e6b10b611cf7497dcf53a4d72ca11805f6bf11c5bc677" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 20 08:50:07 crc kubenswrapper[4903]: E0320 08:50:07.075300 4903 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="cbe32b8386815ecd924f4abbb568339471c7efea68305c548d14baa5ac7f2324" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 20 08:50:07 crc kubenswrapper[4903]: E0320 08:50:07.075427 4903 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ec70b5d9df40c275990e6b10b611cf7497dcf53a4d72ca11805f6bf11c5bc677 is running failed: container process not found" containerID="ec70b5d9df40c275990e6b10b611cf7497dcf53a4d72ca11805f6bf11c5bc677" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 20 08:50:07 crc kubenswrapper[4903]: E0320 08:50:07.075468 4903 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ec70b5d9df40c275990e6b10b611cf7497dcf53a4d72ca11805f6bf11c5bc677 is running failed: container process 
not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-chrhv" podUID="d69915e4-0df8-4d83-b096-962eadc1883f" containerName="ovsdb-server" Mar 20 08:50:07 crc kubenswrapper[4903]: E0320 08:50:07.079475 4903 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="cbe32b8386815ecd924f4abbb568339471c7efea68305c548d14baa5ac7f2324" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 20 08:50:07 crc kubenswrapper[4903]: E0320 08:50:07.081867 4903 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="cbe32b8386815ecd924f4abbb568339471c7efea68305c548d14baa5ac7f2324" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 20 08:50:07 crc kubenswrapper[4903]: E0320 08:50:07.082162 4903 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-chrhv" podUID="d69915e4-0df8-4d83-b096-962eadc1883f" containerName="ovs-vswitchd" Mar 20 08:50:07 crc kubenswrapper[4903]: I0320 08:50:07.508187 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1dacbc55-264e-4175-a03b-f1f4d135ef5e" path="/var/lib/kubelet/pods/1dacbc55-264e-4175-a03b-f1f4d135ef5e/volumes" Mar 20 08:50:09 crc kubenswrapper[4903]: E0320 08:50:09.395282 4903 projected.go:288] Couldn't get configMap openstack/swift-storage-config-data: configmap "swift-storage-config-data" not found Mar 20 08:50:09 crc kubenswrapper[4903]: E0320 08:50:09.395406 4903 projected.go:263] Couldn't get secret openstack/swift-conf: secret "swift-conf" not found Mar 20 08:50:09 crc kubenswrapper[4903]: E0320 08:50:09.395464 4903 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 20 08:50:09 crc kubenswrapper[4903]: E0320 08:50:09.395488 4903 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: [configmap "swift-storage-config-data" not found, secret "swift-conf" not found, configmap "swift-ring-files" not found] Mar 20 08:50:09 crc kubenswrapper[4903]: E0320 08:50:09.395595 4903 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ccedd84e-d0d0-40b8-812c-3a57b41aee98-etc-swift podName:ccedd84e-d0d0-40b8-812c-3a57b41aee98 nodeName:}" failed. No retries permitted until 2026-03-20 08:50:41.395561892 +0000 UTC m=+1666.612462247 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/ccedd84e-d0d0-40b8-812c-3a57b41aee98-etc-swift") pod "swift-storage-0" (UID: "ccedd84e-d0d0-40b8-812c-3a57b41aee98") : [configmap "swift-storage-config-data" not found, secret "swift-conf" not found, configmap "swift-ring-files" not found] Mar 20 08:50:10 crc kubenswrapper[4903]: I0320 08:50:10.034328 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-chrhv_d69915e4-0df8-4d83-b096-962eadc1883f/ovs-vswitchd/0.log" Mar 20 08:50:10 crc kubenswrapper[4903]: I0320 08:50:10.035924 4903 generic.go:334] "Generic (PLEG): container finished" podID="d69915e4-0df8-4d83-b096-962eadc1883f" containerID="cbe32b8386815ecd924f4abbb568339471c7efea68305c548d14baa5ac7f2324" exitCode=137 Mar 20 08:50:10 crc kubenswrapper[4903]: I0320 08:50:10.035969 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-chrhv" event={"ID":"d69915e4-0df8-4d83-b096-962eadc1883f","Type":"ContainerDied","Data":"cbe32b8386815ecd924f4abbb568339471c7efea68305c548d14baa5ac7f2324"} Mar 20 08:50:10 crc kubenswrapper[4903]: I0320 08:50:10.180566 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-chrhv_d69915e4-0df8-4d83-b096-962eadc1883f/ovs-vswitchd/0.log" Mar 20 08:50:10 crc kubenswrapper[4903]: I0320 08:50:10.181429 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-chrhv" Mar 20 08:50:10 crc kubenswrapper[4903]: I0320 08:50:10.308547 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/d69915e4-0df8-4d83-b096-962eadc1883f-etc-ovs\") pod \"d69915e4-0df8-4d83-b096-962eadc1883f\" (UID: \"d69915e4-0df8-4d83-b096-962eadc1883f\") " Mar 20 08:50:10 crc kubenswrapper[4903]: I0320 08:50:10.309065 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/d69915e4-0df8-4d83-b096-962eadc1883f-var-log\") pod \"d69915e4-0df8-4d83-b096-962eadc1883f\" (UID: \"d69915e4-0df8-4d83-b096-962eadc1883f\") " Mar 20 08:50:10 crc kubenswrapper[4903]: I0320 08:50:10.308808 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d69915e4-0df8-4d83-b096-962eadc1883f-etc-ovs" (OuterVolumeSpecName: "etc-ovs") pod "d69915e4-0df8-4d83-b096-962eadc1883f" (UID: "d69915e4-0df8-4d83-b096-962eadc1883f"). InnerVolumeSpecName "etc-ovs". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 08:50:10 crc kubenswrapper[4903]: I0320 08:50:10.309218 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/d69915e4-0df8-4d83-b096-962eadc1883f-var-run\") pod \"d69915e4-0df8-4d83-b096-962eadc1883f\" (UID: \"d69915e4-0df8-4d83-b096-962eadc1883f\") " Mar 20 08:50:10 crc kubenswrapper[4903]: I0320 08:50:10.309253 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d9v2l\" (UniqueName: \"kubernetes.io/projected/d69915e4-0df8-4d83-b096-962eadc1883f-kube-api-access-d9v2l\") pod \"d69915e4-0df8-4d83-b096-962eadc1883f\" (UID: \"d69915e4-0df8-4d83-b096-962eadc1883f\") " Mar 20 08:50:10 crc kubenswrapper[4903]: I0320 08:50:10.309242 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d69915e4-0df8-4d83-b096-962eadc1883f-var-log" (OuterVolumeSpecName: "var-log") pod "d69915e4-0df8-4d83-b096-962eadc1883f" (UID: "d69915e4-0df8-4d83-b096-962eadc1883f"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 08:50:10 crc kubenswrapper[4903]: I0320 08:50:10.309325 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d69915e4-0df8-4d83-b096-962eadc1883f-scripts\") pod \"d69915e4-0df8-4d83-b096-962eadc1883f\" (UID: \"d69915e4-0df8-4d83-b096-962eadc1883f\") " Mar 20 08:50:10 crc kubenswrapper[4903]: I0320 08:50:10.309351 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/d69915e4-0df8-4d83-b096-962eadc1883f-var-lib\") pod \"d69915e4-0df8-4d83-b096-962eadc1883f\" (UID: \"d69915e4-0df8-4d83-b096-962eadc1883f\") " Mar 20 08:50:10 crc kubenswrapper[4903]: I0320 08:50:10.309323 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d69915e4-0df8-4d83-b096-962eadc1883f-var-run" (OuterVolumeSpecName: "var-run") pod "d69915e4-0df8-4d83-b096-962eadc1883f" (UID: "d69915e4-0df8-4d83-b096-962eadc1883f"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 08:50:10 crc kubenswrapper[4903]: I0320 08:50:10.309464 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d69915e4-0df8-4d83-b096-962eadc1883f-var-lib" (OuterVolumeSpecName: "var-lib") pod "d69915e4-0df8-4d83-b096-962eadc1883f" (UID: "d69915e4-0df8-4d83-b096-962eadc1883f"). InnerVolumeSpecName "var-lib". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 08:50:10 crc kubenswrapper[4903]: I0320 08:50:10.309697 4903 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/d69915e4-0df8-4d83-b096-962eadc1883f-var-run\") on node \"crc\" DevicePath \"\"" Mar 20 08:50:10 crc kubenswrapper[4903]: I0320 08:50:10.309713 4903 reconciler_common.go:293] "Volume detached for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/d69915e4-0df8-4d83-b096-962eadc1883f-var-lib\") on node \"crc\" DevicePath \"\"" Mar 20 08:50:10 crc kubenswrapper[4903]: I0320 08:50:10.309725 4903 reconciler_common.go:293] "Volume detached for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/d69915e4-0df8-4d83-b096-962eadc1883f-etc-ovs\") on node \"crc\" DevicePath \"\"" Mar 20 08:50:10 crc kubenswrapper[4903]: I0320 08:50:10.309736 4903 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/d69915e4-0df8-4d83-b096-962eadc1883f-var-log\") on node \"crc\" DevicePath \"\"" Mar 20 08:50:10 crc kubenswrapper[4903]: I0320 08:50:10.311567 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d69915e4-0df8-4d83-b096-962eadc1883f-scripts" (OuterVolumeSpecName: "scripts") pod "d69915e4-0df8-4d83-b096-962eadc1883f" (UID: "d69915e4-0df8-4d83-b096-962eadc1883f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 08:50:10 crc kubenswrapper[4903]: I0320 08:50:10.316070 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d69915e4-0df8-4d83-b096-962eadc1883f-kube-api-access-d9v2l" (OuterVolumeSpecName: "kube-api-access-d9v2l") pod "d69915e4-0df8-4d83-b096-962eadc1883f" (UID: "d69915e4-0df8-4d83-b096-962eadc1883f"). InnerVolumeSpecName "kube-api-access-d9v2l". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:50:10 crc kubenswrapper[4903]: I0320 08:50:10.411777 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d9v2l\" (UniqueName: \"kubernetes.io/projected/d69915e4-0df8-4d83-b096-962eadc1883f-kube-api-access-d9v2l\") on node \"crc\" DevicePath \"\"" Mar 20 08:50:10 crc kubenswrapper[4903]: I0320 08:50:10.411842 4903 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d69915e4-0df8-4d83-b096-962eadc1883f-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 08:50:10 crc kubenswrapper[4903]: I0320 08:50:10.423401 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Mar 20 08:50:10 crc kubenswrapper[4903]: I0320 08:50:10.513697 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tvm8x\" (UniqueName: \"kubernetes.io/projected/ccedd84e-d0d0-40b8-812c-3a57b41aee98-kube-api-access-tvm8x\") pod \"ccedd84e-d0d0-40b8-812c-3a57b41aee98\" (UID: \"ccedd84e-d0d0-40b8-812c-3a57b41aee98\") " Mar 20 08:50:10 crc kubenswrapper[4903]: I0320 08:50:10.513800 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swift\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"ccedd84e-d0d0-40b8-812c-3a57b41aee98\" (UID: \"ccedd84e-d0d0-40b8-812c-3a57b41aee98\") " Mar 20 08:50:10 crc kubenswrapper[4903]: I0320 08:50:10.513906 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/ccedd84e-d0d0-40b8-812c-3a57b41aee98-etc-swift\") pod \"ccedd84e-d0d0-40b8-812c-3a57b41aee98\" (UID: \"ccedd84e-d0d0-40b8-812c-3a57b41aee98\") " Mar 20 08:50:10 crc kubenswrapper[4903]: I0320 08:50:10.513974 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/ccedd84e-d0d0-40b8-812c-3a57b41aee98-lock\") pod \"ccedd84e-d0d0-40b8-812c-3a57b41aee98\" (UID: \"ccedd84e-d0d0-40b8-812c-3a57b41aee98\") " Mar 20 08:50:10 crc kubenswrapper[4903]: I0320 08:50:10.514025 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ccedd84e-d0d0-40b8-812c-3a57b41aee98-combined-ca-bundle\") pod \"ccedd84e-d0d0-40b8-812c-3a57b41aee98\" (UID: \"ccedd84e-d0d0-40b8-812c-3a57b41aee98\") " Mar 20 08:50:10 crc kubenswrapper[4903]: I0320 08:50:10.514131 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/ccedd84e-d0d0-40b8-812c-3a57b41aee98-cache\") pod \"ccedd84e-d0d0-40b8-812c-3a57b41aee98\" (UID: \"ccedd84e-d0d0-40b8-812c-3a57b41aee98\") " Mar 20 08:50:10 crc kubenswrapper[4903]: I0320 08:50:10.515260 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ccedd84e-d0d0-40b8-812c-3a57b41aee98-lock" (OuterVolumeSpecName: "lock") pod "ccedd84e-d0d0-40b8-812c-3a57b41aee98" (UID: "ccedd84e-d0d0-40b8-812c-3a57b41aee98"). InnerVolumeSpecName "lock". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 08:50:10 crc kubenswrapper[4903]: I0320 08:50:10.515404 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ccedd84e-d0d0-40b8-812c-3a57b41aee98-cache" (OuterVolumeSpecName: "cache") pod "ccedd84e-d0d0-40b8-812c-3a57b41aee98" (UID: "ccedd84e-d0d0-40b8-812c-3a57b41aee98"). InnerVolumeSpecName "cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 08:50:10 crc kubenswrapper[4903]: I0320 08:50:10.517272 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "swift") pod "ccedd84e-d0d0-40b8-812c-3a57b41aee98" (UID: "ccedd84e-d0d0-40b8-812c-3a57b41aee98"). InnerVolumeSpecName "local-storage10-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 20 08:50:10 crc kubenswrapper[4903]: I0320 08:50:10.517292 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ccedd84e-d0d0-40b8-812c-3a57b41aee98-kube-api-access-tvm8x" (OuterVolumeSpecName: "kube-api-access-tvm8x") pod "ccedd84e-d0d0-40b8-812c-3a57b41aee98" (UID: "ccedd84e-d0d0-40b8-812c-3a57b41aee98"). InnerVolumeSpecName "kube-api-access-tvm8x". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:50:10 crc kubenswrapper[4903]: I0320 08:50:10.519498 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ccedd84e-d0d0-40b8-812c-3a57b41aee98-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "ccedd84e-d0d0-40b8-812c-3a57b41aee98" (UID: "ccedd84e-d0d0-40b8-812c-3a57b41aee98"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:50:10 crc kubenswrapper[4903]: I0320 08:50:10.616543 4903 reconciler_common.go:293] "Volume detached for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/ccedd84e-d0d0-40b8-812c-3a57b41aee98-cache\") on node \"crc\" DevicePath \"\"" Mar 20 08:50:10 crc kubenswrapper[4903]: I0320 08:50:10.616966 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tvm8x\" (UniqueName: \"kubernetes.io/projected/ccedd84e-d0d0-40b8-812c-3a57b41aee98-kube-api-access-tvm8x\") on node \"crc\" DevicePath \"\"" Mar 20 08:50:10 crc kubenswrapper[4903]: I0320 08:50:10.617204 4903 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Mar 20 08:50:10 crc kubenswrapper[4903]: I0320 08:50:10.617341 4903 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/ccedd84e-d0d0-40b8-812c-3a57b41aee98-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 20 08:50:10 crc kubenswrapper[4903]: I0320 08:50:10.617447 4903 reconciler_common.go:293] "Volume detached for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/ccedd84e-d0d0-40b8-812c-3a57b41aee98-lock\") on node \"crc\" DevicePath \"\"" Mar 20 08:50:10 crc kubenswrapper[4903]: I0320 08:50:10.637467 4903 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Mar 20 08:50:10 crc kubenswrapper[4903]: I0320 08:50:10.719373 4903 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Mar 20 08:50:10 crc kubenswrapper[4903]: I0320 08:50:10.878182 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ccedd84e-d0d0-40b8-812c-3a57b41aee98-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ccedd84e-d0d0-40b8-812c-3a57b41aee98" (UID: "ccedd84e-d0d0-40b8-812c-3a57b41aee98"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 08:50:10 crc kubenswrapper[4903]: I0320 08:50:10.921690 4903 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ccedd84e-d0d0-40b8-812c-3a57b41aee98-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 08:50:11 crc kubenswrapper[4903]: I0320 08:50:11.048588 4903 generic.go:334] "Generic (PLEG): container finished" podID="ccedd84e-d0d0-40b8-812c-3a57b41aee98" containerID="9b7b4b408c9bc763af97601ea17d444715b1766bc513d62fcbbcb137a6f2421b" exitCode=137 Mar 20 08:50:11 crc kubenswrapper[4903]: I0320 08:50:11.048667 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ccedd84e-d0d0-40b8-812c-3a57b41aee98","Type":"ContainerDied","Data":"9b7b4b408c9bc763af97601ea17d444715b1766bc513d62fcbbcb137a6f2421b"} Mar 20 08:50:11 crc kubenswrapper[4903]: I0320 08:50:11.048703 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Mar 20 08:50:11 crc kubenswrapper[4903]: I0320 08:50:11.049052 4903 scope.go:117] "RemoveContainer" containerID="9b7b4b408c9bc763af97601ea17d444715b1766bc513d62fcbbcb137a6f2421b" Mar 20 08:50:11 crc kubenswrapper[4903]: I0320 08:50:11.048981 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ccedd84e-d0d0-40b8-812c-3a57b41aee98","Type":"ContainerDied","Data":"4cad09a2e774c6f24fe9e4fb211cec9f3ae73443f7f2b6604592174ed6c9ff3d"} Mar 20 08:50:11 crc kubenswrapper[4903]: I0320 08:50:11.050160 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-chrhv_d69915e4-0df8-4d83-b096-962eadc1883f/ovs-vswitchd/0.log" Mar 20 08:50:11 crc kubenswrapper[4903]: I0320 08:50:11.051429 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-chrhv" event={"ID":"d69915e4-0df8-4d83-b096-962eadc1883f","Type":"ContainerDied","Data":"296a07fd3728917cdd1ba15c6f90eb1426775d773d30351e992214c75ce92029"} Mar 20 08:50:11 crc kubenswrapper[4903]: I0320 08:50:11.051500 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-chrhv" Mar 20 08:50:11 crc kubenswrapper[4903]: I0320 08:50:11.071819 4903 scope.go:117] "RemoveContainer" containerID="aa0b2e935a516f1aabd311732420c76fdecf78152f5d8c3cae1c496ba8803622" Mar 20 08:50:11 crc kubenswrapper[4903]: I0320 08:50:11.096220 4903 scope.go:117] "RemoveContainer" containerID="970a73f848705640c8c23ca337e77c32e8681ce1576bc3f7acd53b211ca858c5" Mar 20 08:50:11 crc kubenswrapper[4903]: I0320 08:50:11.096350 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-storage-0"] Mar 20 08:50:11 crc kubenswrapper[4903]: I0320 08:50:11.106894 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-storage-0"] Mar 20 08:50:11 crc kubenswrapper[4903]: I0320 08:50:11.112908 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-ovs-chrhv"] Mar 20 08:50:11 crc kubenswrapper[4903]: I0320 08:50:11.125913 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-ovs-chrhv"] Mar 20 08:50:11 crc kubenswrapper[4903]: I0320 08:50:11.137906 4903 scope.go:117] "RemoveContainer" containerID="9b346edd52808b80170d3bc7c974747d08d86901f4b138ca360989217b0d0103" Mar 20 08:50:11 crc kubenswrapper[4903]: I0320 08:50:11.162112 4903 scope.go:117] "RemoveContainer" containerID="ed2c689c76fb63fb9426546d2a29ec1ece23e59b99d50e58c33c34a410549fda" Mar 20 08:50:11 crc kubenswrapper[4903]: I0320 08:50:11.180322 4903 scope.go:117] "RemoveContainer" containerID="68157cdfcfdbaa0b50499b9130ec297080cdd86c764418be16c6a47f7dafe3da" Mar 20 08:50:11 crc kubenswrapper[4903]: I0320 08:50:11.199615 4903 scope.go:117] "RemoveContainer" containerID="c4da8cefa47622803dd1e647814f8e48127fdf53fd6d3ee4863912ef6363b186" Mar 20 08:50:11 crc kubenswrapper[4903]: I0320 08:50:11.218148 4903 scope.go:117] "RemoveContainer" containerID="1b28fb8da4267bf115856680b9313379defdf659be453428306ad98f35f6d6f8" Mar 20 08:50:11 crc kubenswrapper[4903]: I0320 08:50:11.233420 4903 scope.go:117] "RemoveContainer" containerID="0a04473525eb06e22984df1a504ad70e91e54494ecc7880aec725f810f1f575c" Mar 20 08:50:11 crc kubenswrapper[4903]: I0320 08:50:11.253251 4903 scope.go:117] "RemoveContainer" containerID="8fc4f339bf9aa2a0245e4c43fd890e2315110e976b25e56bad0be56f10d8abd4" Mar 20 08:50:11 crc kubenswrapper[4903]: I0320 08:50:11.274009 4903 scope.go:117] "RemoveContainer" containerID="ac6a6d8fff3600147cf08f30230b0e513eb967f730ef636689918b9d11d7980e" Mar 20 08:50:11 crc kubenswrapper[4903]: I0320 08:50:11.300910 4903 scope.go:117] "RemoveContainer" containerID="92af1107195bebc153bf9352c2eb36552b3f9f73d72391de8b4a6f21b80a4bb7" Mar 20 08:50:11 crc kubenswrapper[4903]: I0320 08:50:11.324363 4903 scope.go:117] "RemoveContainer" containerID="b55e63c14895bcdb40403401521699e38dfe6a783a3590019bd4fb0770fe17f2" Mar 20 08:50:11 crc kubenswrapper[4903]: I0320 08:50:11.354822 4903 scope.go:117] "RemoveContainer" containerID="2dff8db1a526f1e028fa24f1cf0a7a3b6053fb85fa1e7ea943870e339ab8ef45" Mar 20 08:50:11 crc kubenswrapper[4903]: I0320 08:50:11.378392 4903 scope.go:117] "RemoveContainer" containerID="ab944c856301685ebd77b2e7fef75ffca3eac6f59b30dda9a10daef1b8c70e88" Mar 20 08:50:11 crc kubenswrapper[4903]: I0320 08:50:11.405880 4903 scope.go:117] "RemoveContainer" containerID="9b7b4b408c9bc763af97601ea17d444715b1766bc513d62fcbbcb137a6f2421b" Mar 20 08:50:11 crc kubenswrapper[4903]: E0320 08:50:11.407088 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find 
container \"9b7b4b408c9bc763af97601ea17d444715b1766bc513d62fcbbcb137a6f2421b\": container with ID starting with 9b7b4b408c9bc763af97601ea17d444715b1766bc513d62fcbbcb137a6f2421b not found: ID does not exist" containerID="9b7b4b408c9bc763af97601ea17d444715b1766bc513d62fcbbcb137a6f2421b" Mar 20 08:50:11 crc kubenswrapper[4903]: I0320 08:50:11.407161 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b7b4b408c9bc763af97601ea17d444715b1766bc513d62fcbbcb137a6f2421b"} err="failed to get container status \"9b7b4b408c9bc763af97601ea17d444715b1766bc513d62fcbbcb137a6f2421b\": rpc error: code = NotFound desc = could not find container \"9b7b4b408c9bc763af97601ea17d444715b1766bc513d62fcbbcb137a6f2421b\": container with ID starting with 9b7b4b408c9bc763af97601ea17d444715b1766bc513d62fcbbcb137a6f2421b not found: ID does not exist" Mar 20 08:50:11 crc kubenswrapper[4903]: I0320 08:50:11.407223 4903 scope.go:117] "RemoveContainer" containerID="aa0b2e935a516f1aabd311732420c76fdecf78152f5d8c3cae1c496ba8803622" Mar 20 08:50:11 crc kubenswrapper[4903]: E0320 08:50:11.407640 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa0b2e935a516f1aabd311732420c76fdecf78152f5d8c3cae1c496ba8803622\": container with ID starting with aa0b2e935a516f1aabd311732420c76fdecf78152f5d8c3cae1c496ba8803622 not found: ID does not exist" containerID="aa0b2e935a516f1aabd311732420c76fdecf78152f5d8c3cae1c496ba8803622" Mar 20 08:50:11 crc kubenswrapper[4903]: I0320 08:50:11.407685 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa0b2e935a516f1aabd311732420c76fdecf78152f5d8c3cae1c496ba8803622"} err="failed to get container status \"aa0b2e935a516f1aabd311732420c76fdecf78152f5d8c3cae1c496ba8803622\": rpc error: code = NotFound desc = could not find container \"aa0b2e935a516f1aabd311732420c76fdecf78152f5d8c3cae1c496ba8803622\": container with ID starting with aa0b2e935a516f1aabd311732420c76fdecf78152f5d8c3cae1c496ba8803622 not found: ID does not exist" Mar 20 08:50:11 crc kubenswrapper[4903]: I0320 08:50:11.407712 4903 scope.go:117] "RemoveContainer" containerID="970a73f848705640c8c23ca337e77c32e8681ce1576bc3f7acd53b211ca858c5" Mar 20 08:50:11 crc kubenswrapper[4903]: E0320 08:50:11.408215 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"970a73f848705640c8c23ca337e77c32e8681ce1576bc3f7acd53b211ca858c5\": container with ID starting with 970a73f848705640c8c23ca337e77c32e8681ce1576bc3f7acd53b211ca858c5 not found: ID does not exist" containerID="970a73f848705640c8c23ca337e77c32e8681ce1576bc3f7acd53b211ca858c5" Mar 20 08:50:11 crc kubenswrapper[4903]: I0320 08:50:11.408259 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"970a73f848705640c8c23ca337e77c32e8681ce1576bc3f7acd53b211ca858c5"} err="failed to get container status \"970a73f848705640c8c23ca337e77c32e8681ce1576bc3f7acd53b211ca858c5\": rpc error: code = NotFound desc = could not find container \"970a73f848705640c8c23ca337e77c32e8681ce1576bc3f7acd53b211ca858c5\": container with ID starting with 970a73f848705640c8c23ca337e77c32e8681ce1576bc3f7acd53b211ca858c5 not found: ID does not exist" Mar 20 08:50:11 crc kubenswrapper[4903]: I0320 08:50:11.408285 4903 scope.go:117] "RemoveContainer" containerID="9b346edd52808b80170d3bc7c974747d08d86901f4b138ca360989217b0d0103" Mar 20 08:50:11 crc 
kubenswrapper[4903]: E0320 08:50:11.408624 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9b346edd52808b80170d3bc7c974747d08d86901f4b138ca360989217b0d0103\": container with ID starting with 9b346edd52808b80170d3bc7c974747d08d86901f4b138ca360989217b0d0103 not found: ID does not exist" containerID="9b346edd52808b80170d3bc7c974747d08d86901f4b138ca360989217b0d0103" Mar 20 08:50:11 crc kubenswrapper[4903]: I0320 08:50:11.408662 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b346edd52808b80170d3bc7c974747d08d86901f4b138ca360989217b0d0103"} err="failed to get container status \"9b346edd52808b80170d3bc7c974747d08d86901f4b138ca360989217b0d0103\": rpc error: code = NotFound desc = could not find container \"9b346edd52808b80170d3bc7c974747d08d86901f4b138ca360989217b0d0103\": container with ID starting with 9b346edd52808b80170d3bc7c974747d08d86901f4b138ca360989217b0d0103 not found: ID does not exist" Mar 20 08:50:11 crc kubenswrapper[4903]: I0320 08:50:11.408693 4903 scope.go:117] "RemoveContainer" containerID="ed2c689c76fb63fb9426546d2a29ec1ece23e59b99d50e58c33c34a410549fda" Mar 20 08:50:11 crc kubenswrapper[4903]: E0320 08:50:11.409007 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ed2c689c76fb63fb9426546d2a29ec1ece23e59b99d50e58c33c34a410549fda\": container with ID starting with ed2c689c76fb63fb9426546d2a29ec1ece23e59b99d50e58c33c34a410549fda not found: ID does not exist" containerID="ed2c689c76fb63fb9426546d2a29ec1ece23e59b99d50e58c33c34a410549fda" Mar 20 08:50:11 crc kubenswrapper[4903]: I0320 08:50:11.409086 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed2c689c76fb63fb9426546d2a29ec1ece23e59b99d50e58c33c34a410549fda"} err="failed to get container status \"ed2c689c76fb63fb9426546d2a29ec1ece23e59b99d50e58c33c34a410549fda\": rpc error: code = NotFound desc = could not find container \"ed2c689c76fb63fb9426546d2a29ec1ece23e59b99d50e58c33c34a410549fda\": container with ID starting with ed2c689c76fb63fb9426546d2a29ec1ece23e59b99d50e58c33c34a410549fda not found: ID does not exist" Mar 20 08:50:11 crc kubenswrapper[4903]: I0320 08:50:11.409127 4903 scope.go:117] "RemoveContainer" containerID="68157cdfcfdbaa0b50499b9130ec297080cdd86c764418be16c6a47f7dafe3da" Mar 20 08:50:11 crc kubenswrapper[4903]: E0320 08:50:11.409433 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"68157cdfcfdbaa0b50499b9130ec297080cdd86c764418be16c6a47f7dafe3da\": container with ID starting with 68157cdfcfdbaa0b50499b9130ec297080cdd86c764418be16c6a47f7dafe3da not found: ID does not exist" containerID="68157cdfcfdbaa0b50499b9130ec297080cdd86c764418be16c6a47f7dafe3da" Mar 20 08:50:11 crc kubenswrapper[4903]: I0320 08:50:11.409474 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"68157cdfcfdbaa0b50499b9130ec297080cdd86c764418be16c6a47f7dafe3da"} err="failed to get container status \"68157cdfcfdbaa0b50499b9130ec297080cdd86c764418be16c6a47f7dafe3da\": rpc error: code = NotFound desc = could not find container \"68157cdfcfdbaa0b50499b9130ec297080cdd86c764418be16c6a47f7dafe3da\": container with ID starting with 68157cdfcfdbaa0b50499b9130ec297080cdd86c764418be16c6a47f7dafe3da not found: ID does not exist" Mar 20 08:50:11 crc kubenswrapper[4903]: 
I0320 08:50:11.409500 4903 scope.go:117] "RemoveContainer" containerID="c4da8cefa47622803dd1e647814f8e48127fdf53fd6d3ee4863912ef6363b186" Mar 20 08:50:11 crc kubenswrapper[4903]: E0320 08:50:11.409761 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c4da8cefa47622803dd1e647814f8e48127fdf53fd6d3ee4863912ef6363b186\": container with ID starting with c4da8cefa47622803dd1e647814f8e48127fdf53fd6d3ee4863912ef6363b186 not found: ID does not exist" containerID="c4da8cefa47622803dd1e647814f8e48127fdf53fd6d3ee4863912ef6363b186" Mar 20 08:50:11 crc kubenswrapper[4903]: I0320 08:50:11.409796 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c4da8cefa47622803dd1e647814f8e48127fdf53fd6d3ee4863912ef6363b186"} err="failed to get container status \"c4da8cefa47622803dd1e647814f8e48127fdf53fd6d3ee4863912ef6363b186\": rpc error: code = NotFound desc = could not find container \"c4da8cefa47622803dd1e647814f8e48127fdf53fd6d3ee4863912ef6363b186\": container with ID starting with c4da8cefa47622803dd1e647814f8e48127fdf53fd6d3ee4863912ef6363b186 not found: ID does not exist" Mar 20 08:50:11 crc kubenswrapper[4903]: I0320 08:50:11.409823 4903 scope.go:117] "RemoveContainer" containerID="1b28fb8da4267bf115856680b9313379defdf659be453428306ad98f35f6d6f8" Mar 20 08:50:11 crc kubenswrapper[4903]: E0320 08:50:11.410159 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1b28fb8da4267bf115856680b9313379defdf659be453428306ad98f35f6d6f8\": container with ID starting with 1b28fb8da4267bf115856680b9313379defdf659be453428306ad98f35f6d6f8 not found: ID does not exist" containerID="1b28fb8da4267bf115856680b9313379defdf659be453428306ad98f35f6d6f8" Mar 20 08:50:11 crc kubenswrapper[4903]: I0320 08:50:11.410207 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b28fb8da4267bf115856680b9313379defdf659be453428306ad98f35f6d6f8"} err="failed to get container status \"1b28fb8da4267bf115856680b9313379defdf659be453428306ad98f35f6d6f8\": rpc error: code = NotFound desc = could not find container \"1b28fb8da4267bf115856680b9313379defdf659be453428306ad98f35f6d6f8\": container with ID starting with 1b28fb8da4267bf115856680b9313379defdf659be453428306ad98f35f6d6f8 not found: ID does not exist" Mar 20 08:50:11 crc kubenswrapper[4903]: I0320 08:50:11.410231 4903 scope.go:117] "RemoveContainer" containerID="0a04473525eb06e22984df1a504ad70e91e54494ecc7880aec725f810f1f575c" Mar 20 08:50:11 crc kubenswrapper[4903]: E0320 08:50:11.410506 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0a04473525eb06e22984df1a504ad70e91e54494ecc7880aec725f810f1f575c\": container with ID starting with 0a04473525eb06e22984df1a504ad70e91e54494ecc7880aec725f810f1f575c not found: ID does not exist" containerID="0a04473525eb06e22984df1a504ad70e91e54494ecc7880aec725f810f1f575c" Mar 20 08:50:11 crc kubenswrapper[4903]: I0320 08:50:11.410552 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a04473525eb06e22984df1a504ad70e91e54494ecc7880aec725f810f1f575c"} err="failed to get container status \"0a04473525eb06e22984df1a504ad70e91e54494ecc7880aec725f810f1f575c\": rpc error: code = NotFound desc = could not find container \"0a04473525eb06e22984df1a504ad70e91e54494ecc7880aec725f810f1f575c\": container 
with ID starting with 0a04473525eb06e22984df1a504ad70e91e54494ecc7880aec725f810f1f575c not found: ID does not exist" Mar 20 08:50:11 crc kubenswrapper[4903]: I0320 08:50:11.410587 4903 scope.go:117] "RemoveContainer" containerID="8fc4f339bf9aa2a0245e4c43fd890e2315110e976b25e56bad0be56f10d8abd4" Mar 20 08:50:11 crc kubenswrapper[4903]: E0320 08:50:11.410865 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8fc4f339bf9aa2a0245e4c43fd890e2315110e976b25e56bad0be56f10d8abd4\": container with ID starting with 8fc4f339bf9aa2a0245e4c43fd890e2315110e976b25e56bad0be56f10d8abd4 not found: ID does not exist" containerID="8fc4f339bf9aa2a0245e4c43fd890e2315110e976b25e56bad0be56f10d8abd4" Mar 20 08:50:11 crc kubenswrapper[4903]: I0320 08:50:11.410908 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8fc4f339bf9aa2a0245e4c43fd890e2315110e976b25e56bad0be56f10d8abd4"} err="failed to get container status \"8fc4f339bf9aa2a0245e4c43fd890e2315110e976b25e56bad0be56f10d8abd4\": rpc error: code = NotFound desc = could not find container \"8fc4f339bf9aa2a0245e4c43fd890e2315110e976b25e56bad0be56f10d8abd4\": container with ID starting with 8fc4f339bf9aa2a0245e4c43fd890e2315110e976b25e56bad0be56f10d8abd4 not found: ID does not exist" Mar 20 08:50:11 crc kubenswrapper[4903]: I0320 08:50:11.410936 4903 scope.go:117] "RemoveContainer" containerID="ac6a6d8fff3600147cf08f30230b0e513eb967f730ef636689918b9d11d7980e" Mar 20 08:50:11 crc kubenswrapper[4903]: E0320 08:50:11.411431 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ac6a6d8fff3600147cf08f30230b0e513eb967f730ef636689918b9d11d7980e\": container with ID starting with ac6a6d8fff3600147cf08f30230b0e513eb967f730ef636689918b9d11d7980e not found: ID does not exist" containerID="ac6a6d8fff3600147cf08f30230b0e513eb967f730ef636689918b9d11d7980e" Mar 20 08:50:11 crc kubenswrapper[4903]: I0320 08:50:11.411474 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac6a6d8fff3600147cf08f30230b0e513eb967f730ef636689918b9d11d7980e"} err="failed to get container status \"ac6a6d8fff3600147cf08f30230b0e513eb967f730ef636689918b9d11d7980e\": rpc error: code = NotFound desc = could not find container \"ac6a6d8fff3600147cf08f30230b0e513eb967f730ef636689918b9d11d7980e\": container with ID starting with ac6a6d8fff3600147cf08f30230b0e513eb967f730ef636689918b9d11d7980e not found: ID does not exist" Mar 20 08:50:11 crc kubenswrapper[4903]: I0320 08:50:11.411498 4903 scope.go:117] "RemoveContainer" containerID="92af1107195bebc153bf9352c2eb36552b3f9f73d72391de8b4a6f21b80a4bb7" Mar 20 08:50:11 crc kubenswrapper[4903]: E0320 08:50:11.411829 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"92af1107195bebc153bf9352c2eb36552b3f9f73d72391de8b4a6f21b80a4bb7\": container with ID starting with 92af1107195bebc153bf9352c2eb36552b3f9f73d72391de8b4a6f21b80a4bb7 not found: ID does not exist" containerID="92af1107195bebc153bf9352c2eb36552b3f9f73d72391de8b4a6f21b80a4bb7" Mar 20 08:50:11 crc kubenswrapper[4903]: I0320 08:50:11.411873 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"92af1107195bebc153bf9352c2eb36552b3f9f73d72391de8b4a6f21b80a4bb7"} err="failed to get container status 
\"92af1107195bebc153bf9352c2eb36552b3f9f73d72391de8b4a6f21b80a4bb7\": rpc error: code = NotFound desc = could not find container \"92af1107195bebc153bf9352c2eb36552b3f9f73d72391de8b4a6f21b80a4bb7\": container with ID starting with 92af1107195bebc153bf9352c2eb36552b3f9f73d72391de8b4a6f21b80a4bb7 not found: ID does not exist" Mar 20 08:50:11 crc kubenswrapper[4903]: I0320 08:50:11.411897 4903 scope.go:117] "RemoveContainer" containerID="b55e63c14895bcdb40403401521699e38dfe6a783a3590019bd4fb0770fe17f2" Mar 20 08:50:11 crc kubenswrapper[4903]: E0320 08:50:11.412238 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b55e63c14895bcdb40403401521699e38dfe6a783a3590019bd4fb0770fe17f2\": container with ID starting with b55e63c14895bcdb40403401521699e38dfe6a783a3590019bd4fb0770fe17f2 not found: ID does not exist" containerID="b55e63c14895bcdb40403401521699e38dfe6a783a3590019bd4fb0770fe17f2" Mar 20 08:50:11 crc kubenswrapper[4903]: I0320 08:50:11.412280 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b55e63c14895bcdb40403401521699e38dfe6a783a3590019bd4fb0770fe17f2"} err="failed to get container status \"b55e63c14895bcdb40403401521699e38dfe6a783a3590019bd4fb0770fe17f2\": rpc error: code = NotFound desc = could not find container \"b55e63c14895bcdb40403401521699e38dfe6a783a3590019bd4fb0770fe17f2\": container with ID starting with b55e63c14895bcdb40403401521699e38dfe6a783a3590019bd4fb0770fe17f2 not found: ID does not exist" Mar 20 08:50:11 crc kubenswrapper[4903]: I0320 08:50:11.412309 4903 scope.go:117] "RemoveContainer" containerID="2dff8db1a526f1e028fa24f1cf0a7a3b6053fb85fa1e7ea943870e339ab8ef45" Mar 20 08:50:11 crc kubenswrapper[4903]: E0320 08:50:11.412764 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2dff8db1a526f1e028fa24f1cf0a7a3b6053fb85fa1e7ea943870e339ab8ef45\": container with ID starting with 2dff8db1a526f1e028fa24f1cf0a7a3b6053fb85fa1e7ea943870e339ab8ef45 not found: ID does not exist" containerID="2dff8db1a526f1e028fa24f1cf0a7a3b6053fb85fa1e7ea943870e339ab8ef45" Mar 20 08:50:11 crc kubenswrapper[4903]: I0320 08:50:11.412802 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2dff8db1a526f1e028fa24f1cf0a7a3b6053fb85fa1e7ea943870e339ab8ef45"} err="failed to get container status \"2dff8db1a526f1e028fa24f1cf0a7a3b6053fb85fa1e7ea943870e339ab8ef45\": rpc error: code = NotFound desc = could not find container \"2dff8db1a526f1e028fa24f1cf0a7a3b6053fb85fa1e7ea943870e339ab8ef45\": container with ID starting with 2dff8db1a526f1e028fa24f1cf0a7a3b6053fb85fa1e7ea943870e339ab8ef45 not found: ID does not exist" Mar 20 08:50:11 crc kubenswrapper[4903]: I0320 08:50:11.412825 4903 scope.go:117] "RemoveContainer" containerID="ab944c856301685ebd77b2e7fef75ffca3eac6f59b30dda9a10daef1b8c70e88" Mar 20 08:50:11 crc kubenswrapper[4903]: E0320 08:50:11.413163 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ab944c856301685ebd77b2e7fef75ffca3eac6f59b30dda9a10daef1b8c70e88\": container with ID starting with ab944c856301685ebd77b2e7fef75ffca3eac6f59b30dda9a10daef1b8c70e88 not found: ID does not exist" containerID="ab944c856301685ebd77b2e7fef75ffca3eac6f59b30dda9a10daef1b8c70e88" Mar 20 08:50:11 crc kubenswrapper[4903]: I0320 08:50:11.413207 4903 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab944c856301685ebd77b2e7fef75ffca3eac6f59b30dda9a10daef1b8c70e88"} err="failed to get container status \"ab944c856301685ebd77b2e7fef75ffca3eac6f59b30dda9a10daef1b8c70e88\": rpc error: code = NotFound desc = could not find container \"ab944c856301685ebd77b2e7fef75ffca3eac6f59b30dda9a10daef1b8c70e88\": container with ID starting with ab944c856301685ebd77b2e7fef75ffca3eac6f59b30dda9a10daef1b8c70e88 not found: ID does not exist" Mar 20 08:50:11 crc kubenswrapper[4903]: I0320 08:50:11.413235 4903 scope.go:117] "RemoveContainer" containerID="cbe32b8386815ecd924f4abbb568339471c7efea68305c548d14baa5ac7f2324" Mar 20 08:50:11 crc kubenswrapper[4903]: I0320 08:50:11.432966 4903 scope.go:117] "RemoveContainer" containerID="ec70b5d9df40c275990e6b10b611cf7497dcf53a4d72ca11805f6bf11c5bc677" Mar 20 08:50:11 crc kubenswrapper[4903]: I0320 08:50:11.453540 4903 scope.go:117] "RemoveContainer" containerID="ef6aa1fdb1f8f9b74915a49b26f2800fbc354505df15acabc86a9628fcc4c35f" Mar 20 08:50:11 crc kubenswrapper[4903]: I0320 08:50:11.524971 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ccedd84e-d0d0-40b8-812c-3a57b41aee98" path="/var/lib/kubelet/pods/ccedd84e-d0d0-40b8-812c-3a57b41aee98/volumes" Mar 20 08:50:11 crc kubenswrapper[4903]: I0320 08:50:11.527376 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d69915e4-0df8-4d83-b096-962eadc1883f" path="/var/lib/kubelet/pods/d69915e4-0df8-4d83-b096-962eadc1883f/volumes" Mar 20 08:50:59 crc kubenswrapper[4903]: I0320 08:50:59.689810 4903 scope.go:117] "RemoveContainer" containerID="eb3dbc6f45f101429c739d7777bc9d208e66f80da26203fea7b6eb31ff6fd490" Mar 20 08:50:59 crc kubenswrapper[4903]: I0320 08:50:59.743090 4903 scope.go:117] "RemoveContainer" containerID="3f80ef0529df9f4da7f79cb6a6b5e235388ced9a4a663185ccd21fd5f820b009" Mar 20 08:50:59 crc kubenswrapper[4903]: I0320 08:50:59.806760 4903 scope.go:117] "RemoveContainer" containerID="17d52af7be15bf2eb59faf29a775496681d100d63125270b5888714485eae90c" Mar 20 08:50:59 crc kubenswrapper[4903]: I0320 08:50:59.828473 4903 scope.go:117] "RemoveContainer" containerID="6bedc484fc1ddfc3818fe8a5b573155e069bccef17ec805a7bdd654f8220e9b8" Mar 20 08:50:59 crc kubenswrapper[4903]: I0320 08:50:59.862177 4903 scope.go:117] "RemoveContainer" containerID="2a90a2925a40aac67b4421843577ba8b3be4d2b390467377735fe14ae6a6c32f" Mar 20 08:50:59 crc kubenswrapper[4903]: I0320 08:50:59.889690 4903 scope.go:117] "RemoveContainer" containerID="2be0f124027cc325dc4a812c2195ae34353c0f3552687b7f56a5dd08cd7ffdbd" Mar 20 08:50:59 crc kubenswrapper[4903]: I0320 08:50:59.918060 4903 scope.go:117] "RemoveContainer" containerID="24c424c306f16903d97df8a740c0324e47598833553f2a3494098232c957427d" Mar 20 08:50:59 crc kubenswrapper[4903]: I0320 08:50:59.943979 4903 scope.go:117] "RemoveContainer" containerID="565e3db7a9d288f5296940d2760d8a4d808442910047f2d8fa967bf716192bb5" Mar 20 08:50:59 crc kubenswrapper[4903]: I0320 08:50:59.968295 4903 scope.go:117] "RemoveContainer" containerID="2e4006812e1ded1b8c19b4cfe7112d6c217a64e73fe57851046d794a40a06e9c" Mar 20 08:50:59 crc kubenswrapper[4903]: I0320 08:50:59.992123 4903 scope.go:117] "RemoveContainer" containerID="4b25264630ca383860adbcdd6a815dbc27ba77766601fe09e1c25af53f7d43d9" Mar 20 08:51:00 crc kubenswrapper[4903]: I0320 08:51:00.030459 4903 scope.go:117] "RemoveContainer" containerID="5579e258c969d007185aad27798c30763e8bd565e14091159bc558d34757c14e" Mar 20 08:51:00 crc kubenswrapper[4903]: I0320 
08:51:00.067081 4903 scope.go:117] "RemoveContainer" containerID="390d83024e4e69985e2610318a454c1ce59f0eb179355db078862b76953ea0d2" Mar 20 08:51:00 crc kubenswrapper[4903]: I0320 08:51:00.087747 4903 scope.go:117] "RemoveContainer" containerID="ae7ef07f6710e3c50df8fdfb347b54fac262a0797dc427c28c37c346ed91d089" Mar 20 08:51:00 crc kubenswrapper[4903]: I0320 08:51:00.110632 4903 scope.go:117] "RemoveContainer" containerID="315928000ce5771f2b1094d24598a1282273c9d543fd01f4d6e5739b56bc3bab" Mar 20 08:51:00 crc kubenswrapper[4903]: I0320 08:51:00.140121 4903 scope.go:117] "RemoveContainer" containerID="daa45dc24f3e4bf07a60054cf5038e0a35924ee589ba20dd9f4b1795657490b2" Mar 20 08:51:00 crc kubenswrapper[4903]: I0320 08:51:00.166706 4903 scope.go:117] "RemoveContainer" containerID="81bf6290e0a3600cb73e207f93b56e418e5fa5670d3a82deb255b3f2895becbc" Mar 20 08:51:00 crc kubenswrapper[4903]: I0320 08:51:00.194528 4903 scope.go:117] "RemoveContainer" containerID="c63eea97f0c9e2d4513fd116c4d53a68a48e802327a50bbf881c9a689353d8fa" Mar 20 08:51:00 crc kubenswrapper[4903]: I0320 08:51:00.219730 4903 scope.go:117] "RemoveContainer" containerID="03a018cef030a0be0be44e08280bdfaf6515a35b4883ba9ce657fe82954c5842" Mar 20 08:51:50 crc kubenswrapper[4903]: I0320 08:51:50.834342 4903 patch_prober.go:28] interesting pod/machine-config-daemon-2ndsj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 08:51:50 crc kubenswrapper[4903]: I0320 08:51:50.834998 4903 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2ndsj" podUID="0e67af70-4211-4077-8f2b-0a00b8069e5a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 08:52:00 crc kubenswrapper[4903]: I0320 08:52:00.180170 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566612-jw6xc"] Mar 20 08:52:00 crc kubenswrapper[4903]: E0320 08:52:00.181154 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ccedd84e-d0d0-40b8-812c-3a57b41aee98" containerName="object-expirer" Mar 20 08:52:00 crc kubenswrapper[4903]: I0320 08:52:00.181173 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="ccedd84e-d0d0-40b8-812c-3a57b41aee98" containerName="object-expirer" Mar 20 08:52:00 crc kubenswrapper[4903]: E0320 08:52:00.181188 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ccedd84e-d0d0-40b8-812c-3a57b41aee98" containerName="account-server" Mar 20 08:52:00 crc kubenswrapper[4903]: I0320 08:52:00.181198 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="ccedd84e-d0d0-40b8-812c-3a57b41aee98" containerName="account-server" Mar 20 08:52:00 crc kubenswrapper[4903]: E0320 08:52:00.181210 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ccedd84e-d0d0-40b8-812c-3a57b41aee98" containerName="container-updater" Mar 20 08:52:00 crc kubenswrapper[4903]: I0320 08:52:00.181219 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="ccedd84e-d0d0-40b8-812c-3a57b41aee98" containerName="container-updater" Mar 20 08:52:00 crc kubenswrapper[4903]: E0320 08:52:00.181240 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d69915e4-0df8-4d83-b096-962eadc1883f" containerName="ovsdb-server-init" Mar 20 08:52:00 crc 
kubenswrapper[4903]: I0320 08:52:00.181251 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="d69915e4-0df8-4d83-b096-962eadc1883f" containerName="ovsdb-server-init" Mar 20 08:52:00 crc kubenswrapper[4903]: E0320 08:52:00.181270 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ccedd84e-d0d0-40b8-812c-3a57b41aee98" containerName="swift-recon-cron" Mar 20 08:52:00 crc kubenswrapper[4903]: I0320 08:52:00.181280 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="ccedd84e-d0d0-40b8-812c-3a57b41aee98" containerName="swift-recon-cron" Mar 20 08:52:00 crc kubenswrapper[4903]: E0320 08:52:00.181296 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ccedd84e-d0d0-40b8-812c-3a57b41aee98" containerName="account-replicator" Mar 20 08:52:00 crc kubenswrapper[4903]: I0320 08:52:00.181304 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="ccedd84e-d0d0-40b8-812c-3a57b41aee98" containerName="account-replicator" Mar 20 08:52:00 crc kubenswrapper[4903]: E0320 08:52:00.181323 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ccedd84e-d0d0-40b8-812c-3a57b41aee98" containerName="container-server" Mar 20 08:52:00 crc kubenswrapper[4903]: I0320 08:52:00.181333 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="ccedd84e-d0d0-40b8-812c-3a57b41aee98" containerName="container-server" Mar 20 08:52:00 crc kubenswrapper[4903]: E0320 08:52:00.181351 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ccedd84e-d0d0-40b8-812c-3a57b41aee98" containerName="object-updater" Mar 20 08:52:00 crc kubenswrapper[4903]: I0320 08:52:00.181361 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="ccedd84e-d0d0-40b8-812c-3a57b41aee98" containerName="object-updater" Mar 20 08:52:00 crc kubenswrapper[4903]: E0320 08:52:00.181375 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ccedd84e-d0d0-40b8-812c-3a57b41aee98" containerName="container-replicator" Mar 20 08:52:00 crc kubenswrapper[4903]: I0320 08:52:00.181384 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="ccedd84e-d0d0-40b8-812c-3a57b41aee98" containerName="container-replicator" Mar 20 08:52:00 crc kubenswrapper[4903]: E0320 08:52:00.181398 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ccedd84e-d0d0-40b8-812c-3a57b41aee98" containerName="account-auditor" Mar 20 08:52:00 crc kubenswrapper[4903]: I0320 08:52:00.181407 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="ccedd84e-d0d0-40b8-812c-3a57b41aee98" containerName="account-auditor" Mar 20 08:52:00 crc kubenswrapper[4903]: E0320 08:52:00.181423 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ccedd84e-d0d0-40b8-812c-3a57b41aee98" containerName="object-replicator" Mar 20 08:52:00 crc kubenswrapper[4903]: I0320 08:52:00.181432 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="ccedd84e-d0d0-40b8-812c-3a57b41aee98" containerName="object-replicator" Mar 20 08:52:00 crc kubenswrapper[4903]: E0320 08:52:00.181444 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ccedd84e-d0d0-40b8-812c-3a57b41aee98" containerName="rsync" Mar 20 08:52:00 crc kubenswrapper[4903]: I0320 08:52:00.181453 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="ccedd84e-d0d0-40b8-812c-3a57b41aee98" containerName="rsync" Mar 20 08:52:00 crc kubenswrapper[4903]: E0320 08:52:00.181463 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ccedd84e-d0d0-40b8-812c-3a57b41aee98" containerName="object-server" 
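[Annotation] The cpu_manager.go "RemoveStaleState: removing container" and state_mem.go "Deleted CPUSet assignment" entries in this stretch are the kubelet's resource managers discarding per-container CPU/memory pinning state for the pods it just removed (openstack/swift-storage-0 and openstack/ovn-controller-ovs-chrhv) while admitting the new openshift-infra/auto-csr-approver pod. A minimal sketch for cross-checking that state on the node follows; it assumes the upstream default checkpoint path and JSON layout, which this log does not itself confirm.

    # Hypothetical helper, not part of the log above: dump the kubelet CPU manager
    # checkpoint so "Deleted CPUSet assignment" entries can be cross-checked.
    # Path and JSON layout are the upstream Kubernetes defaults and may differ here.
    import json

    STATE_FILE = "/var/lib/kubelet/cpu_manager_state"  # assumed default location

    with open(STATE_FILE) as f:
        state = json.load(f)

    print("policy:", state.get("policyName"))
    print("defaultCpuSet:", state.get("defaultCpuSet"))

    # With the static policy, "entries" maps pod UID -> container name -> cpuset.
    # After RemoveStaleState, UIDs of deleted pods should no longer appear here.
    for pod_uid, containers in (state.get("entries") or {}).items():
        for name, cpus in containers.items():
            print(pod_uid, name, cpus)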
Mar 20 08:52:00 crc kubenswrapper[4903]: I0320 08:52:00.181474 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="ccedd84e-d0d0-40b8-812c-3a57b41aee98" containerName="object-server" Mar 20 08:52:00 crc kubenswrapper[4903]: E0320 08:52:00.181488 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d69915e4-0df8-4d83-b096-962eadc1883f" containerName="ovsdb-server" Mar 20 08:52:00 crc kubenswrapper[4903]: I0320 08:52:00.181499 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="d69915e4-0df8-4d83-b096-962eadc1883f" containerName="ovsdb-server" Mar 20 08:52:00 crc kubenswrapper[4903]: E0320 08:52:00.181518 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d69915e4-0df8-4d83-b096-962eadc1883f" containerName="ovs-vswitchd" Mar 20 08:52:00 crc kubenswrapper[4903]: I0320 08:52:00.181527 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="d69915e4-0df8-4d83-b096-962eadc1883f" containerName="ovs-vswitchd" Mar 20 08:52:00 crc kubenswrapper[4903]: E0320 08:52:00.181546 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46d6c81a-c1cb-48a7-95a1-a957c6a0fbea" containerName="oc" Mar 20 08:52:00 crc kubenswrapper[4903]: I0320 08:52:00.181556 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="46d6c81a-c1cb-48a7-95a1-a957c6a0fbea" containerName="oc" Mar 20 08:52:00 crc kubenswrapper[4903]: E0320 08:52:00.181567 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ccedd84e-d0d0-40b8-812c-3a57b41aee98" containerName="container-auditor" Mar 20 08:52:00 crc kubenswrapper[4903]: I0320 08:52:00.181577 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="ccedd84e-d0d0-40b8-812c-3a57b41aee98" containerName="container-auditor" Mar 20 08:52:00 crc kubenswrapper[4903]: E0320 08:52:00.181589 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ccedd84e-d0d0-40b8-812c-3a57b41aee98" containerName="account-reaper" Mar 20 08:52:00 crc kubenswrapper[4903]: I0320 08:52:00.181599 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="ccedd84e-d0d0-40b8-812c-3a57b41aee98" containerName="account-reaper" Mar 20 08:52:00 crc kubenswrapper[4903]: E0320 08:52:00.181613 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ccedd84e-d0d0-40b8-812c-3a57b41aee98" containerName="object-auditor" Mar 20 08:52:00 crc kubenswrapper[4903]: I0320 08:52:00.181622 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="ccedd84e-d0d0-40b8-812c-3a57b41aee98" containerName="object-auditor" Mar 20 08:52:00 crc kubenswrapper[4903]: I0320 08:52:00.181828 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="d69915e4-0df8-4d83-b096-962eadc1883f" containerName="ovs-vswitchd" Mar 20 08:52:00 crc kubenswrapper[4903]: I0320 08:52:00.181854 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="ccedd84e-d0d0-40b8-812c-3a57b41aee98" containerName="object-updater" Mar 20 08:52:00 crc kubenswrapper[4903]: I0320 08:52:00.181863 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="ccedd84e-d0d0-40b8-812c-3a57b41aee98" containerName="swift-recon-cron" Mar 20 08:52:00 crc kubenswrapper[4903]: I0320 08:52:00.181877 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="ccedd84e-d0d0-40b8-812c-3a57b41aee98" containerName="container-server" Mar 20 08:52:00 crc kubenswrapper[4903]: I0320 08:52:00.181895 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="ccedd84e-d0d0-40b8-812c-3a57b41aee98" containerName="object-auditor" Mar 20 
08:52:00 crc kubenswrapper[4903]: I0320 08:52:00.181908 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="ccedd84e-d0d0-40b8-812c-3a57b41aee98" containerName="container-updater" Mar 20 08:52:00 crc kubenswrapper[4903]: I0320 08:52:00.181918 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="ccedd84e-d0d0-40b8-812c-3a57b41aee98" containerName="account-server" Mar 20 08:52:00 crc kubenswrapper[4903]: I0320 08:52:00.181937 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="ccedd84e-d0d0-40b8-812c-3a57b41aee98" containerName="account-reaper" Mar 20 08:52:00 crc kubenswrapper[4903]: I0320 08:52:00.181948 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="ccedd84e-d0d0-40b8-812c-3a57b41aee98" containerName="container-replicator" Mar 20 08:52:00 crc kubenswrapper[4903]: I0320 08:52:00.181965 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="46d6c81a-c1cb-48a7-95a1-a957c6a0fbea" containerName="oc" Mar 20 08:52:00 crc kubenswrapper[4903]: I0320 08:52:00.181978 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="ccedd84e-d0d0-40b8-812c-3a57b41aee98" containerName="object-expirer" Mar 20 08:52:00 crc kubenswrapper[4903]: I0320 08:52:00.181995 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="ccedd84e-d0d0-40b8-812c-3a57b41aee98" containerName="account-replicator" Mar 20 08:52:00 crc kubenswrapper[4903]: I0320 08:52:00.182011 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="ccedd84e-d0d0-40b8-812c-3a57b41aee98" containerName="object-server" Mar 20 08:52:00 crc kubenswrapper[4903]: I0320 08:52:00.182027 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="ccedd84e-d0d0-40b8-812c-3a57b41aee98" containerName="object-replicator" Mar 20 08:52:00 crc kubenswrapper[4903]: I0320 08:52:00.182073 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="d69915e4-0df8-4d83-b096-962eadc1883f" containerName="ovsdb-server" Mar 20 08:52:00 crc kubenswrapper[4903]: I0320 08:52:00.182089 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="ccedd84e-d0d0-40b8-812c-3a57b41aee98" containerName="container-auditor" Mar 20 08:52:00 crc kubenswrapper[4903]: I0320 08:52:00.182102 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="ccedd84e-d0d0-40b8-812c-3a57b41aee98" containerName="account-auditor" Mar 20 08:52:00 crc kubenswrapper[4903]: I0320 08:52:00.182119 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="ccedd84e-d0d0-40b8-812c-3a57b41aee98" containerName="rsync" Mar 20 08:52:00 crc kubenswrapper[4903]: I0320 08:52:00.182768 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566612-jw6xc" Mar 20 08:52:00 crc kubenswrapper[4903]: I0320 08:52:00.184964 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-smc2q" Mar 20 08:52:00 crc kubenswrapper[4903]: I0320 08:52:00.185257 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 08:52:00 crc kubenswrapper[4903]: I0320 08:52:00.185440 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 08:52:00 crc kubenswrapper[4903]: I0320 08:52:00.200542 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566612-jw6xc"] Mar 20 08:52:00 crc kubenswrapper[4903]: I0320 08:52:00.296983 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hmqvq\" (UniqueName: \"kubernetes.io/projected/82f5c42c-0fe9-4880-9d58-a683d748b424-kube-api-access-hmqvq\") pod \"auto-csr-approver-29566612-jw6xc\" (UID: \"82f5c42c-0fe9-4880-9d58-a683d748b424\") " pod="openshift-infra/auto-csr-approver-29566612-jw6xc" Mar 20 08:52:00 crc kubenswrapper[4903]: I0320 08:52:00.398482 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hmqvq\" (UniqueName: \"kubernetes.io/projected/82f5c42c-0fe9-4880-9d58-a683d748b424-kube-api-access-hmqvq\") pod \"auto-csr-approver-29566612-jw6xc\" (UID: \"82f5c42c-0fe9-4880-9d58-a683d748b424\") " pod="openshift-infra/auto-csr-approver-29566612-jw6xc" Mar 20 08:52:00 crc kubenswrapper[4903]: I0320 08:52:00.431810 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hmqvq\" (UniqueName: \"kubernetes.io/projected/82f5c42c-0fe9-4880-9d58-a683d748b424-kube-api-access-hmqvq\") pod \"auto-csr-approver-29566612-jw6xc\" (UID: \"82f5c42c-0fe9-4880-9d58-a683d748b424\") " pod="openshift-infra/auto-csr-approver-29566612-jw6xc" Mar 20 08:52:00 crc kubenswrapper[4903]: I0320 08:52:00.509403 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566612-jw6xc" Mar 20 08:52:00 crc kubenswrapper[4903]: I0320 08:52:00.605294 4903 scope.go:117] "RemoveContainer" containerID="9c75fa540c0ceff28035719c4c9810d7dcb95337b1bd588ad42d6c195849ec20" Mar 20 08:52:00 crc kubenswrapper[4903]: I0320 08:52:00.685752 4903 scope.go:117] "RemoveContainer" containerID="4436cc84a4eb5d16d50594541e6aa5ecc7f4b291c369bb3dffbc3aee6052c237" Mar 20 08:52:00 crc kubenswrapper[4903]: I0320 08:52:00.726079 4903 scope.go:117] "RemoveContainer" containerID="4f489dcee69a113289e6c636cdf34b9bb86247d0274a95e46fc1731cbaeacb45" Mar 20 08:52:00 crc kubenswrapper[4903]: I0320 08:52:00.768587 4903 scope.go:117] "RemoveContainer" containerID="f1585936de585b08572047adfaea7713e1e4f1a92c343c05347a48965f635c40" Mar 20 08:52:00 crc kubenswrapper[4903]: I0320 08:52:00.813273 4903 scope.go:117] "RemoveContainer" containerID="f63032427faa40de3f4e75fa0bf189cc5949d6ee6ff43b3dc51e606412311d6f" Mar 20 08:52:00 crc kubenswrapper[4903]: I0320 08:52:00.846659 4903 scope.go:117] "RemoveContainer" containerID="0269026cf7982aeeca2edca95914706d78713f2ca9351fcff90c8fced514cb93" Mar 20 08:52:00 crc kubenswrapper[4903]: I0320 08:52:00.868517 4903 scope.go:117] "RemoveContainer" containerID="dbce3ffc947f1f39ec5ef071fa419c575a3bcb33a8993c03be727d2e31281d58" Mar 20 08:52:00 crc kubenswrapper[4903]: I0320 08:52:00.978152 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566612-jw6xc"] Mar 20 08:52:01 crc kubenswrapper[4903]: I0320 08:52:01.247535 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566612-jw6xc" event={"ID":"82f5c42c-0fe9-4880-9d58-a683d748b424","Type":"ContainerStarted","Data":"c9fe0657e7da30861a5b994f3a466a3d75f09e346b60da58e5a30cd63e2322e1"} Mar 20 08:52:03 crc kubenswrapper[4903]: I0320 08:52:03.268096 4903 generic.go:334] "Generic (PLEG): container finished" podID="82f5c42c-0fe9-4880-9d58-a683d748b424" containerID="b868cda2fbba12d39d2c4a4c30d9c08efe46ce6fb9af3c55eeafad8f7cac2afd" exitCode=0 Mar 20 08:52:03 crc kubenswrapper[4903]: I0320 08:52:03.268202 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566612-jw6xc" event={"ID":"82f5c42c-0fe9-4880-9d58-a683d748b424","Type":"ContainerDied","Data":"b868cda2fbba12d39d2c4a4c30d9c08efe46ce6fb9af3c55eeafad8f7cac2afd"} Mar 20 08:52:04 crc kubenswrapper[4903]: I0320 08:52:04.681802 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566612-jw6xc" Mar 20 08:52:04 crc kubenswrapper[4903]: I0320 08:52:04.869262 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hmqvq\" (UniqueName: \"kubernetes.io/projected/82f5c42c-0fe9-4880-9d58-a683d748b424-kube-api-access-hmqvq\") pod \"82f5c42c-0fe9-4880-9d58-a683d748b424\" (UID: \"82f5c42c-0fe9-4880-9d58-a683d748b424\") " Mar 20 08:52:04 crc kubenswrapper[4903]: I0320 08:52:04.879191 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/82f5c42c-0fe9-4880-9d58-a683d748b424-kube-api-access-hmqvq" (OuterVolumeSpecName: "kube-api-access-hmqvq") pod "82f5c42c-0fe9-4880-9d58-a683d748b424" (UID: "82f5c42c-0fe9-4880-9d58-a683d748b424"). InnerVolumeSpecName "kube-api-access-hmqvq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:52:04 crc kubenswrapper[4903]: I0320 08:52:04.971810 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hmqvq\" (UniqueName: \"kubernetes.io/projected/82f5c42c-0fe9-4880-9d58-a683d748b424-kube-api-access-hmqvq\") on node \"crc\" DevicePath \"\"" Mar 20 08:52:05 crc kubenswrapper[4903]: I0320 08:52:05.290352 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566612-jw6xc" event={"ID":"82f5c42c-0fe9-4880-9d58-a683d748b424","Type":"ContainerDied","Data":"c9fe0657e7da30861a5b994f3a466a3d75f09e346b60da58e5a30cd63e2322e1"} Mar 20 08:52:05 crc kubenswrapper[4903]: I0320 08:52:05.290769 4903 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c9fe0657e7da30861a5b994f3a466a3d75f09e346b60da58e5a30cd63e2322e1" Mar 20 08:52:05 crc kubenswrapper[4903]: I0320 08:52:05.290433 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566612-jw6xc" Mar 20 08:52:05 crc kubenswrapper[4903]: I0320 08:52:05.761341 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566606-6n2pc"] Mar 20 08:52:05 crc kubenswrapper[4903]: I0320 08:52:05.766170 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566606-6n2pc"] Mar 20 08:52:07 crc kubenswrapper[4903]: I0320 08:52:07.506682 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="70a601b8-5c2b-4b68-a9b1-d1434eab6965" path="/var/lib/kubelet/pods/70a601b8-5c2b-4b68-a9b1-d1434eab6965/volumes" Mar 20 08:52:20 crc kubenswrapper[4903]: I0320 08:52:20.833560 4903 patch_prober.go:28] interesting pod/machine-config-daemon-2ndsj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 08:52:20 crc kubenswrapper[4903]: I0320 08:52:20.835296 4903 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2ndsj" podUID="0e67af70-4211-4077-8f2b-0a00b8069e5a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 08:52:50 crc kubenswrapper[4903]: I0320 08:52:50.834659 4903 patch_prober.go:28] interesting pod/machine-config-daemon-2ndsj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 08:52:50 crc kubenswrapper[4903]: I0320 08:52:50.835699 4903 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2ndsj" podUID="0e67af70-4211-4077-8f2b-0a00b8069e5a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 08:52:50 crc kubenswrapper[4903]: I0320 08:52:50.835780 4903 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2ndsj" Mar 20 08:52:50 crc kubenswrapper[4903]: I0320 08:52:50.836896 4903 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"574e7fb2455d2017063433881e6f792136a3cc4bdc3d53ada47e32dd67839ac9"} pod="openshift-machine-config-operator/machine-config-daemon-2ndsj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 08:52:50 crc kubenswrapper[4903]: I0320 08:52:50.837009 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2ndsj" podUID="0e67af70-4211-4077-8f2b-0a00b8069e5a" containerName="machine-config-daemon" containerID="cri-o://574e7fb2455d2017063433881e6f792136a3cc4bdc3d53ada47e32dd67839ac9" gracePeriod=600 Mar 20 08:52:50 crc kubenswrapper[4903]: E0320 08:52:50.983982 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2ndsj_openshift-machine-config-operator(0e67af70-4211-4077-8f2b-0a00b8069e5a)\"" pod="openshift-machine-config-operator/machine-config-daemon-2ndsj" podUID="0e67af70-4211-4077-8f2b-0a00b8069e5a" Mar 20 08:52:51 crc kubenswrapper[4903]: I0320 08:52:51.763334 4903 generic.go:334] "Generic (PLEG): container finished" podID="0e67af70-4211-4077-8f2b-0a00b8069e5a" containerID="574e7fb2455d2017063433881e6f792136a3cc4bdc3d53ada47e32dd67839ac9" exitCode=0 Mar 20 08:52:51 crc kubenswrapper[4903]: I0320 08:52:51.763410 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2ndsj" event={"ID":"0e67af70-4211-4077-8f2b-0a00b8069e5a","Type":"ContainerDied","Data":"574e7fb2455d2017063433881e6f792136a3cc4bdc3d53ada47e32dd67839ac9"} Mar 20 08:52:51 crc kubenswrapper[4903]: I0320 08:52:51.763508 4903 scope.go:117] "RemoveContainer" containerID="6f3554d39d020685d0868b6a191ed76faf2b526f6e3fae809ae7af6a7a7f9269" Mar 20 08:52:51 crc kubenswrapper[4903]: I0320 08:52:51.764278 4903 scope.go:117] "RemoveContainer" containerID="574e7fb2455d2017063433881e6f792136a3cc4bdc3d53ada47e32dd67839ac9" Mar 20 08:52:51 crc kubenswrapper[4903]: E0320 08:52:51.764810 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2ndsj_openshift-machine-config-operator(0e67af70-4211-4077-8f2b-0a00b8069e5a)\"" pod="openshift-machine-config-operator/machine-config-daemon-2ndsj" podUID="0e67af70-4211-4077-8f2b-0a00b8069e5a" Mar 20 08:53:01 crc kubenswrapper[4903]: I0320 08:53:01.042223 4903 scope.go:117] "RemoveContainer" containerID="dc0246b5033cba2412fe3761cbb94c3af09040b7c4ae5b202375dede12eaaf53" Mar 20 08:53:01 crc kubenswrapper[4903]: I0320 08:53:01.105892 4903 scope.go:117] "RemoveContainer" containerID="54b0f1f5dfa2a405752216bff60fa798790887221bd217f21ee213ca02e2318b" Mar 20 08:53:01 crc kubenswrapper[4903]: I0320 08:53:01.136589 4903 scope.go:117] "RemoveContainer" containerID="5ec65c1f6995adbacb8d006dd9be17bea8e020a50d125daca97eaf0e3f044f8b" Mar 20 08:53:01 crc kubenswrapper[4903]: I0320 08:53:01.167828 4903 scope.go:117] "RemoveContainer" containerID="10f2b77e4d99df665d6349b311dc9b4cfe076067c636391f3d7e6e34202c3750" Mar 20 08:53:01 crc kubenswrapper[4903]: I0320 08:53:01.213850 4903 scope.go:117] "RemoveContainer" containerID="18781c07609ad0d94ec5f46ebf8058362d29f12af1242feb7d305ab16b0765b6" Mar 20 08:53:01 crc kubenswrapper[4903]: I0320 08:53:01.234470 
4903 scope.go:117] "RemoveContainer" containerID="7c33a4aacf7c993156b0df84419ae09f7daa2d31a1f364e7cd90f2b85802e23c" Mar 20 08:53:01 crc kubenswrapper[4903]: I0320 08:53:01.265476 4903 scope.go:117] "RemoveContainer" containerID="ad0a1b89ae2c9dff077b2a87669be8cab9c456620225b30cbba84adc397814f5" Mar 20 08:53:01 crc kubenswrapper[4903]: I0320 08:53:01.293878 4903 scope.go:117] "RemoveContainer" containerID="5a72ab8d51aaf45e6540e7fe5f23555acdf41834cf991b22ab1ef61e697dffa1" Mar 20 08:53:01 crc kubenswrapper[4903]: I0320 08:53:01.312652 4903 scope.go:117] "RemoveContainer" containerID="3fd97498ef71fcafedc36c00ac0f24cd90f9cbaeb360b1a6cbd7d84155f621c6" Mar 20 08:53:01 crc kubenswrapper[4903]: I0320 08:53:01.334072 4903 scope.go:117] "RemoveContainer" containerID="228bbc6c0201c5b8f848a584ece52afe6a644153fe624d7108fce309a34c9341" Mar 20 08:53:02 crc kubenswrapper[4903]: I0320 08:53:02.492182 4903 scope.go:117] "RemoveContainer" containerID="574e7fb2455d2017063433881e6f792136a3cc4bdc3d53ada47e32dd67839ac9" Mar 20 08:53:02 crc kubenswrapper[4903]: E0320 08:53:02.492616 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2ndsj_openshift-machine-config-operator(0e67af70-4211-4077-8f2b-0a00b8069e5a)\"" pod="openshift-machine-config-operator/machine-config-daemon-2ndsj" podUID="0e67af70-4211-4077-8f2b-0a00b8069e5a" Mar 20 08:53:14 crc kubenswrapper[4903]: I0320 08:53:14.490889 4903 scope.go:117] "RemoveContainer" containerID="574e7fb2455d2017063433881e6f792136a3cc4bdc3d53ada47e32dd67839ac9" Mar 20 08:53:14 crc kubenswrapper[4903]: E0320 08:53:14.491940 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2ndsj_openshift-machine-config-operator(0e67af70-4211-4077-8f2b-0a00b8069e5a)\"" pod="openshift-machine-config-operator/machine-config-daemon-2ndsj" podUID="0e67af70-4211-4077-8f2b-0a00b8069e5a" Mar 20 08:53:29 crc kubenswrapper[4903]: I0320 08:53:29.491396 4903 scope.go:117] "RemoveContainer" containerID="574e7fb2455d2017063433881e6f792136a3cc4bdc3d53ada47e32dd67839ac9" Mar 20 08:53:29 crc kubenswrapper[4903]: E0320 08:53:29.492520 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2ndsj_openshift-machine-config-operator(0e67af70-4211-4077-8f2b-0a00b8069e5a)\"" pod="openshift-machine-config-operator/machine-config-daemon-2ndsj" podUID="0e67af70-4211-4077-8f2b-0a00b8069e5a" Mar 20 08:53:40 crc kubenswrapper[4903]: I0320 08:53:40.492026 4903 scope.go:117] "RemoveContainer" containerID="574e7fb2455d2017063433881e6f792136a3cc4bdc3d53ada47e32dd67839ac9" Mar 20 08:53:40 crc kubenswrapper[4903]: E0320 08:53:40.493301 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2ndsj_openshift-machine-config-operator(0e67af70-4211-4077-8f2b-0a00b8069e5a)\"" pod="openshift-machine-config-operator/machine-config-daemon-2ndsj" podUID="0e67af70-4211-4077-8f2b-0a00b8069e5a" Mar 20 08:53:52 crc 
kubenswrapper[4903]: I0320 08:53:52.491410 4903 scope.go:117] "RemoveContainer" containerID="574e7fb2455d2017063433881e6f792136a3cc4bdc3d53ada47e32dd67839ac9" Mar 20 08:53:52 crc kubenswrapper[4903]: E0320 08:53:52.495852 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2ndsj_openshift-machine-config-operator(0e67af70-4211-4077-8f2b-0a00b8069e5a)\"" pod="openshift-machine-config-operator/machine-config-daemon-2ndsj" podUID="0e67af70-4211-4077-8f2b-0a00b8069e5a" Mar 20 08:54:00 crc kubenswrapper[4903]: I0320 08:54:00.168760 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566614-jn9hf"] Mar 20 08:54:00 crc kubenswrapper[4903]: E0320 08:54:00.170044 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82f5c42c-0fe9-4880-9d58-a683d748b424" containerName="oc" Mar 20 08:54:00 crc kubenswrapper[4903]: I0320 08:54:00.170108 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="82f5c42c-0fe9-4880-9d58-a683d748b424" containerName="oc" Mar 20 08:54:00 crc kubenswrapper[4903]: I0320 08:54:00.170488 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="82f5c42c-0fe9-4880-9d58-a683d748b424" containerName="oc" Mar 20 08:54:00 crc kubenswrapper[4903]: I0320 08:54:00.171475 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566614-jn9hf" Mar 20 08:54:00 crc kubenswrapper[4903]: I0320 08:54:00.175336 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-smc2q" Mar 20 08:54:00 crc kubenswrapper[4903]: I0320 08:54:00.177079 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 08:54:00 crc kubenswrapper[4903]: I0320 08:54:00.177257 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 08:54:00 crc kubenswrapper[4903]: I0320 08:54:00.185359 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566614-jn9hf"] Mar 20 08:54:00 crc kubenswrapper[4903]: I0320 08:54:00.271902 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5sj7r\" (UniqueName: \"kubernetes.io/projected/4d84f4c2-e78c-47fa-b054-324606a19025-kube-api-access-5sj7r\") pod \"auto-csr-approver-29566614-jn9hf\" (UID: \"4d84f4c2-e78c-47fa-b054-324606a19025\") " pod="openshift-infra/auto-csr-approver-29566614-jn9hf" Mar 20 08:54:00 crc kubenswrapper[4903]: I0320 08:54:00.373220 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5sj7r\" (UniqueName: \"kubernetes.io/projected/4d84f4c2-e78c-47fa-b054-324606a19025-kube-api-access-5sj7r\") pod \"auto-csr-approver-29566614-jn9hf\" (UID: \"4d84f4c2-e78c-47fa-b054-324606a19025\") " pod="openshift-infra/auto-csr-approver-29566614-jn9hf" Mar 20 08:54:00 crc kubenswrapper[4903]: I0320 08:54:00.395905 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5sj7r\" (UniqueName: \"kubernetes.io/projected/4d84f4c2-e78c-47fa-b054-324606a19025-kube-api-access-5sj7r\") pod \"auto-csr-approver-29566614-jn9hf\" (UID: \"4d84f4c2-e78c-47fa-b054-324606a19025\") " pod="openshift-infra/auto-csr-approver-29566614-jn9hf" Mar 20 
08:54:00 crc kubenswrapper[4903]: I0320 08:54:00.538903 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566614-jn9hf" Mar 20 08:54:00 crc kubenswrapper[4903]: I0320 08:54:00.822722 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566614-jn9hf"] Mar 20 08:54:00 crc kubenswrapper[4903]: I0320 08:54:00.831615 4903 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 08:54:01 crc kubenswrapper[4903]: I0320 08:54:01.461908 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566614-jn9hf" event={"ID":"4d84f4c2-e78c-47fa-b054-324606a19025","Type":"ContainerStarted","Data":"0e4ed631b0ac6e76e86e298d1baec2f0041913c72318b5e491d208ea53580487"} Mar 20 08:54:01 crc kubenswrapper[4903]: I0320 08:54:01.495603 4903 scope.go:117] "RemoveContainer" containerID="d66dd250aac30527302661a2ae22a91d93dd0d6f8cc19055dd8957b946b6d63d" Mar 20 08:54:01 crc kubenswrapper[4903]: I0320 08:54:01.563208 4903 scope.go:117] "RemoveContainer" containerID="ee90dea62199fe22bb435b923f62eb94b35ac85058152fc088a3cfbcfbe6f805" Mar 20 08:54:02 crc kubenswrapper[4903]: I0320 08:54:02.469779 4903 generic.go:334] "Generic (PLEG): container finished" podID="4d84f4c2-e78c-47fa-b054-324606a19025" containerID="33276b6ec65c303260a6ac3a091a05f25e49d202c493ae9b040cd8f6ae75d66c" exitCode=0 Mar 20 08:54:02 crc kubenswrapper[4903]: I0320 08:54:02.469858 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566614-jn9hf" event={"ID":"4d84f4c2-e78c-47fa-b054-324606a19025","Type":"ContainerDied","Data":"33276b6ec65c303260a6ac3a091a05f25e49d202c493ae9b040cd8f6ae75d66c"} Mar 20 08:54:03 crc kubenswrapper[4903]: I0320 08:54:03.750387 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566614-jn9hf" Mar 20 08:54:03 crc kubenswrapper[4903]: I0320 08:54:03.833221 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5sj7r\" (UniqueName: \"kubernetes.io/projected/4d84f4c2-e78c-47fa-b054-324606a19025-kube-api-access-5sj7r\") pod \"4d84f4c2-e78c-47fa-b054-324606a19025\" (UID: \"4d84f4c2-e78c-47fa-b054-324606a19025\") " Mar 20 08:54:03 crc kubenswrapper[4903]: I0320 08:54:03.839447 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d84f4c2-e78c-47fa-b054-324606a19025-kube-api-access-5sj7r" (OuterVolumeSpecName: "kube-api-access-5sj7r") pod "4d84f4c2-e78c-47fa-b054-324606a19025" (UID: "4d84f4c2-e78c-47fa-b054-324606a19025"). InnerVolumeSpecName "kube-api-access-5sj7r". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:54:03 crc kubenswrapper[4903]: I0320 08:54:03.935619 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5sj7r\" (UniqueName: \"kubernetes.io/projected/4d84f4c2-e78c-47fa-b054-324606a19025-kube-api-access-5sj7r\") on node \"crc\" DevicePath \"\"" Mar 20 08:54:04 crc kubenswrapper[4903]: I0320 08:54:04.487518 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566614-jn9hf" event={"ID":"4d84f4c2-e78c-47fa-b054-324606a19025","Type":"ContainerDied","Data":"0e4ed631b0ac6e76e86e298d1baec2f0041913c72318b5e491d208ea53580487"} Mar 20 08:54:04 crc kubenswrapper[4903]: I0320 08:54:04.487564 4903 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0e4ed631b0ac6e76e86e298d1baec2f0041913c72318b5e491d208ea53580487" Mar 20 08:54:04 crc kubenswrapper[4903]: I0320 08:54:04.487592 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566614-jn9hf" Mar 20 08:54:04 crc kubenswrapper[4903]: I0320 08:54:04.818377 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566608-982dt"] Mar 20 08:54:04 crc kubenswrapper[4903]: I0320 08:54:04.823959 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566608-982dt"] Mar 20 08:54:05 crc kubenswrapper[4903]: I0320 08:54:05.497566 4903 scope.go:117] "RemoveContainer" containerID="574e7fb2455d2017063433881e6f792136a3cc4bdc3d53ada47e32dd67839ac9" Mar 20 08:54:05 crc kubenswrapper[4903]: E0320 08:54:05.498267 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2ndsj_openshift-machine-config-operator(0e67af70-4211-4077-8f2b-0a00b8069e5a)\"" pod="openshift-machine-config-operator/machine-config-daemon-2ndsj" podUID="0e67af70-4211-4077-8f2b-0a00b8069e5a" Mar 20 08:54:05 crc kubenswrapper[4903]: I0320 08:54:05.500828 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="403ddea3-182d-4078-ae29-8bf03ce54cb5" path="/var/lib/kubelet/pods/403ddea3-182d-4078-ae29-8bf03ce54cb5/volumes" Mar 20 08:54:18 crc kubenswrapper[4903]: I0320 08:54:18.491222 4903 scope.go:117] "RemoveContainer" containerID="574e7fb2455d2017063433881e6f792136a3cc4bdc3d53ada47e32dd67839ac9" Mar 20 08:54:18 crc kubenswrapper[4903]: E0320 08:54:18.491817 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2ndsj_openshift-machine-config-operator(0e67af70-4211-4077-8f2b-0a00b8069e5a)\"" pod="openshift-machine-config-operator/machine-config-daemon-2ndsj" podUID="0e67af70-4211-4077-8f2b-0a00b8069e5a" Mar 20 08:54:33 crc kubenswrapper[4903]: I0320 08:54:33.491133 4903 scope.go:117] "RemoveContainer" containerID="574e7fb2455d2017063433881e6f792136a3cc4bdc3d53ada47e32dd67839ac9" Mar 20 08:54:33 crc kubenswrapper[4903]: E0320 08:54:33.493967 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-2ndsj_openshift-machine-config-operator(0e67af70-4211-4077-8f2b-0a00b8069e5a)\"" pod="openshift-machine-config-operator/machine-config-daemon-2ndsj" podUID="0e67af70-4211-4077-8f2b-0a00b8069e5a" Mar 20 08:54:44 crc kubenswrapper[4903]: I0320 08:54:44.490937 4903 scope.go:117] "RemoveContainer" containerID="574e7fb2455d2017063433881e6f792136a3cc4bdc3d53ada47e32dd67839ac9" Mar 20 08:54:44 crc kubenswrapper[4903]: E0320 08:54:44.491614 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2ndsj_openshift-machine-config-operator(0e67af70-4211-4077-8f2b-0a00b8069e5a)\"" pod="openshift-machine-config-operator/machine-config-daemon-2ndsj" podUID="0e67af70-4211-4077-8f2b-0a00b8069e5a" Mar 20 08:54:57 crc kubenswrapper[4903]: I0320 08:54:57.491322 4903 scope.go:117] "RemoveContainer" containerID="574e7fb2455d2017063433881e6f792136a3cc4bdc3d53ada47e32dd67839ac9" Mar 20 08:54:57 crc kubenswrapper[4903]: E0320 08:54:57.492977 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2ndsj_openshift-machine-config-operator(0e67af70-4211-4077-8f2b-0a00b8069e5a)\"" pod="openshift-machine-config-operator/machine-config-daemon-2ndsj" podUID="0e67af70-4211-4077-8f2b-0a00b8069e5a" Mar 20 08:54:59 crc kubenswrapper[4903]: I0320 08:54:59.531958 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-6lpf9"] Mar 20 08:54:59 crc kubenswrapper[4903]: E0320 08:54:59.532317 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d84f4c2-e78c-47fa-b054-324606a19025" containerName="oc" Mar 20 08:54:59 crc kubenswrapper[4903]: I0320 08:54:59.532334 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d84f4c2-e78c-47fa-b054-324606a19025" containerName="oc" Mar 20 08:54:59 crc kubenswrapper[4903]: I0320 08:54:59.532512 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d84f4c2-e78c-47fa-b054-324606a19025" containerName="oc" Mar 20 08:54:59 crc kubenswrapper[4903]: I0320 08:54:59.533728 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-6lpf9" Mar 20 08:54:59 crc kubenswrapper[4903]: I0320 08:54:59.545695 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6lpf9"] Mar 20 08:54:59 crc kubenswrapper[4903]: I0320 08:54:59.579902 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e75c6b5b-3812-4cd2-b473-b8e8bdf823f1-utilities\") pod \"community-operators-6lpf9\" (UID: \"e75c6b5b-3812-4cd2-b473-b8e8bdf823f1\") " pod="openshift-marketplace/community-operators-6lpf9" Mar 20 08:54:59 crc kubenswrapper[4903]: I0320 08:54:59.579977 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zbg48\" (UniqueName: \"kubernetes.io/projected/e75c6b5b-3812-4cd2-b473-b8e8bdf823f1-kube-api-access-zbg48\") pod \"community-operators-6lpf9\" (UID: \"e75c6b5b-3812-4cd2-b473-b8e8bdf823f1\") " pod="openshift-marketplace/community-operators-6lpf9" Mar 20 08:54:59 crc kubenswrapper[4903]: I0320 08:54:59.580008 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e75c6b5b-3812-4cd2-b473-b8e8bdf823f1-catalog-content\") pod \"community-operators-6lpf9\" (UID: \"e75c6b5b-3812-4cd2-b473-b8e8bdf823f1\") " pod="openshift-marketplace/community-operators-6lpf9" Mar 20 08:54:59 crc kubenswrapper[4903]: I0320 08:54:59.682280 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e75c6b5b-3812-4cd2-b473-b8e8bdf823f1-utilities\") pod \"community-operators-6lpf9\" (UID: \"e75c6b5b-3812-4cd2-b473-b8e8bdf823f1\") " pod="openshift-marketplace/community-operators-6lpf9" Mar 20 08:54:59 crc kubenswrapper[4903]: I0320 08:54:59.682364 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zbg48\" (UniqueName: \"kubernetes.io/projected/e75c6b5b-3812-4cd2-b473-b8e8bdf823f1-kube-api-access-zbg48\") pod \"community-operators-6lpf9\" (UID: \"e75c6b5b-3812-4cd2-b473-b8e8bdf823f1\") " pod="openshift-marketplace/community-operators-6lpf9" Mar 20 08:54:59 crc kubenswrapper[4903]: I0320 08:54:59.682403 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e75c6b5b-3812-4cd2-b473-b8e8bdf823f1-catalog-content\") pod \"community-operators-6lpf9\" (UID: \"e75c6b5b-3812-4cd2-b473-b8e8bdf823f1\") " pod="openshift-marketplace/community-operators-6lpf9" Mar 20 08:54:59 crc kubenswrapper[4903]: I0320 08:54:59.683145 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e75c6b5b-3812-4cd2-b473-b8e8bdf823f1-catalog-content\") pod \"community-operators-6lpf9\" (UID: \"e75c6b5b-3812-4cd2-b473-b8e8bdf823f1\") " pod="openshift-marketplace/community-operators-6lpf9" Mar 20 08:54:59 crc kubenswrapper[4903]: I0320 08:54:59.683165 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e75c6b5b-3812-4cd2-b473-b8e8bdf823f1-utilities\") pod \"community-operators-6lpf9\" (UID: \"e75c6b5b-3812-4cd2-b473-b8e8bdf823f1\") " pod="openshift-marketplace/community-operators-6lpf9" Mar 20 08:54:59 crc kubenswrapper[4903]: I0320 08:54:59.722122 4903 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-zbg48\" (UniqueName: \"kubernetes.io/projected/e75c6b5b-3812-4cd2-b473-b8e8bdf823f1-kube-api-access-zbg48\") pod \"community-operators-6lpf9\" (UID: \"e75c6b5b-3812-4cd2-b473-b8e8bdf823f1\") " pod="openshift-marketplace/community-operators-6lpf9" Mar 20 08:54:59 crc kubenswrapper[4903]: I0320 08:54:59.873798 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6lpf9" Mar 20 08:55:00 crc kubenswrapper[4903]: I0320 08:55:00.387577 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6lpf9"] Mar 20 08:55:01 crc kubenswrapper[4903]: I0320 08:55:01.042347 4903 generic.go:334] "Generic (PLEG): container finished" podID="e75c6b5b-3812-4cd2-b473-b8e8bdf823f1" containerID="d322164c74db340016a912b0be864e23746c8d53d1c845e9e764c39345d74ff3" exitCode=0 Mar 20 08:55:01 crc kubenswrapper[4903]: I0320 08:55:01.042523 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6lpf9" event={"ID":"e75c6b5b-3812-4cd2-b473-b8e8bdf823f1","Type":"ContainerDied","Data":"d322164c74db340016a912b0be864e23746c8d53d1c845e9e764c39345d74ff3"} Mar 20 08:55:01 crc kubenswrapper[4903]: I0320 08:55:01.042557 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6lpf9" event={"ID":"e75c6b5b-3812-4cd2-b473-b8e8bdf823f1","Type":"ContainerStarted","Data":"ada44a4e513ecbd0ea2e92c9d072e997c2ed0613f76f7fb0c5c29e064cb6dcf3"} Mar 20 08:55:01 crc kubenswrapper[4903]: I0320 08:55:01.656841 4903 scope.go:117] "RemoveContainer" containerID="59e15fc31b0ce96747c05d477541c7c6a3fe03d93487ef381bf0b660eb845438" Mar 20 08:55:01 crc kubenswrapper[4903]: I0320 08:55:01.697495 4903 scope.go:117] "RemoveContainer" containerID="b3af3565bac51cdf7cfe196ddf488821fbf4fd69b2581ca4ee2090864d020c97" Mar 20 08:55:02 crc kubenswrapper[4903]: I0320 08:55:02.489687 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-lskkt"] Mar 20 08:55:02 crc kubenswrapper[4903]: I0320 08:55:02.491763 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-lskkt" Mar 20 08:55:02 crc kubenswrapper[4903]: I0320 08:55:02.496303 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-lskkt"] Mar 20 08:55:02 crc kubenswrapper[4903]: I0320 08:55:02.637254 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9c4f0588-a076-4e2a-bf13-3b4fc184c12c-utilities\") pod \"redhat-operators-lskkt\" (UID: \"9c4f0588-a076-4e2a-bf13-3b4fc184c12c\") " pod="openshift-marketplace/redhat-operators-lskkt" Mar 20 08:55:02 crc kubenswrapper[4903]: I0320 08:55:02.637361 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9c4f0588-a076-4e2a-bf13-3b4fc184c12c-catalog-content\") pod \"redhat-operators-lskkt\" (UID: \"9c4f0588-a076-4e2a-bf13-3b4fc184c12c\") " pod="openshift-marketplace/redhat-operators-lskkt" Mar 20 08:55:02 crc kubenswrapper[4903]: I0320 08:55:02.637405 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ws2xp\" (UniqueName: \"kubernetes.io/projected/9c4f0588-a076-4e2a-bf13-3b4fc184c12c-kube-api-access-ws2xp\") pod \"redhat-operators-lskkt\" (UID: \"9c4f0588-a076-4e2a-bf13-3b4fc184c12c\") " pod="openshift-marketplace/redhat-operators-lskkt" Mar 20 08:55:02 crc kubenswrapper[4903]: I0320 08:55:02.738671 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9c4f0588-a076-4e2a-bf13-3b4fc184c12c-catalog-content\") pod \"redhat-operators-lskkt\" (UID: \"9c4f0588-a076-4e2a-bf13-3b4fc184c12c\") " pod="openshift-marketplace/redhat-operators-lskkt" Mar 20 08:55:02 crc kubenswrapper[4903]: I0320 08:55:02.738744 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ws2xp\" (UniqueName: \"kubernetes.io/projected/9c4f0588-a076-4e2a-bf13-3b4fc184c12c-kube-api-access-ws2xp\") pod \"redhat-operators-lskkt\" (UID: \"9c4f0588-a076-4e2a-bf13-3b4fc184c12c\") " pod="openshift-marketplace/redhat-operators-lskkt" Mar 20 08:55:02 crc kubenswrapper[4903]: I0320 08:55:02.738830 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9c4f0588-a076-4e2a-bf13-3b4fc184c12c-utilities\") pod \"redhat-operators-lskkt\" (UID: \"9c4f0588-a076-4e2a-bf13-3b4fc184c12c\") " pod="openshift-marketplace/redhat-operators-lskkt" Mar 20 08:55:02 crc kubenswrapper[4903]: I0320 08:55:02.739224 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9c4f0588-a076-4e2a-bf13-3b4fc184c12c-catalog-content\") pod \"redhat-operators-lskkt\" (UID: \"9c4f0588-a076-4e2a-bf13-3b4fc184c12c\") " pod="openshift-marketplace/redhat-operators-lskkt" Mar 20 08:55:02 crc kubenswrapper[4903]: I0320 08:55:02.740424 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9c4f0588-a076-4e2a-bf13-3b4fc184c12c-utilities\") pod \"redhat-operators-lskkt\" (UID: \"9c4f0588-a076-4e2a-bf13-3b4fc184c12c\") " pod="openshift-marketplace/redhat-operators-lskkt" Mar 20 08:55:02 crc kubenswrapper[4903]: I0320 08:55:02.773671 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-ws2xp\" (UniqueName: \"kubernetes.io/projected/9c4f0588-a076-4e2a-bf13-3b4fc184c12c-kube-api-access-ws2xp\") pod \"redhat-operators-lskkt\" (UID: \"9c4f0588-a076-4e2a-bf13-3b4fc184c12c\") " pod="openshift-marketplace/redhat-operators-lskkt" Mar 20 08:55:02 crc kubenswrapper[4903]: I0320 08:55:02.852389 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lskkt" Mar 20 08:55:03 crc kubenswrapper[4903]: I0320 08:55:03.107224 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-lskkt"] Mar 20 08:55:04 crc kubenswrapper[4903]: I0320 08:55:04.079002 4903 generic.go:334] "Generic (PLEG): container finished" podID="9c4f0588-a076-4e2a-bf13-3b4fc184c12c" containerID="e6920ebf0e08d3faad2564673f4b6b160e4b37bcc37d3efffd61b1b085c4c4d9" exitCode=0 Mar 20 08:55:04 crc kubenswrapper[4903]: I0320 08:55:04.079083 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lskkt" event={"ID":"9c4f0588-a076-4e2a-bf13-3b4fc184c12c","Type":"ContainerDied","Data":"e6920ebf0e08d3faad2564673f4b6b160e4b37bcc37d3efffd61b1b085c4c4d9"} Mar 20 08:55:04 crc kubenswrapper[4903]: I0320 08:55:04.079114 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lskkt" event={"ID":"9c4f0588-a076-4e2a-bf13-3b4fc184c12c","Type":"ContainerStarted","Data":"f841b6814a7da0085186dff194c1e5eae56cb520a380dee7a22a60283276e9e7"} Mar 20 08:55:07 crc kubenswrapper[4903]: I0320 08:55:07.105931 4903 generic.go:334] "Generic (PLEG): container finished" podID="e75c6b5b-3812-4cd2-b473-b8e8bdf823f1" containerID="624b2544fbe250ca17cb3742686d81dee79bd084838b100cac2a816a83f41cd1" exitCode=0 Mar 20 08:55:07 crc kubenswrapper[4903]: I0320 08:55:07.106082 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6lpf9" event={"ID":"e75c6b5b-3812-4cd2-b473-b8e8bdf823f1","Type":"ContainerDied","Data":"624b2544fbe250ca17cb3742686d81dee79bd084838b100cac2a816a83f41cd1"} Mar 20 08:55:07 crc kubenswrapper[4903]: I0320 08:55:07.109572 4903 generic.go:334] "Generic (PLEG): container finished" podID="9c4f0588-a076-4e2a-bf13-3b4fc184c12c" containerID="cf66780dcc5ee1298a102d5c5cdefe00fc506c20840f9a488de6260f3666d842" exitCode=0 Mar 20 08:55:07 crc kubenswrapper[4903]: I0320 08:55:07.109615 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lskkt" event={"ID":"9c4f0588-a076-4e2a-bf13-3b4fc184c12c","Type":"ContainerDied","Data":"cf66780dcc5ee1298a102d5c5cdefe00fc506c20840f9a488de6260f3666d842"} Mar 20 08:55:08 crc kubenswrapper[4903]: I0320 08:55:08.122132 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6lpf9" event={"ID":"e75c6b5b-3812-4cd2-b473-b8e8bdf823f1","Type":"ContainerStarted","Data":"04e63b6356a6c6ebf2ec1ceeb8cc2b9291031906b2cb7741e9d90793e1c12372"} Mar 20 08:55:08 crc kubenswrapper[4903]: I0320 08:55:08.124770 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lskkt" event={"ID":"9c4f0588-a076-4e2a-bf13-3b4fc184c12c","Type":"ContainerStarted","Data":"bf908a715de8b317194e9ad9bc54cdc4460e5a08c781e0f9bf5620070101487c"} Mar 20 08:55:08 crc kubenswrapper[4903]: I0320 08:55:08.152994 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-6lpf9" podStartSLOduration=2.664931099 
podStartE2EDuration="9.152975796s" podCreationTimestamp="2026-03-20 08:54:59 +0000 UTC" firstStartedPulling="2026-03-20 08:55:01.045553749 +0000 UTC m=+1926.262454074" lastFinishedPulling="2026-03-20 08:55:07.533598446 +0000 UTC m=+1932.750498771" observedRunningTime="2026-03-20 08:55:08.149678928 +0000 UTC m=+1933.366579243" watchObservedRunningTime="2026-03-20 08:55:08.152975796 +0000 UTC m=+1933.369876111" Mar 20 08:55:08 crc kubenswrapper[4903]: I0320 08:55:08.182555 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-lskkt" podStartSLOduration=2.691437516 podStartE2EDuration="6.182539111s" podCreationTimestamp="2026-03-20 08:55:02 +0000 UTC" firstStartedPulling="2026-03-20 08:55:04.080806164 +0000 UTC m=+1929.297706479" lastFinishedPulling="2026-03-20 08:55:07.571907759 +0000 UTC m=+1932.788808074" observedRunningTime="2026-03-20 08:55:08.180853201 +0000 UTC m=+1933.397753516" watchObservedRunningTime="2026-03-20 08:55:08.182539111 +0000 UTC m=+1933.399439426" Mar 20 08:55:09 crc kubenswrapper[4903]: I0320 08:55:09.491411 4903 scope.go:117] "RemoveContainer" containerID="574e7fb2455d2017063433881e6f792136a3cc4bdc3d53ada47e32dd67839ac9" Mar 20 08:55:09 crc kubenswrapper[4903]: E0320 08:55:09.491637 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2ndsj_openshift-machine-config-operator(0e67af70-4211-4077-8f2b-0a00b8069e5a)\"" pod="openshift-machine-config-operator/machine-config-daemon-2ndsj" podUID="0e67af70-4211-4077-8f2b-0a00b8069e5a" Mar 20 08:55:09 crc kubenswrapper[4903]: I0320 08:55:09.874112 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-6lpf9" Mar 20 08:55:09 crc kubenswrapper[4903]: I0320 08:55:09.874204 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-6lpf9" Mar 20 08:55:10 crc kubenswrapper[4903]: I0320 08:55:10.922167 4903 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-6lpf9" podUID="e75c6b5b-3812-4cd2-b473-b8e8bdf823f1" containerName="registry-server" probeResult="failure" output=< Mar 20 08:55:10 crc kubenswrapper[4903]: timeout: failed to connect service ":50051" within 1s Mar 20 08:55:10 crc kubenswrapper[4903]: > Mar 20 08:55:12 crc kubenswrapper[4903]: I0320 08:55:12.854163 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-lskkt" Mar 20 08:55:12 crc kubenswrapper[4903]: I0320 08:55:12.854232 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-lskkt" Mar 20 08:55:13 crc kubenswrapper[4903]: I0320 08:55:13.909440 4903 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-lskkt" podUID="9c4f0588-a076-4e2a-bf13-3b4fc184c12c" containerName="registry-server" probeResult="failure" output=< Mar 20 08:55:13 crc kubenswrapper[4903]: timeout: failed to connect service ":50051" within 1s Mar 20 08:55:13 crc kubenswrapper[4903]: > Mar 20 08:55:19 crc kubenswrapper[4903]: I0320 08:55:19.929138 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-6lpf9" Mar 20 08:55:19 crc kubenswrapper[4903]: I0320 
08:55:19.995798 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-6lpf9" Mar 20 08:55:20 crc kubenswrapper[4903]: I0320 08:55:20.075192 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6lpf9"] Mar 20 08:55:20 crc kubenswrapper[4903]: I0320 08:55:20.170355 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-k8gfb"] Mar 20 08:55:20 crc kubenswrapper[4903]: I0320 08:55:20.170654 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-k8gfb" podUID="8a16a940-f2b4-470a-a563-4110a9756e4d" containerName="registry-server" containerID="cri-o://5e6ea669a181b5c974ce7abdb1fe23b48de5e5dd49b113eb4051f41b105d1a9f" gracePeriod=2 Mar 20 08:55:20 crc kubenswrapper[4903]: I0320 08:55:20.327390 4903 generic.go:334] "Generic (PLEG): container finished" podID="8a16a940-f2b4-470a-a563-4110a9756e4d" containerID="5e6ea669a181b5c974ce7abdb1fe23b48de5e5dd49b113eb4051f41b105d1a9f" exitCode=0 Mar 20 08:55:20 crc kubenswrapper[4903]: I0320 08:55:20.327483 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k8gfb" event={"ID":"8a16a940-f2b4-470a-a563-4110a9756e4d","Type":"ContainerDied","Data":"5e6ea669a181b5c974ce7abdb1fe23b48de5e5dd49b113eb4051f41b105d1a9f"} Mar 20 08:55:20 crc kubenswrapper[4903]: I0320 08:55:20.572605 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-k8gfb" Mar 20 08:55:20 crc kubenswrapper[4903]: I0320 08:55:20.652233 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8a16a940-f2b4-470a-a563-4110a9756e4d-utilities\") pod \"8a16a940-f2b4-470a-a563-4110a9756e4d\" (UID: \"8a16a940-f2b4-470a-a563-4110a9756e4d\") " Mar 20 08:55:20 crc kubenswrapper[4903]: I0320 08:55:20.652295 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7rjv2\" (UniqueName: \"kubernetes.io/projected/8a16a940-f2b4-470a-a563-4110a9756e4d-kube-api-access-7rjv2\") pod \"8a16a940-f2b4-470a-a563-4110a9756e4d\" (UID: \"8a16a940-f2b4-470a-a563-4110a9756e4d\") " Mar 20 08:55:20 crc kubenswrapper[4903]: I0320 08:55:20.652375 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8a16a940-f2b4-470a-a563-4110a9756e4d-catalog-content\") pod \"8a16a940-f2b4-470a-a563-4110a9756e4d\" (UID: \"8a16a940-f2b4-470a-a563-4110a9756e4d\") " Mar 20 08:55:20 crc kubenswrapper[4903]: I0320 08:55:20.652711 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8a16a940-f2b4-470a-a563-4110a9756e4d-utilities" (OuterVolumeSpecName: "utilities") pod "8a16a940-f2b4-470a-a563-4110a9756e4d" (UID: "8a16a940-f2b4-470a-a563-4110a9756e4d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 08:55:20 crc kubenswrapper[4903]: I0320 08:55:20.652791 4903 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8a16a940-f2b4-470a-a563-4110a9756e4d-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 08:55:20 crc kubenswrapper[4903]: I0320 08:55:20.676492 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a16a940-f2b4-470a-a563-4110a9756e4d-kube-api-access-7rjv2" (OuterVolumeSpecName: "kube-api-access-7rjv2") pod "8a16a940-f2b4-470a-a563-4110a9756e4d" (UID: "8a16a940-f2b4-470a-a563-4110a9756e4d"). InnerVolumeSpecName "kube-api-access-7rjv2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:55:20 crc kubenswrapper[4903]: I0320 08:55:20.709155 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8a16a940-f2b4-470a-a563-4110a9756e4d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8a16a940-f2b4-470a-a563-4110a9756e4d" (UID: "8a16a940-f2b4-470a-a563-4110a9756e4d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 08:55:20 crc kubenswrapper[4903]: I0320 08:55:20.754107 4903 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8a16a940-f2b4-470a-a563-4110a9756e4d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 08:55:20 crc kubenswrapper[4903]: I0320 08:55:20.754341 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7rjv2\" (UniqueName: \"kubernetes.io/projected/8a16a940-f2b4-470a-a563-4110a9756e4d-kube-api-access-7rjv2\") on node \"crc\" DevicePath \"\"" Mar 20 08:55:21 crc kubenswrapper[4903]: I0320 08:55:21.337255 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k8gfb" event={"ID":"8a16a940-f2b4-470a-a563-4110a9756e4d","Type":"ContainerDied","Data":"0829e850c8174c55c14ac010c86c7e7a26d6f46da96daf7025a5e0987e82e47e"} Mar 20 08:55:21 crc kubenswrapper[4903]: I0320 08:55:21.337350 4903 scope.go:117] "RemoveContainer" containerID="5e6ea669a181b5c974ce7abdb1fe23b48de5e5dd49b113eb4051f41b105d1a9f" Mar 20 08:55:21 crc kubenswrapper[4903]: I0320 08:55:21.337283 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-k8gfb" Mar 20 08:55:21 crc kubenswrapper[4903]: I0320 08:55:21.363656 4903 scope.go:117] "RemoveContainer" containerID="ebe4a63d0ecb63b021cdcab2a2c438fae014d9380b08aa0881190f417cd07e08" Mar 20 08:55:21 crc kubenswrapper[4903]: I0320 08:55:21.369892 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-k8gfb"] Mar 20 08:55:21 crc kubenswrapper[4903]: I0320 08:55:21.373825 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-k8gfb"] Mar 20 08:55:21 crc kubenswrapper[4903]: I0320 08:55:21.389008 4903 scope.go:117] "RemoveContainer" containerID="abbb4b0355fb3a5968a5661a3d82940c2c7b0cecbce5228a496a57c99bd122d1" Mar 20 08:55:21 crc kubenswrapper[4903]: I0320 08:55:21.498513 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a16a940-f2b4-470a-a563-4110a9756e4d" path="/var/lib/kubelet/pods/8a16a940-f2b4-470a-a563-4110a9756e4d/volumes" Mar 20 08:55:22 crc kubenswrapper[4903]: I0320 08:55:22.491999 4903 scope.go:117] "RemoveContainer" containerID="574e7fb2455d2017063433881e6f792136a3cc4bdc3d53ada47e32dd67839ac9" Mar 20 08:55:22 crc kubenswrapper[4903]: E0320 08:55:22.492599 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2ndsj_openshift-machine-config-operator(0e67af70-4211-4077-8f2b-0a00b8069e5a)\"" pod="openshift-machine-config-operator/machine-config-daemon-2ndsj" podUID="0e67af70-4211-4077-8f2b-0a00b8069e5a" Mar 20 08:55:22 crc kubenswrapper[4903]: I0320 08:55:22.894943 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-lskkt" Mar 20 08:55:22 crc kubenswrapper[4903]: I0320 08:55:22.941719 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-lskkt" Mar 20 08:55:25 crc kubenswrapper[4903]: I0320 08:55:25.172463 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-lskkt"] Mar 20 08:55:25 crc kubenswrapper[4903]: I0320 08:55:25.174481 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-lskkt" podUID="9c4f0588-a076-4e2a-bf13-3b4fc184c12c" containerName="registry-server" containerID="cri-o://bf908a715de8b317194e9ad9bc54cdc4460e5a08c781e0f9bf5620070101487c" gracePeriod=2 Mar 20 08:55:25 crc kubenswrapper[4903]: I0320 08:55:25.376935 4903 generic.go:334] "Generic (PLEG): container finished" podID="9c4f0588-a076-4e2a-bf13-3b4fc184c12c" containerID="bf908a715de8b317194e9ad9bc54cdc4460e5a08c781e0f9bf5620070101487c" exitCode=0 Mar 20 08:55:25 crc kubenswrapper[4903]: I0320 08:55:25.376979 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lskkt" event={"ID":"9c4f0588-a076-4e2a-bf13-3b4fc184c12c","Type":"ContainerDied","Data":"bf908a715de8b317194e9ad9bc54cdc4460e5a08c781e0f9bf5620070101487c"} Mar 20 08:55:26 crc kubenswrapper[4903]: I0320 08:55:26.122379 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-lskkt" Mar 20 08:55:26 crc kubenswrapper[4903]: I0320 08:55:26.240809 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9c4f0588-a076-4e2a-bf13-3b4fc184c12c-utilities\") pod \"9c4f0588-a076-4e2a-bf13-3b4fc184c12c\" (UID: \"9c4f0588-a076-4e2a-bf13-3b4fc184c12c\") " Mar 20 08:55:26 crc kubenswrapper[4903]: I0320 08:55:26.241327 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ws2xp\" (UniqueName: \"kubernetes.io/projected/9c4f0588-a076-4e2a-bf13-3b4fc184c12c-kube-api-access-ws2xp\") pod \"9c4f0588-a076-4e2a-bf13-3b4fc184c12c\" (UID: \"9c4f0588-a076-4e2a-bf13-3b4fc184c12c\") " Mar 20 08:55:26 crc kubenswrapper[4903]: I0320 08:55:26.241580 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9c4f0588-a076-4e2a-bf13-3b4fc184c12c-catalog-content\") pod \"9c4f0588-a076-4e2a-bf13-3b4fc184c12c\" (UID: \"9c4f0588-a076-4e2a-bf13-3b4fc184c12c\") " Mar 20 08:55:26 crc kubenswrapper[4903]: I0320 08:55:26.246026 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9c4f0588-a076-4e2a-bf13-3b4fc184c12c-utilities" (OuterVolumeSpecName: "utilities") pod "9c4f0588-a076-4e2a-bf13-3b4fc184c12c" (UID: "9c4f0588-a076-4e2a-bf13-3b4fc184c12c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 08:55:26 crc kubenswrapper[4903]: I0320 08:55:26.261206 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c4f0588-a076-4e2a-bf13-3b4fc184c12c-kube-api-access-ws2xp" (OuterVolumeSpecName: "kube-api-access-ws2xp") pod "9c4f0588-a076-4e2a-bf13-3b4fc184c12c" (UID: "9c4f0588-a076-4e2a-bf13-3b4fc184c12c"). InnerVolumeSpecName "kube-api-access-ws2xp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:55:26 crc kubenswrapper[4903]: I0320 08:55:26.344245 4903 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9c4f0588-a076-4e2a-bf13-3b4fc184c12c-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 08:55:26 crc kubenswrapper[4903]: I0320 08:55:26.344290 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ws2xp\" (UniqueName: \"kubernetes.io/projected/9c4f0588-a076-4e2a-bf13-3b4fc184c12c-kube-api-access-ws2xp\") on node \"crc\" DevicePath \"\"" Mar 20 08:55:26 crc kubenswrapper[4903]: I0320 08:55:26.386696 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lskkt" event={"ID":"9c4f0588-a076-4e2a-bf13-3b4fc184c12c","Type":"ContainerDied","Data":"f841b6814a7da0085186dff194c1e5eae56cb520a380dee7a22a60283276e9e7"} Mar 20 08:55:26 crc kubenswrapper[4903]: I0320 08:55:26.386775 4903 scope.go:117] "RemoveContainer" containerID="bf908a715de8b317194e9ad9bc54cdc4460e5a08c781e0f9bf5620070101487c" Mar 20 08:55:26 crc kubenswrapper[4903]: I0320 08:55:26.386914 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-lskkt" Mar 20 08:55:26 crc kubenswrapper[4903]: I0320 08:55:26.396390 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9c4f0588-a076-4e2a-bf13-3b4fc184c12c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9c4f0588-a076-4e2a-bf13-3b4fc184c12c" (UID: "9c4f0588-a076-4e2a-bf13-3b4fc184c12c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 08:55:26 crc kubenswrapper[4903]: I0320 08:55:26.410528 4903 scope.go:117] "RemoveContainer" containerID="cf66780dcc5ee1298a102d5c5cdefe00fc506c20840f9a488de6260f3666d842" Mar 20 08:55:26 crc kubenswrapper[4903]: I0320 08:55:26.435817 4903 scope.go:117] "RemoveContainer" containerID="e6920ebf0e08d3faad2564673f4b6b160e4b37bcc37d3efffd61b1b085c4c4d9" Mar 20 08:55:26 crc kubenswrapper[4903]: I0320 08:55:26.446059 4903 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9c4f0588-a076-4e2a-bf13-3b4fc184c12c-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 08:55:26 crc kubenswrapper[4903]: I0320 08:55:26.718435 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-lskkt"] Mar 20 08:55:26 crc kubenswrapper[4903]: I0320 08:55:26.730219 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-lskkt"] Mar 20 08:55:27 crc kubenswrapper[4903]: I0320 08:55:27.505270 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9c4f0588-a076-4e2a-bf13-3b4fc184c12c" path="/var/lib/kubelet/pods/9c4f0588-a076-4e2a-bf13-3b4fc184c12c/volumes" Mar 20 08:55:36 crc kubenswrapper[4903]: I0320 08:55:36.490775 4903 scope.go:117] "RemoveContainer" containerID="574e7fb2455d2017063433881e6f792136a3cc4bdc3d53ada47e32dd67839ac9" Mar 20 08:55:36 crc kubenswrapper[4903]: E0320 08:55:36.491634 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2ndsj_openshift-machine-config-operator(0e67af70-4211-4077-8f2b-0a00b8069e5a)\"" pod="openshift-machine-config-operator/machine-config-daemon-2ndsj" podUID="0e67af70-4211-4077-8f2b-0a00b8069e5a" Mar 20 08:55:49 crc kubenswrapper[4903]: I0320 08:55:49.490879 4903 scope.go:117] "RemoveContainer" containerID="574e7fb2455d2017063433881e6f792136a3cc4bdc3d53ada47e32dd67839ac9" Mar 20 08:55:49 crc kubenswrapper[4903]: E0320 08:55:49.492780 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2ndsj_openshift-machine-config-operator(0e67af70-4211-4077-8f2b-0a00b8069e5a)\"" pod="openshift-machine-config-operator/machine-config-daemon-2ndsj" podUID="0e67af70-4211-4077-8f2b-0a00b8069e5a" Mar 20 08:56:00 crc kubenswrapper[4903]: I0320 08:56:00.153821 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566616-9b65m"] Mar 20 08:56:00 crc kubenswrapper[4903]: E0320 08:56:00.154929 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c4f0588-a076-4e2a-bf13-3b4fc184c12c" containerName="extract-utilities" Mar 20 08:56:00 crc kubenswrapper[4903]: I0320 08:56:00.154953 4903 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="9c4f0588-a076-4e2a-bf13-3b4fc184c12c" containerName="extract-utilities" Mar 20 08:56:00 crc kubenswrapper[4903]: E0320 08:56:00.154969 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c4f0588-a076-4e2a-bf13-3b4fc184c12c" containerName="extract-content" Mar 20 08:56:00 crc kubenswrapper[4903]: I0320 08:56:00.154981 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c4f0588-a076-4e2a-bf13-3b4fc184c12c" containerName="extract-content" Mar 20 08:56:00 crc kubenswrapper[4903]: E0320 08:56:00.155009 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c4f0588-a076-4e2a-bf13-3b4fc184c12c" containerName="registry-server" Mar 20 08:56:00 crc kubenswrapper[4903]: I0320 08:56:00.155020 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c4f0588-a076-4e2a-bf13-3b4fc184c12c" containerName="registry-server" Mar 20 08:56:00 crc kubenswrapper[4903]: E0320 08:56:00.155072 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a16a940-f2b4-470a-a563-4110a9756e4d" containerName="extract-content" Mar 20 08:56:00 crc kubenswrapper[4903]: I0320 08:56:00.155084 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a16a940-f2b4-470a-a563-4110a9756e4d" containerName="extract-content" Mar 20 08:56:00 crc kubenswrapper[4903]: E0320 08:56:00.155113 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a16a940-f2b4-470a-a563-4110a9756e4d" containerName="extract-utilities" Mar 20 08:56:00 crc kubenswrapper[4903]: I0320 08:56:00.155127 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a16a940-f2b4-470a-a563-4110a9756e4d" containerName="extract-utilities" Mar 20 08:56:00 crc kubenswrapper[4903]: E0320 08:56:00.155156 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a16a940-f2b4-470a-a563-4110a9756e4d" containerName="registry-server" Mar 20 08:56:00 crc kubenswrapper[4903]: I0320 08:56:00.155170 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a16a940-f2b4-470a-a563-4110a9756e4d" containerName="registry-server" Mar 20 08:56:00 crc kubenswrapper[4903]: I0320 08:56:00.155398 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a16a940-f2b4-470a-a563-4110a9756e4d" containerName="registry-server" Mar 20 08:56:00 crc kubenswrapper[4903]: I0320 08:56:00.155419 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c4f0588-a076-4e2a-bf13-3b4fc184c12c" containerName="registry-server" Mar 20 08:56:00 crc kubenswrapper[4903]: I0320 08:56:00.156321 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566616-9b65m" Mar 20 08:56:00 crc kubenswrapper[4903]: I0320 08:56:00.159081 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 08:56:00 crc kubenswrapper[4903]: I0320 08:56:00.159021 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 08:56:00 crc kubenswrapper[4903]: I0320 08:56:00.160994 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566616-9b65m"] Mar 20 08:56:00 crc kubenswrapper[4903]: I0320 08:56:00.161460 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-smc2q" Mar 20 08:56:00 crc kubenswrapper[4903]: I0320 08:56:00.303668 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b4btr\" (UniqueName: \"kubernetes.io/projected/83d46482-f640-42c9-b3a5-19836260cfd7-kube-api-access-b4btr\") pod \"auto-csr-approver-29566616-9b65m\" (UID: \"83d46482-f640-42c9-b3a5-19836260cfd7\") " pod="openshift-infra/auto-csr-approver-29566616-9b65m" Mar 20 08:56:00 crc kubenswrapper[4903]: I0320 08:56:00.405223 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b4btr\" (UniqueName: \"kubernetes.io/projected/83d46482-f640-42c9-b3a5-19836260cfd7-kube-api-access-b4btr\") pod \"auto-csr-approver-29566616-9b65m\" (UID: \"83d46482-f640-42c9-b3a5-19836260cfd7\") " pod="openshift-infra/auto-csr-approver-29566616-9b65m" Mar 20 08:56:00 crc kubenswrapper[4903]: I0320 08:56:00.433509 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b4btr\" (UniqueName: \"kubernetes.io/projected/83d46482-f640-42c9-b3a5-19836260cfd7-kube-api-access-b4btr\") pod \"auto-csr-approver-29566616-9b65m\" (UID: \"83d46482-f640-42c9-b3a5-19836260cfd7\") " pod="openshift-infra/auto-csr-approver-29566616-9b65m" Mar 20 08:56:00 crc kubenswrapper[4903]: I0320 08:56:00.490138 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566616-9b65m" Mar 20 08:56:00 crc kubenswrapper[4903]: I0320 08:56:00.490314 4903 scope.go:117] "RemoveContainer" containerID="574e7fb2455d2017063433881e6f792136a3cc4bdc3d53ada47e32dd67839ac9" Mar 20 08:56:00 crc kubenswrapper[4903]: E0320 08:56:00.490569 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2ndsj_openshift-machine-config-operator(0e67af70-4211-4077-8f2b-0a00b8069e5a)\"" pod="openshift-machine-config-operator/machine-config-daemon-2ndsj" podUID="0e67af70-4211-4077-8f2b-0a00b8069e5a" Mar 20 08:56:00 crc kubenswrapper[4903]: I0320 08:56:00.932283 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566616-9b65m"] Mar 20 08:56:01 crc kubenswrapper[4903]: I0320 08:56:01.723738 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566616-9b65m" event={"ID":"83d46482-f640-42c9-b3a5-19836260cfd7","Type":"ContainerStarted","Data":"f02f75f015b11ae5873b8a8f88601a39207eedb132ae824bf45acf70e826c52f"} Mar 20 08:56:02 crc kubenswrapper[4903]: I0320 08:56:02.732768 4903 generic.go:334] "Generic (PLEG): container finished" podID="83d46482-f640-42c9-b3a5-19836260cfd7" containerID="a876f92a06bbb4fd38107456e393a57f955a8e14b570d6905a4f4fd7d947e72c" exitCode=0 Mar 20 08:56:02 crc kubenswrapper[4903]: I0320 08:56:02.732876 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566616-9b65m" event={"ID":"83d46482-f640-42c9-b3a5-19836260cfd7","Type":"ContainerDied","Data":"a876f92a06bbb4fd38107456e393a57f955a8e14b570d6905a4f4fd7d947e72c"} Mar 20 08:56:04 crc kubenswrapper[4903]: I0320 08:56:04.128993 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566616-9b65m" Mar 20 08:56:04 crc kubenswrapper[4903]: I0320 08:56:04.274244 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b4btr\" (UniqueName: \"kubernetes.io/projected/83d46482-f640-42c9-b3a5-19836260cfd7-kube-api-access-b4btr\") pod \"83d46482-f640-42c9-b3a5-19836260cfd7\" (UID: \"83d46482-f640-42c9-b3a5-19836260cfd7\") " Mar 20 08:56:04 crc kubenswrapper[4903]: I0320 08:56:04.283252 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/83d46482-f640-42c9-b3a5-19836260cfd7-kube-api-access-b4btr" (OuterVolumeSpecName: "kube-api-access-b4btr") pod "83d46482-f640-42c9-b3a5-19836260cfd7" (UID: "83d46482-f640-42c9-b3a5-19836260cfd7"). InnerVolumeSpecName "kube-api-access-b4btr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:56:04 crc kubenswrapper[4903]: I0320 08:56:04.376618 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b4btr\" (UniqueName: \"kubernetes.io/projected/83d46482-f640-42c9-b3a5-19836260cfd7-kube-api-access-b4btr\") on node \"crc\" DevicePath \"\"" Mar 20 08:56:04 crc kubenswrapper[4903]: I0320 08:56:04.754908 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566616-9b65m" event={"ID":"83d46482-f640-42c9-b3a5-19836260cfd7","Type":"ContainerDied","Data":"f02f75f015b11ae5873b8a8f88601a39207eedb132ae824bf45acf70e826c52f"} Mar 20 08:56:04 crc kubenswrapper[4903]: I0320 08:56:04.754969 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566616-9b65m" Mar 20 08:56:04 crc kubenswrapper[4903]: I0320 08:56:04.754962 4903 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f02f75f015b11ae5873b8a8f88601a39207eedb132ae824bf45acf70e826c52f" Mar 20 08:56:05 crc kubenswrapper[4903]: I0320 08:56:05.214803 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566610-lhr5n"] Mar 20 08:56:05 crc kubenswrapper[4903]: I0320 08:56:05.226913 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566610-lhr5n"] Mar 20 08:56:05 crc kubenswrapper[4903]: I0320 08:56:05.506762 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="46d6c81a-c1cb-48a7-95a1-a957c6a0fbea" path="/var/lib/kubelet/pods/46d6c81a-c1cb-48a7-95a1-a957c6a0fbea/volumes" Mar 20 08:56:12 crc kubenswrapper[4903]: I0320 08:56:12.491382 4903 scope.go:117] "RemoveContainer" containerID="574e7fb2455d2017063433881e6f792136a3cc4bdc3d53ada47e32dd67839ac9" Mar 20 08:56:12 crc kubenswrapper[4903]: E0320 08:56:12.492240 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2ndsj_openshift-machine-config-operator(0e67af70-4211-4077-8f2b-0a00b8069e5a)\"" pod="openshift-machine-config-operator/machine-config-daemon-2ndsj" podUID="0e67af70-4211-4077-8f2b-0a00b8069e5a" Mar 20 08:56:23 crc kubenswrapper[4903]: I0320 08:56:23.100188 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-mq9v5"] Mar 20 08:56:23 crc kubenswrapper[4903]: E0320 08:56:23.103517 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83d46482-f640-42c9-b3a5-19836260cfd7" containerName="oc" Mar 20 08:56:23 crc kubenswrapper[4903]: I0320 08:56:23.103794 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="83d46482-f640-42c9-b3a5-19836260cfd7" containerName="oc" Mar 20 08:56:23 crc kubenswrapper[4903]: I0320 08:56:23.104159 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="83d46482-f640-42c9-b3a5-19836260cfd7" containerName="oc" Mar 20 08:56:23 crc kubenswrapper[4903]: I0320 08:56:23.105571 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-mq9v5" Mar 20 08:56:23 crc kubenswrapper[4903]: I0320 08:56:23.119174 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mq9v5"] Mar 20 08:56:23 crc kubenswrapper[4903]: I0320 08:56:23.296591 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f39f9ae5-61e2-497a-9df0-341a0c2c9f67-catalog-content\") pod \"certified-operators-mq9v5\" (UID: \"f39f9ae5-61e2-497a-9df0-341a0c2c9f67\") " pod="openshift-marketplace/certified-operators-mq9v5" Mar 20 08:56:23 crc kubenswrapper[4903]: I0320 08:56:23.296664 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w2wtg\" (UniqueName: \"kubernetes.io/projected/f39f9ae5-61e2-497a-9df0-341a0c2c9f67-kube-api-access-w2wtg\") pod \"certified-operators-mq9v5\" (UID: \"f39f9ae5-61e2-497a-9df0-341a0c2c9f67\") " pod="openshift-marketplace/certified-operators-mq9v5" Mar 20 08:56:23 crc kubenswrapper[4903]: I0320 08:56:23.296725 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f39f9ae5-61e2-497a-9df0-341a0c2c9f67-utilities\") pod \"certified-operators-mq9v5\" (UID: \"f39f9ae5-61e2-497a-9df0-341a0c2c9f67\") " pod="openshift-marketplace/certified-operators-mq9v5" Mar 20 08:56:23 crc kubenswrapper[4903]: I0320 08:56:23.398390 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f39f9ae5-61e2-497a-9df0-341a0c2c9f67-catalog-content\") pod \"certified-operators-mq9v5\" (UID: \"f39f9ae5-61e2-497a-9df0-341a0c2c9f67\") " pod="openshift-marketplace/certified-operators-mq9v5" Mar 20 08:56:23 crc kubenswrapper[4903]: I0320 08:56:23.398533 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w2wtg\" (UniqueName: \"kubernetes.io/projected/f39f9ae5-61e2-497a-9df0-341a0c2c9f67-kube-api-access-w2wtg\") pod \"certified-operators-mq9v5\" (UID: \"f39f9ae5-61e2-497a-9df0-341a0c2c9f67\") " pod="openshift-marketplace/certified-operators-mq9v5" Mar 20 08:56:23 crc kubenswrapper[4903]: I0320 08:56:23.398647 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f39f9ae5-61e2-497a-9df0-341a0c2c9f67-utilities\") pod \"certified-operators-mq9v5\" (UID: \"f39f9ae5-61e2-497a-9df0-341a0c2c9f67\") " pod="openshift-marketplace/certified-operators-mq9v5" Mar 20 08:56:23 crc kubenswrapper[4903]: I0320 08:56:23.399281 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f39f9ae5-61e2-497a-9df0-341a0c2c9f67-utilities\") pod \"certified-operators-mq9v5\" (UID: \"f39f9ae5-61e2-497a-9df0-341a0c2c9f67\") " pod="openshift-marketplace/certified-operators-mq9v5" Mar 20 08:56:23 crc kubenswrapper[4903]: I0320 08:56:23.399780 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f39f9ae5-61e2-497a-9df0-341a0c2c9f67-catalog-content\") pod \"certified-operators-mq9v5\" (UID: \"f39f9ae5-61e2-497a-9df0-341a0c2c9f67\") " pod="openshift-marketplace/certified-operators-mq9v5" Mar 20 08:56:23 crc kubenswrapper[4903]: I0320 08:56:23.423179 4903 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-w2wtg\" (UniqueName: \"kubernetes.io/projected/f39f9ae5-61e2-497a-9df0-341a0c2c9f67-kube-api-access-w2wtg\") pod \"certified-operators-mq9v5\" (UID: \"f39f9ae5-61e2-497a-9df0-341a0c2c9f67\") " pod="openshift-marketplace/certified-operators-mq9v5" Mar 20 08:56:23 crc kubenswrapper[4903]: I0320 08:56:23.437469 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mq9v5" Mar 20 08:56:23 crc kubenswrapper[4903]: I0320 08:56:23.495205 4903 scope.go:117] "RemoveContainer" containerID="574e7fb2455d2017063433881e6f792136a3cc4bdc3d53ada47e32dd67839ac9" Mar 20 08:56:23 crc kubenswrapper[4903]: E0320 08:56:23.495487 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2ndsj_openshift-machine-config-operator(0e67af70-4211-4077-8f2b-0a00b8069e5a)\"" pod="openshift-machine-config-operator/machine-config-daemon-2ndsj" podUID="0e67af70-4211-4077-8f2b-0a00b8069e5a" Mar 20 08:56:23 crc kubenswrapper[4903]: I0320 08:56:23.676727 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mq9v5"] Mar 20 08:56:23 crc kubenswrapper[4903]: I0320 08:56:23.933829 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mq9v5" event={"ID":"f39f9ae5-61e2-497a-9df0-341a0c2c9f67","Type":"ContainerStarted","Data":"5412f461bf3dbd667eeda5150ef416702d6fd942aa6a5e3a18b6f1d514cf2d93"} Mar 20 08:56:23 crc kubenswrapper[4903]: I0320 08:56:23.934159 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mq9v5" event={"ID":"f39f9ae5-61e2-497a-9df0-341a0c2c9f67","Type":"ContainerStarted","Data":"2fb63c98393fd8a93c153953239a94fe46d46647f2e6933d39df7247bb22f6a8"} Mar 20 08:56:24 crc kubenswrapper[4903]: I0320 08:56:24.949486 4903 generic.go:334] "Generic (PLEG): container finished" podID="f39f9ae5-61e2-497a-9df0-341a0c2c9f67" containerID="5412f461bf3dbd667eeda5150ef416702d6fd942aa6a5e3a18b6f1d514cf2d93" exitCode=0 Mar 20 08:56:24 crc kubenswrapper[4903]: I0320 08:56:24.949564 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mq9v5" event={"ID":"f39f9ae5-61e2-497a-9df0-341a0c2c9f67","Type":"ContainerDied","Data":"5412f461bf3dbd667eeda5150ef416702d6fd942aa6a5e3a18b6f1d514cf2d93"} Mar 20 08:56:26 crc kubenswrapper[4903]: I0320 08:56:26.971138 4903 generic.go:334] "Generic (PLEG): container finished" podID="f39f9ae5-61e2-497a-9df0-341a0c2c9f67" containerID="ead197b507f65e5fec6b00d62140edab291e4b497602c5b1f39be40f11a2991b" exitCode=0 Mar 20 08:56:26 crc kubenswrapper[4903]: I0320 08:56:26.971245 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mq9v5" event={"ID":"f39f9ae5-61e2-497a-9df0-341a0c2c9f67","Type":"ContainerDied","Data":"ead197b507f65e5fec6b00d62140edab291e4b497602c5b1f39be40f11a2991b"} Mar 20 08:56:27 crc kubenswrapper[4903]: I0320 08:56:27.981094 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mq9v5" event={"ID":"f39f9ae5-61e2-497a-9df0-341a0c2c9f67","Type":"ContainerStarted","Data":"a49fdea9cb3e9e59d0cbd93db7e74a7417f1a99fca056f6acff25f1391173b95"} Mar 20 08:56:28 crc kubenswrapper[4903]: I0320 08:56:28.008588 4903 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-mq9v5" podStartSLOduration=2.413135395 podStartE2EDuration="5.008567082s" podCreationTimestamp="2026-03-20 08:56:23 +0000 UTC" firstStartedPulling="2026-03-20 08:56:24.953408144 +0000 UTC m=+2010.170308489" lastFinishedPulling="2026-03-20 08:56:27.548839811 +0000 UTC m=+2012.765740176" observedRunningTime="2026-03-20 08:56:28.004490743 +0000 UTC m=+2013.221391078" watchObservedRunningTime="2026-03-20 08:56:28.008567082 +0000 UTC m=+2013.225467397" Mar 20 08:56:33 crc kubenswrapper[4903]: I0320 08:56:33.438101 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-mq9v5" Mar 20 08:56:33 crc kubenswrapper[4903]: I0320 08:56:33.438591 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-mq9v5" Mar 20 08:56:33 crc kubenswrapper[4903]: I0320 08:56:33.512862 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-mq9v5" Mar 20 08:56:34 crc kubenswrapper[4903]: I0320 08:56:34.125757 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-mq9v5" Mar 20 08:56:34 crc kubenswrapper[4903]: I0320 08:56:34.202228 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-mq9v5"] Mar 20 08:56:34 crc kubenswrapper[4903]: I0320 08:56:34.490856 4903 scope.go:117] "RemoveContainer" containerID="574e7fb2455d2017063433881e6f792136a3cc4bdc3d53ada47e32dd67839ac9" Mar 20 08:56:34 crc kubenswrapper[4903]: E0320 08:56:34.491498 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2ndsj_openshift-machine-config-operator(0e67af70-4211-4077-8f2b-0a00b8069e5a)\"" pod="openshift-machine-config-operator/machine-config-daemon-2ndsj" podUID="0e67af70-4211-4077-8f2b-0a00b8069e5a" Mar 20 08:56:36 crc kubenswrapper[4903]: I0320 08:56:36.069877 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-mq9v5" podUID="f39f9ae5-61e2-497a-9df0-341a0c2c9f67" containerName="registry-server" containerID="cri-o://a49fdea9cb3e9e59d0cbd93db7e74a7417f1a99fca056f6acff25f1391173b95" gracePeriod=2 Mar 20 08:56:36 crc kubenswrapper[4903]: I0320 08:56:36.614265 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-mq9v5" Mar 20 08:56:36 crc kubenswrapper[4903]: I0320 08:56:36.721597 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w2wtg\" (UniqueName: \"kubernetes.io/projected/f39f9ae5-61e2-497a-9df0-341a0c2c9f67-kube-api-access-w2wtg\") pod \"f39f9ae5-61e2-497a-9df0-341a0c2c9f67\" (UID: \"f39f9ae5-61e2-497a-9df0-341a0c2c9f67\") " Mar 20 08:56:36 crc kubenswrapper[4903]: I0320 08:56:36.721979 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f39f9ae5-61e2-497a-9df0-341a0c2c9f67-utilities\") pod \"f39f9ae5-61e2-497a-9df0-341a0c2c9f67\" (UID: \"f39f9ae5-61e2-497a-9df0-341a0c2c9f67\") " Mar 20 08:56:36 crc kubenswrapper[4903]: I0320 08:56:36.722061 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f39f9ae5-61e2-497a-9df0-341a0c2c9f67-catalog-content\") pod \"f39f9ae5-61e2-497a-9df0-341a0c2c9f67\" (UID: \"f39f9ae5-61e2-497a-9df0-341a0c2c9f67\") " Mar 20 08:56:36 crc kubenswrapper[4903]: I0320 08:56:36.725435 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f39f9ae5-61e2-497a-9df0-341a0c2c9f67-utilities" (OuterVolumeSpecName: "utilities") pod "f39f9ae5-61e2-497a-9df0-341a0c2c9f67" (UID: "f39f9ae5-61e2-497a-9df0-341a0c2c9f67"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 08:56:36 crc kubenswrapper[4903]: I0320 08:56:36.738998 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f39f9ae5-61e2-497a-9df0-341a0c2c9f67-kube-api-access-w2wtg" (OuterVolumeSpecName: "kube-api-access-w2wtg") pod "f39f9ae5-61e2-497a-9df0-341a0c2c9f67" (UID: "f39f9ae5-61e2-497a-9df0-341a0c2c9f67"). InnerVolumeSpecName "kube-api-access-w2wtg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:56:36 crc kubenswrapper[4903]: I0320 08:56:36.819164 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f39f9ae5-61e2-497a-9df0-341a0c2c9f67-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f39f9ae5-61e2-497a-9df0-341a0c2c9f67" (UID: "f39f9ae5-61e2-497a-9df0-341a0c2c9f67"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 08:56:36 crc kubenswrapper[4903]: I0320 08:56:36.823861 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w2wtg\" (UniqueName: \"kubernetes.io/projected/f39f9ae5-61e2-497a-9df0-341a0c2c9f67-kube-api-access-w2wtg\") on node \"crc\" DevicePath \"\"" Mar 20 08:56:36 crc kubenswrapper[4903]: I0320 08:56:36.823910 4903 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f39f9ae5-61e2-497a-9df0-341a0c2c9f67-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 08:56:36 crc kubenswrapper[4903]: I0320 08:56:36.823929 4903 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f39f9ae5-61e2-497a-9df0-341a0c2c9f67-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 08:56:37 crc kubenswrapper[4903]: I0320 08:56:37.085470 4903 generic.go:334] "Generic (PLEG): container finished" podID="f39f9ae5-61e2-497a-9df0-341a0c2c9f67" containerID="a49fdea9cb3e9e59d0cbd93db7e74a7417f1a99fca056f6acff25f1391173b95" exitCode=0 Mar 20 08:56:37 crc kubenswrapper[4903]: I0320 08:56:37.085537 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mq9v5" Mar 20 08:56:37 crc kubenswrapper[4903]: I0320 08:56:37.085557 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mq9v5" event={"ID":"f39f9ae5-61e2-497a-9df0-341a0c2c9f67","Type":"ContainerDied","Data":"a49fdea9cb3e9e59d0cbd93db7e74a7417f1a99fca056f6acff25f1391173b95"} Mar 20 08:56:37 crc kubenswrapper[4903]: I0320 08:56:37.086529 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mq9v5" event={"ID":"f39f9ae5-61e2-497a-9df0-341a0c2c9f67","Type":"ContainerDied","Data":"2fb63c98393fd8a93c153953239a94fe46d46647f2e6933d39df7247bb22f6a8"} Mar 20 08:56:37 crc kubenswrapper[4903]: I0320 08:56:37.086562 4903 scope.go:117] "RemoveContainer" containerID="a49fdea9cb3e9e59d0cbd93db7e74a7417f1a99fca056f6acff25f1391173b95" Mar 20 08:56:37 crc kubenswrapper[4903]: I0320 08:56:37.118363 4903 scope.go:117] "RemoveContainer" containerID="ead197b507f65e5fec6b00d62140edab291e4b497602c5b1f39be40f11a2991b" Mar 20 08:56:37 crc kubenswrapper[4903]: I0320 08:56:37.157849 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-mq9v5"] Mar 20 08:56:37 crc kubenswrapper[4903]: I0320 08:56:37.167959 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-mq9v5"] Mar 20 08:56:37 crc kubenswrapper[4903]: I0320 08:56:37.180370 4903 scope.go:117] "RemoveContainer" containerID="5412f461bf3dbd667eeda5150ef416702d6fd942aa6a5e3a18b6f1d514cf2d93" Mar 20 08:56:37 crc kubenswrapper[4903]: I0320 08:56:37.203328 4903 scope.go:117] "RemoveContainer" containerID="a49fdea9cb3e9e59d0cbd93db7e74a7417f1a99fca056f6acff25f1391173b95" Mar 20 08:56:37 crc kubenswrapper[4903]: E0320 08:56:37.203978 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a49fdea9cb3e9e59d0cbd93db7e74a7417f1a99fca056f6acff25f1391173b95\": container with ID starting with a49fdea9cb3e9e59d0cbd93db7e74a7417f1a99fca056f6acff25f1391173b95 not found: ID does not exist" containerID="a49fdea9cb3e9e59d0cbd93db7e74a7417f1a99fca056f6acff25f1391173b95" Mar 20 08:56:37 crc kubenswrapper[4903]: I0320 08:56:37.204078 
4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a49fdea9cb3e9e59d0cbd93db7e74a7417f1a99fca056f6acff25f1391173b95"} err="failed to get container status \"a49fdea9cb3e9e59d0cbd93db7e74a7417f1a99fca056f6acff25f1391173b95\": rpc error: code = NotFound desc = could not find container \"a49fdea9cb3e9e59d0cbd93db7e74a7417f1a99fca056f6acff25f1391173b95\": container with ID starting with a49fdea9cb3e9e59d0cbd93db7e74a7417f1a99fca056f6acff25f1391173b95 not found: ID does not exist" Mar 20 08:56:37 crc kubenswrapper[4903]: I0320 08:56:37.204124 4903 scope.go:117] "RemoveContainer" containerID="ead197b507f65e5fec6b00d62140edab291e4b497602c5b1f39be40f11a2991b" Mar 20 08:56:37 crc kubenswrapper[4903]: E0320 08:56:37.206713 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ead197b507f65e5fec6b00d62140edab291e4b497602c5b1f39be40f11a2991b\": container with ID starting with ead197b507f65e5fec6b00d62140edab291e4b497602c5b1f39be40f11a2991b not found: ID does not exist" containerID="ead197b507f65e5fec6b00d62140edab291e4b497602c5b1f39be40f11a2991b" Mar 20 08:56:37 crc kubenswrapper[4903]: I0320 08:56:37.206772 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ead197b507f65e5fec6b00d62140edab291e4b497602c5b1f39be40f11a2991b"} err="failed to get container status \"ead197b507f65e5fec6b00d62140edab291e4b497602c5b1f39be40f11a2991b\": rpc error: code = NotFound desc = could not find container \"ead197b507f65e5fec6b00d62140edab291e4b497602c5b1f39be40f11a2991b\": container with ID starting with ead197b507f65e5fec6b00d62140edab291e4b497602c5b1f39be40f11a2991b not found: ID does not exist" Mar 20 08:56:37 crc kubenswrapper[4903]: I0320 08:56:37.206813 4903 scope.go:117] "RemoveContainer" containerID="5412f461bf3dbd667eeda5150ef416702d6fd942aa6a5e3a18b6f1d514cf2d93" Mar 20 08:56:37 crc kubenswrapper[4903]: E0320 08:56:37.207311 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5412f461bf3dbd667eeda5150ef416702d6fd942aa6a5e3a18b6f1d514cf2d93\": container with ID starting with 5412f461bf3dbd667eeda5150ef416702d6fd942aa6a5e3a18b6f1d514cf2d93 not found: ID does not exist" containerID="5412f461bf3dbd667eeda5150ef416702d6fd942aa6a5e3a18b6f1d514cf2d93" Mar 20 08:56:37 crc kubenswrapper[4903]: I0320 08:56:37.207361 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5412f461bf3dbd667eeda5150ef416702d6fd942aa6a5e3a18b6f1d514cf2d93"} err="failed to get container status \"5412f461bf3dbd667eeda5150ef416702d6fd942aa6a5e3a18b6f1d514cf2d93\": rpc error: code = NotFound desc = could not find container \"5412f461bf3dbd667eeda5150ef416702d6fd942aa6a5e3a18b6f1d514cf2d93\": container with ID starting with 5412f461bf3dbd667eeda5150ef416702d6fd942aa6a5e3a18b6f1d514cf2d93 not found: ID does not exist" Mar 20 08:56:37 crc kubenswrapper[4903]: I0320 08:56:37.505691 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f39f9ae5-61e2-497a-9df0-341a0c2c9f67" path="/var/lib/kubelet/pods/f39f9ae5-61e2-497a-9df0-341a0c2c9f67/volumes" Mar 20 08:56:47 crc kubenswrapper[4903]: I0320 08:56:47.491728 4903 scope.go:117] "RemoveContainer" containerID="574e7fb2455d2017063433881e6f792136a3cc4bdc3d53ada47e32dd67839ac9" Mar 20 08:56:47 crc kubenswrapper[4903]: E0320 08:56:47.492821 4903 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2ndsj_openshift-machine-config-operator(0e67af70-4211-4077-8f2b-0a00b8069e5a)\"" pod="openshift-machine-config-operator/machine-config-daemon-2ndsj" podUID="0e67af70-4211-4077-8f2b-0a00b8069e5a" Mar 20 08:57:01 crc kubenswrapper[4903]: I0320 08:57:01.851496 4903 scope.go:117] "RemoveContainer" containerID="4c59dac9b62c965032f39e120b558d55d2f4e5a101d1aaa889af3eacbae8630e" Mar 20 08:57:02 crc kubenswrapper[4903]: I0320 08:57:02.490705 4903 scope.go:117] "RemoveContainer" containerID="574e7fb2455d2017063433881e6f792136a3cc4bdc3d53ada47e32dd67839ac9" Mar 20 08:57:02 crc kubenswrapper[4903]: E0320 08:57:02.491225 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2ndsj_openshift-machine-config-operator(0e67af70-4211-4077-8f2b-0a00b8069e5a)\"" pod="openshift-machine-config-operator/machine-config-daemon-2ndsj" podUID="0e67af70-4211-4077-8f2b-0a00b8069e5a" Mar 20 08:57:15 crc kubenswrapper[4903]: I0320 08:57:15.499666 4903 scope.go:117] "RemoveContainer" containerID="574e7fb2455d2017063433881e6f792136a3cc4bdc3d53ada47e32dd67839ac9" Mar 20 08:57:15 crc kubenswrapper[4903]: E0320 08:57:15.500642 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2ndsj_openshift-machine-config-operator(0e67af70-4211-4077-8f2b-0a00b8069e5a)\"" pod="openshift-machine-config-operator/machine-config-daemon-2ndsj" podUID="0e67af70-4211-4077-8f2b-0a00b8069e5a" Mar 20 08:57:26 crc kubenswrapper[4903]: I0320 08:57:26.492439 4903 scope.go:117] "RemoveContainer" containerID="574e7fb2455d2017063433881e6f792136a3cc4bdc3d53ada47e32dd67839ac9" Mar 20 08:57:26 crc kubenswrapper[4903]: E0320 08:57:26.493656 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2ndsj_openshift-machine-config-operator(0e67af70-4211-4077-8f2b-0a00b8069e5a)\"" pod="openshift-machine-config-operator/machine-config-daemon-2ndsj" podUID="0e67af70-4211-4077-8f2b-0a00b8069e5a" Mar 20 08:57:40 crc kubenswrapper[4903]: I0320 08:57:40.491224 4903 scope.go:117] "RemoveContainer" containerID="574e7fb2455d2017063433881e6f792136a3cc4bdc3d53ada47e32dd67839ac9" Mar 20 08:57:40 crc kubenswrapper[4903]: E0320 08:57:40.492104 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2ndsj_openshift-machine-config-operator(0e67af70-4211-4077-8f2b-0a00b8069e5a)\"" pod="openshift-machine-config-operator/machine-config-daemon-2ndsj" podUID="0e67af70-4211-4077-8f2b-0a00b8069e5a" Mar 20 08:57:55 crc kubenswrapper[4903]: I0320 08:57:55.502056 4903 scope.go:117] "RemoveContainer" containerID="574e7fb2455d2017063433881e6f792136a3cc4bdc3d53ada47e32dd67839ac9" Mar 20 08:57:55 crc kubenswrapper[4903]: I0320 08:57:55.866538 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-2ndsj" event={"ID":"0e67af70-4211-4077-8f2b-0a00b8069e5a","Type":"ContainerStarted","Data":"f6e6d4b852a87fa5948de40052e25925d5cd0b146939ba40c279384bb383c9f7"} Mar 20 08:58:00 crc kubenswrapper[4903]: I0320 08:58:00.156214 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566618-n2zwf"] Mar 20 08:58:00 crc kubenswrapper[4903]: E0320 08:58:00.157089 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f39f9ae5-61e2-497a-9df0-341a0c2c9f67" containerName="extract-content" Mar 20 08:58:00 crc kubenswrapper[4903]: I0320 08:58:00.157110 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="f39f9ae5-61e2-497a-9df0-341a0c2c9f67" containerName="extract-content" Mar 20 08:58:00 crc kubenswrapper[4903]: E0320 08:58:00.157124 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f39f9ae5-61e2-497a-9df0-341a0c2c9f67" containerName="registry-server" Mar 20 08:58:00 crc kubenswrapper[4903]: I0320 08:58:00.157134 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="f39f9ae5-61e2-497a-9df0-341a0c2c9f67" containerName="registry-server" Mar 20 08:58:00 crc kubenswrapper[4903]: E0320 08:58:00.157153 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f39f9ae5-61e2-497a-9df0-341a0c2c9f67" containerName="extract-utilities" Mar 20 08:58:00 crc kubenswrapper[4903]: I0320 08:58:00.157166 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="f39f9ae5-61e2-497a-9df0-341a0c2c9f67" containerName="extract-utilities" Mar 20 08:58:00 crc kubenswrapper[4903]: I0320 08:58:00.157396 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="f39f9ae5-61e2-497a-9df0-341a0c2c9f67" containerName="registry-server" Mar 20 08:58:00 crc kubenswrapper[4903]: I0320 08:58:00.157944 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566618-n2zwf" Mar 20 08:58:00 crc kubenswrapper[4903]: I0320 08:58:00.159314 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 08:58:00 crc kubenswrapper[4903]: I0320 08:58:00.159999 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 08:58:00 crc kubenswrapper[4903]: I0320 08:58:00.167647 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-smc2q" Mar 20 08:58:00 crc kubenswrapper[4903]: I0320 08:58:00.174215 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566618-n2zwf"] Mar 20 08:58:00 crc kubenswrapper[4903]: I0320 08:58:00.204403 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h9b2n\" (UniqueName: \"kubernetes.io/projected/11f1777a-fc60-4e30-8eca-fff1aa4776b0-kube-api-access-h9b2n\") pod \"auto-csr-approver-29566618-n2zwf\" (UID: \"11f1777a-fc60-4e30-8eca-fff1aa4776b0\") " pod="openshift-infra/auto-csr-approver-29566618-n2zwf" Mar 20 08:58:00 crc kubenswrapper[4903]: I0320 08:58:00.305698 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h9b2n\" (UniqueName: \"kubernetes.io/projected/11f1777a-fc60-4e30-8eca-fff1aa4776b0-kube-api-access-h9b2n\") pod \"auto-csr-approver-29566618-n2zwf\" (UID: \"11f1777a-fc60-4e30-8eca-fff1aa4776b0\") " pod="openshift-infra/auto-csr-approver-29566618-n2zwf" Mar 20 08:58:00 crc kubenswrapper[4903]: I0320 08:58:00.328842 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h9b2n\" (UniqueName: \"kubernetes.io/projected/11f1777a-fc60-4e30-8eca-fff1aa4776b0-kube-api-access-h9b2n\") pod \"auto-csr-approver-29566618-n2zwf\" (UID: \"11f1777a-fc60-4e30-8eca-fff1aa4776b0\") " pod="openshift-infra/auto-csr-approver-29566618-n2zwf" Mar 20 08:58:00 crc kubenswrapper[4903]: I0320 08:58:00.477695 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566618-n2zwf" Mar 20 08:58:00 crc kubenswrapper[4903]: I0320 08:58:00.964667 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566618-n2zwf"] Mar 20 08:58:00 crc kubenswrapper[4903]: W0320 08:58:00.968353 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod11f1777a_fc60_4e30_8eca_fff1aa4776b0.slice/crio-e0dcb8c48cd131b96785fc6c036d104dfeda8f5d7be13187972cadfa9e3e150d WatchSource:0}: Error finding container e0dcb8c48cd131b96785fc6c036d104dfeda8f5d7be13187972cadfa9e3e150d: Status 404 returned error can't find the container with id e0dcb8c48cd131b96785fc6c036d104dfeda8f5d7be13187972cadfa9e3e150d Mar 20 08:58:01 crc kubenswrapper[4903]: I0320 08:58:01.928173 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566618-n2zwf" event={"ID":"11f1777a-fc60-4e30-8eca-fff1aa4776b0","Type":"ContainerStarted","Data":"e0dcb8c48cd131b96785fc6c036d104dfeda8f5d7be13187972cadfa9e3e150d"} Mar 20 08:58:02 crc kubenswrapper[4903]: I0320 08:58:02.944527 4903 generic.go:334] "Generic (PLEG): container finished" podID="11f1777a-fc60-4e30-8eca-fff1aa4776b0" containerID="a95ac29f20207be00f5e9608ad1855f7e8d25dc8c89bbb819dd71adc02e2596b" exitCode=0 Mar 20 08:58:02 crc kubenswrapper[4903]: I0320 08:58:02.944608 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566618-n2zwf" event={"ID":"11f1777a-fc60-4e30-8eca-fff1aa4776b0","Type":"ContainerDied","Data":"a95ac29f20207be00f5e9608ad1855f7e8d25dc8c89bbb819dd71adc02e2596b"} Mar 20 08:58:04 crc kubenswrapper[4903]: I0320 08:58:04.354153 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566618-n2zwf" Mar 20 08:58:04 crc kubenswrapper[4903]: I0320 08:58:04.481491 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h9b2n\" (UniqueName: \"kubernetes.io/projected/11f1777a-fc60-4e30-8eca-fff1aa4776b0-kube-api-access-h9b2n\") pod \"11f1777a-fc60-4e30-8eca-fff1aa4776b0\" (UID: \"11f1777a-fc60-4e30-8eca-fff1aa4776b0\") " Mar 20 08:58:04 crc kubenswrapper[4903]: I0320 08:58:04.502659 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11f1777a-fc60-4e30-8eca-fff1aa4776b0-kube-api-access-h9b2n" (OuterVolumeSpecName: "kube-api-access-h9b2n") pod "11f1777a-fc60-4e30-8eca-fff1aa4776b0" (UID: "11f1777a-fc60-4e30-8eca-fff1aa4776b0"). InnerVolumeSpecName "kube-api-access-h9b2n". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 08:58:04 crc kubenswrapper[4903]: I0320 08:58:04.583457 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h9b2n\" (UniqueName: \"kubernetes.io/projected/11f1777a-fc60-4e30-8eca-fff1aa4776b0-kube-api-access-h9b2n\") on node \"crc\" DevicePath \"\"" Mar 20 08:58:04 crc kubenswrapper[4903]: I0320 08:58:04.965064 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566618-n2zwf" event={"ID":"11f1777a-fc60-4e30-8eca-fff1aa4776b0","Type":"ContainerDied","Data":"e0dcb8c48cd131b96785fc6c036d104dfeda8f5d7be13187972cadfa9e3e150d"} Mar 20 08:58:04 crc kubenswrapper[4903]: I0320 08:58:04.965358 4903 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e0dcb8c48cd131b96785fc6c036d104dfeda8f5d7be13187972cadfa9e3e150d" Mar 20 08:58:04 crc kubenswrapper[4903]: I0320 08:58:04.965151 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566618-n2zwf" Mar 20 08:58:05 crc kubenswrapper[4903]: I0320 08:58:05.442560 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566612-jw6xc"] Mar 20 08:58:05 crc kubenswrapper[4903]: I0320 08:58:05.452235 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566612-jw6xc"] Mar 20 08:58:05 crc kubenswrapper[4903]: I0320 08:58:05.505369 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="82f5c42c-0fe9-4880-9d58-a683d748b424" path="/var/lib/kubelet/pods/82f5c42c-0fe9-4880-9d58-a683d748b424/volumes" Mar 20 08:59:01 crc kubenswrapper[4903]: I0320 08:59:01.962624 4903 scope.go:117] "RemoveContainer" containerID="b868cda2fbba12d39d2c4a4c30d9c08efe46ce6fb9af3c55eeafad8f7cac2afd" Mar 20 09:00:00 crc kubenswrapper[4903]: I0320 09:00:00.149517 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566620-l94tv"] Mar 20 09:00:00 crc kubenswrapper[4903]: E0320 09:00:00.150949 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11f1777a-fc60-4e30-8eca-fff1aa4776b0" containerName="oc" Mar 20 09:00:00 crc kubenswrapper[4903]: I0320 09:00:00.150979 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="11f1777a-fc60-4e30-8eca-fff1aa4776b0" containerName="oc" Mar 20 09:00:00 crc kubenswrapper[4903]: I0320 09:00:00.151360 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="11f1777a-fc60-4e30-8eca-fff1aa4776b0" containerName="oc" Mar 20 09:00:00 crc kubenswrapper[4903]: I0320 09:00:00.152277 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566620-l94tv" Mar 20 09:00:00 crc kubenswrapper[4903]: I0320 09:00:00.154865 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 09:00:00 crc kubenswrapper[4903]: I0320 09:00:00.154920 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 09:00:00 crc kubenswrapper[4903]: I0320 09:00:00.155099 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-smc2q" Mar 20 09:00:00 crc kubenswrapper[4903]: I0320 09:00:00.159114 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566620-mgsqp"] Mar 20 09:00:00 crc kubenswrapper[4903]: I0320 09:00:00.160410 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566620-mgsqp" Mar 20 09:00:00 crc kubenswrapper[4903]: I0320 09:00:00.162921 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 20 09:00:00 crc kubenswrapper[4903]: I0320 09:00:00.163738 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 20 09:00:00 crc kubenswrapper[4903]: I0320 09:00:00.166016 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566620-l94tv"] Mar 20 09:00:00 crc kubenswrapper[4903]: I0320 09:00:00.185595 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566620-mgsqp"] Mar 20 09:00:00 crc kubenswrapper[4903]: I0320 09:00:00.233297 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/355433c0-97b4-468e-bf51-d9fff62ff05f-secret-volume\") pod \"collect-profiles-29566620-mgsqp\" (UID: \"355433c0-97b4-468e-bf51-d9fff62ff05f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566620-mgsqp" Mar 20 09:00:00 crc kubenswrapper[4903]: I0320 09:00:00.233334 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/355433c0-97b4-468e-bf51-d9fff62ff05f-config-volume\") pod \"collect-profiles-29566620-mgsqp\" (UID: \"355433c0-97b4-468e-bf51-d9fff62ff05f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566620-mgsqp" Mar 20 09:00:00 crc kubenswrapper[4903]: I0320 09:00:00.233362 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wwjfr\" (UniqueName: \"kubernetes.io/projected/72c92018-7fab-4cfa-a34f-9f6fb2dce3f5-kube-api-access-wwjfr\") pod \"auto-csr-approver-29566620-l94tv\" (UID: \"72c92018-7fab-4cfa-a34f-9f6fb2dce3f5\") " pod="openshift-infra/auto-csr-approver-29566620-l94tv" Mar 20 09:00:00 crc kubenswrapper[4903]: I0320 09:00:00.233609 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4rhp4\" (UniqueName: \"kubernetes.io/projected/355433c0-97b4-468e-bf51-d9fff62ff05f-kube-api-access-4rhp4\") pod \"collect-profiles-29566620-mgsqp\" (UID: \"355433c0-97b4-468e-bf51-d9fff62ff05f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566620-mgsqp" Mar 
20 09:00:00 crc kubenswrapper[4903]: I0320 09:00:00.335471 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4rhp4\" (UniqueName: \"kubernetes.io/projected/355433c0-97b4-468e-bf51-d9fff62ff05f-kube-api-access-4rhp4\") pod \"collect-profiles-29566620-mgsqp\" (UID: \"355433c0-97b4-468e-bf51-d9fff62ff05f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566620-mgsqp" Mar 20 09:00:00 crc kubenswrapper[4903]: I0320 09:00:00.335612 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/355433c0-97b4-468e-bf51-d9fff62ff05f-secret-volume\") pod \"collect-profiles-29566620-mgsqp\" (UID: \"355433c0-97b4-468e-bf51-d9fff62ff05f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566620-mgsqp" Mar 20 09:00:00 crc kubenswrapper[4903]: I0320 09:00:00.335667 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/355433c0-97b4-468e-bf51-d9fff62ff05f-config-volume\") pod \"collect-profiles-29566620-mgsqp\" (UID: \"355433c0-97b4-468e-bf51-d9fff62ff05f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566620-mgsqp" Mar 20 09:00:00 crc kubenswrapper[4903]: I0320 09:00:00.335824 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wwjfr\" (UniqueName: \"kubernetes.io/projected/72c92018-7fab-4cfa-a34f-9f6fb2dce3f5-kube-api-access-wwjfr\") pod \"auto-csr-approver-29566620-l94tv\" (UID: \"72c92018-7fab-4cfa-a34f-9f6fb2dce3f5\") " pod="openshift-infra/auto-csr-approver-29566620-l94tv" Mar 20 09:00:00 crc kubenswrapper[4903]: I0320 09:00:00.336778 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/355433c0-97b4-468e-bf51-d9fff62ff05f-config-volume\") pod \"collect-profiles-29566620-mgsqp\" (UID: \"355433c0-97b4-468e-bf51-d9fff62ff05f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566620-mgsqp" Mar 20 09:00:00 crc kubenswrapper[4903]: I0320 09:00:00.351951 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/355433c0-97b4-468e-bf51-d9fff62ff05f-secret-volume\") pod \"collect-profiles-29566620-mgsqp\" (UID: \"355433c0-97b4-468e-bf51-d9fff62ff05f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566620-mgsqp" Mar 20 09:00:00 crc kubenswrapper[4903]: I0320 09:00:00.353092 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wwjfr\" (UniqueName: \"kubernetes.io/projected/72c92018-7fab-4cfa-a34f-9f6fb2dce3f5-kube-api-access-wwjfr\") pod \"auto-csr-approver-29566620-l94tv\" (UID: \"72c92018-7fab-4cfa-a34f-9f6fb2dce3f5\") " pod="openshift-infra/auto-csr-approver-29566620-l94tv" Mar 20 09:00:00 crc kubenswrapper[4903]: I0320 09:00:00.354011 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4rhp4\" (UniqueName: \"kubernetes.io/projected/355433c0-97b4-468e-bf51-d9fff62ff05f-kube-api-access-4rhp4\") pod \"collect-profiles-29566620-mgsqp\" (UID: \"355433c0-97b4-468e-bf51-d9fff62ff05f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566620-mgsqp" Mar 20 09:00:00 crc kubenswrapper[4903]: I0320 09:00:00.472968 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566620-l94tv" Mar 20 09:00:00 crc kubenswrapper[4903]: I0320 09:00:00.493470 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566620-mgsqp" Mar 20 09:00:00 crc kubenswrapper[4903]: I0320 09:00:00.925783 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566620-l94tv"] Mar 20 09:00:00 crc kubenswrapper[4903]: I0320 09:00:00.937708 4903 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 09:00:00 crc kubenswrapper[4903]: W0320 09:00:00.981325 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod355433c0_97b4_468e_bf51_d9fff62ff05f.slice/crio-495d5086b4dddb7add13a2d585b493609b36c7d899f7657311e7376de35241f6 WatchSource:0}: Error finding container 495d5086b4dddb7add13a2d585b493609b36c7d899f7657311e7376de35241f6: Status 404 returned error can't find the container with id 495d5086b4dddb7add13a2d585b493609b36c7d899f7657311e7376de35241f6 Mar 20 09:00:00 crc kubenswrapper[4903]: I0320 09:00:00.981901 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566620-mgsqp"] Mar 20 09:00:01 crc kubenswrapper[4903]: I0320 09:00:01.310954 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566620-mgsqp" event={"ID":"355433c0-97b4-468e-bf51-d9fff62ff05f","Type":"ContainerStarted","Data":"55f449b073c648c8945e7a5d53e5ab553193695687efd04e7931fc6dbecd91f2"} Mar 20 09:00:01 crc kubenswrapper[4903]: I0320 09:00:01.311310 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566620-mgsqp" event={"ID":"355433c0-97b4-468e-bf51-d9fff62ff05f","Type":"ContainerStarted","Data":"495d5086b4dddb7add13a2d585b493609b36c7d899f7657311e7376de35241f6"} Mar 20 09:00:01 crc kubenswrapper[4903]: I0320 09:00:01.314348 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566620-l94tv" event={"ID":"72c92018-7fab-4cfa-a34f-9f6fb2dce3f5","Type":"ContainerStarted","Data":"cf8c6424a30c624811dc86ad5619855aa18abd5581a49d33e092d0baeea026be"} Mar 20 09:00:01 crc kubenswrapper[4903]: I0320 09:00:01.337982 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29566620-mgsqp" podStartSLOduration=1.337964193 podStartE2EDuration="1.337964193s" podCreationTimestamp="2026-03-20 09:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 09:00:01.333162326 +0000 UTC m=+2226.550062651" watchObservedRunningTime="2026-03-20 09:00:01.337964193 +0000 UTC m=+2226.554864508" Mar 20 09:00:02 crc kubenswrapper[4903]: I0320 09:00:02.326811 4903 generic.go:334] "Generic (PLEG): container finished" podID="355433c0-97b4-468e-bf51-d9fff62ff05f" containerID="55f449b073c648c8945e7a5d53e5ab553193695687efd04e7931fc6dbecd91f2" exitCode=0 Mar 20 09:00:02 crc kubenswrapper[4903]: I0320 09:00:02.326866 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566620-mgsqp" 
event={"ID":"355433c0-97b4-468e-bf51-d9fff62ff05f","Type":"ContainerDied","Data":"55f449b073c648c8945e7a5d53e5ab553193695687efd04e7931fc6dbecd91f2"} Mar 20 09:00:03 crc kubenswrapper[4903]: I0320 09:00:03.678089 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566620-mgsqp" Mar 20 09:00:03 crc kubenswrapper[4903]: I0320 09:00:03.692927 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/355433c0-97b4-468e-bf51-d9fff62ff05f-secret-volume\") pod \"355433c0-97b4-468e-bf51-d9fff62ff05f\" (UID: \"355433c0-97b4-468e-bf51-d9fff62ff05f\") " Mar 20 09:00:03 crc kubenswrapper[4903]: I0320 09:00:03.693122 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4rhp4\" (UniqueName: \"kubernetes.io/projected/355433c0-97b4-468e-bf51-d9fff62ff05f-kube-api-access-4rhp4\") pod \"355433c0-97b4-468e-bf51-d9fff62ff05f\" (UID: \"355433c0-97b4-468e-bf51-d9fff62ff05f\") " Mar 20 09:00:03 crc kubenswrapper[4903]: I0320 09:00:03.693175 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/355433c0-97b4-468e-bf51-d9fff62ff05f-config-volume\") pod \"355433c0-97b4-468e-bf51-d9fff62ff05f\" (UID: \"355433c0-97b4-468e-bf51-d9fff62ff05f\") " Mar 20 09:00:03 crc kubenswrapper[4903]: I0320 09:00:03.697512 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/355433c0-97b4-468e-bf51-d9fff62ff05f-config-volume" (OuterVolumeSpecName: "config-volume") pod "355433c0-97b4-468e-bf51-d9fff62ff05f" (UID: "355433c0-97b4-468e-bf51-d9fff62ff05f"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 09:00:03 crc kubenswrapper[4903]: I0320 09:00:03.701372 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/355433c0-97b4-468e-bf51-d9fff62ff05f-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "355433c0-97b4-468e-bf51-d9fff62ff05f" (UID: "355433c0-97b4-468e-bf51-d9fff62ff05f"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:00:03 crc kubenswrapper[4903]: I0320 09:00:03.704387 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/355433c0-97b4-468e-bf51-d9fff62ff05f-kube-api-access-4rhp4" (OuterVolumeSpecName: "kube-api-access-4rhp4") pod "355433c0-97b4-468e-bf51-d9fff62ff05f" (UID: "355433c0-97b4-468e-bf51-d9fff62ff05f"). InnerVolumeSpecName "kube-api-access-4rhp4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:00:03 crc kubenswrapper[4903]: I0320 09:00:03.794699 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4rhp4\" (UniqueName: \"kubernetes.io/projected/355433c0-97b4-468e-bf51-d9fff62ff05f-kube-api-access-4rhp4\") on node \"crc\" DevicePath \"\"" Mar 20 09:00:03 crc kubenswrapper[4903]: I0320 09:00:03.794737 4903 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/355433c0-97b4-468e-bf51-d9fff62ff05f-config-volume\") on node \"crc\" DevicePath \"\"" Mar 20 09:00:03 crc kubenswrapper[4903]: I0320 09:00:03.794749 4903 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/355433c0-97b4-468e-bf51-d9fff62ff05f-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 20 09:00:04 crc kubenswrapper[4903]: I0320 09:00:04.349713 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566620-mgsqp" event={"ID":"355433c0-97b4-468e-bf51-d9fff62ff05f","Type":"ContainerDied","Data":"495d5086b4dddb7add13a2d585b493609b36c7d899f7657311e7376de35241f6"} Mar 20 09:00:04 crc kubenswrapper[4903]: I0320 09:00:04.349771 4903 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="495d5086b4dddb7add13a2d585b493609b36c7d899f7657311e7376de35241f6" Mar 20 09:00:04 crc kubenswrapper[4903]: I0320 09:00:04.350268 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566620-mgsqp" Mar 20 09:00:04 crc kubenswrapper[4903]: I0320 09:00:04.442768 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566575-t4mvb"] Mar 20 09:00:04 crc kubenswrapper[4903]: I0320 09:00:04.451017 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566575-t4mvb"] Mar 20 09:00:05 crc kubenswrapper[4903]: I0320 09:00:05.508946 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="32d4db0f-b5bb-41d4-ae0d-0600e38892b1" path="/var/lib/kubelet/pods/32d4db0f-b5bb-41d4-ae0d-0600e38892b1/volumes" Mar 20 09:00:20 crc kubenswrapper[4903]: I0320 09:00:20.833775 4903 patch_prober.go:28] interesting pod/machine-config-daemon-2ndsj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 09:00:20 crc kubenswrapper[4903]: I0320 09:00:20.834283 4903 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2ndsj" podUID="0e67af70-4211-4077-8f2b-0a00b8069e5a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 09:00:22 crc kubenswrapper[4903]: I0320 09:00:22.500754 4903 generic.go:334] "Generic (PLEG): container finished" podID="72c92018-7fab-4cfa-a34f-9f6fb2dce3f5" containerID="0ec0855e7e6e5912b10786185a650793736a63be48953dfc0702ae2a2e65d8d2" exitCode=0 Mar 20 09:00:22 crc kubenswrapper[4903]: I0320 09:00:22.500805 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566620-l94tv" 
event={"ID":"72c92018-7fab-4cfa-a34f-9f6fb2dce3f5","Type":"ContainerDied","Data":"0ec0855e7e6e5912b10786185a650793736a63be48953dfc0702ae2a2e65d8d2"} Mar 20 09:00:23 crc kubenswrapper[4903]: I0320 09:00:23.784849 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566620-l94tv" Mar 20 09:00:23 crc kubenswrapper[4903]: I0320 09:00:23.912696 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wwjfr\" (UniqueName: \"kubernetes.io/projected/72c92018-7fab-4cfa-a34f-9f6fb2dce3f5-kube-api-access-wwjfr\") pod \"72c92018-7fab-4cfa-a34f-9f6fb2dce3f5\" (UID: \"72c92018-7fab-4cfa-a34f-9f6fb2dce3f5\") " Mar 20 09:00:23 crc kubenswrapper[4903]: I0320 09:00:23.924767 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/72c92018-7fab-4cfa-a34f-9f6fb2dce3f5-kube-api-access-wwjfr" (OuterVolumeSpecName: "kube-api-access-wwjfr") pod "72c92018-7fab-4cfa-a34f-9f6fb2dce3f5" (UID: "72c92018-7fab-4cfa-a34f-9f6fb2dce3f5"). InnerVolumeSpecName "kube-api-access-wwjfr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:00:23 crc kubenswrapper[4903]: I0320 09:00:23.929960 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wwjfr\" (UniqueName: \"kubernetes.io/projected/72c92018-7fab-4cfa-a34f-9f6fb2dce3f5-kube-api-access-wwjfr\") on node \"crc\" DevicePath \"\"" Mar 20 09:00:24 crc kubenswrapper[4903]: I0320 09:00:24.519546 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566620-l94tv" event={"ID":"72c92018-7fab-4cfa-a34f-9f6fb2dce3f5","Type":"ContainerDied","Data":"cf8c6424a30c624811dc86ad5619855aa18abd5581a49d33e092d0baeea026be"} Mar 20 09:00:24 crc kubenswrapper[4903]: I0320 09:00:24.519610 4903 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cf8c6424a30c624811dc86ad5619855aa18abd5581a49d33e092d0baeea026be" Mar 20 09:00:24 crc kubenswrapper[4903]: I0320 09:00:24.519707 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566620-l94tv" Mar 20 09:00:24 crc kubenswrapper[4903]: I0320 09:00:24.860330 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566614-jn9hf"] Mar 20 09:00:24 crc kubenswrapper[4903]: I0320 09:00:24.866869 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566614-jn9hf"] Mar 20 09:00:25 crc kubenswrapper[4903]: I0320 09:00:25.505654 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4d84f4c2-e78c-47fa-b054-324606a19025" path="/var/lib/kubelet/pods/4d84f4c2-e78c-47fa-b054-324606a19025/volumes" Mar 20 09:00:50 crc kubenswrapper[4903]: I0320 09:00:50.834461 4903 patch_prober.go:28] interesting pod/machine-config-daemon-2ndsj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 09:00:50 crc kubenswrapper[4903]: I0320 09:00:50.835010 4903 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2ndsj" podUID="0e67af70-4211-4077-8f2b-0a00b8069e5a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 09:01:02 crc kubenswrapper[4903]: I0320 09:01:02.093617 4903 scope.go:117] "RemoveContainer" containerID="886211f260ddc7606b6b1150cd5eb273e9a18c4f9250e304efeecdb3a9b83794" Mar 20 09:01:02 crc kubenswrapper[4903]: I0320 09:01:02.129916 4903 scope.go:117] "RemoveContainer" containerID="33276b6ec65c303260a6ac3a091a05f25e49d202c493ae9b040cd8f6ae75d66c" Mar 20 09:01:20 crc kubenswrapper[4903]: I0320 09:01:20.834258 4903 patch_prober.go:28] interesting pod/machine-config-daemon-2ndsj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 09:01:20 crc kubenswrapper[4903]: I0320 09:01:20.836213 4903 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2ndsj" podUID="0e67af70-4211-4077-8f2b-0a00b8069e5a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 09:01:20 crc kubenswrapper[4903]: I0320 09:01:20.836591 4903 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2ndsj" Mar 20 09:01:20 crc kubenswrapper[4903]: I0320 09:01:20.837212 4903 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f6e6d4b852a87fa5948de40052e25925d5cd0b146939ba40c279384bb383c9f7"} pod="openshift-machine-config-operator/machine-config-daemon-2ndsj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 09:01:20 crc kubenswrapper[4903]: I0320 09:01:20.837268 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2ndsj" podUID="0e67af70-4211-4077-8f2b-0a00b8069e5a" containerName="machine-config-daemon" containerID="cri-o://f6e6d4b852a87fa5948de40052e25925d5cd0b146939ba40c279384bb383c9f7" 
gracePeriod=600 Mar 20 09:01:21 crc kubenswrapper[4903]: I0320 09:01:21.036851 4903 generic.go:334] "Generic (PLEG): container finished" podID="0e67af70-4211-4077-8f2b-0a00b8069e5a" containerID="f6e6d4b852a87fa5948de40052e25925d5cd0b146939ba40c279384bb383c9f7" exitCode=0 Mar 20 09:01:21 crc kubenswrapper[4903]: I0320 09:01:21.036892 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2ndsj" event={"ID":"0e67af70-4211-4077-8f2b-0a00b8069e5a","Type":"ContainerDied","Data":"f6e6d4b852a87fa5948de40052e25925d5cd0b146939ba40c279384bb383c9f7"} Mar 20 09:01:21 crc kubenswrapper[4903]: I0320 09:01:21.036929 4903 scope.go:117] "RemoveContainer" containerID="574e7fb2455d2017063433881e6f792136a3cc4bdc3d53ada47e32dd67839ac9" Mar 20 09:01:22 crc kubenswrapper[4903]: I0320 09:01:22.048420 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2ndsj" event={"ID":"0e67af70-4211-4077-8f2b-0a00b8069e5a","Type":"ContainerStarted","Data":"05d843b29e30c16c9c72dc2383538466652c95246ca0c3fc36a2ea4621a2190e"} Mar 20 09:02:00 crc kubenswrapper[4903]: I0320 09:02:00.141590 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566622-z9qqt"] Mar 20 09:02:00 crc kubenswrapper[4903]: E0320 09:02:00.142472 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72c92018-7fab-4cfa-a34f-9f6fb2dce3f5" containerName="oc" Mar 20 09:02:00 crc kubenswrapper[4903]: I0320 09:02:00.142488 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="72c92018-7fab-4cfa-a34f-9f6fb2dce3f5" containerName="oc" Mar 20 09:02:00 crc kubenswrapper[4903]: E0320 09:02:00.142515 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="355433c0-97b4-468e-bf51-d9fff62ff05f" containerName="collect-profiles" Mar 20 09:02:00 crc kubenswrapper[4903]: I0320 09:02:00.142527 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="355433c0-97b4-468e-bf51-d9fff62ff05f" containerName="collect-profiles" Mar 20 09:02:00 crc kubenswrapper[4903]: I0320 09:02:00.142690 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="72c92018-7fab-4cfa-a34f-9f6fb2dce3f5" containerName="oc" Mar 20 09:02:00 crc kubenswrapper[4903]: I0320 09:02:00.142707 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="355433c0-97b4-468e-bf51-d9fff62ff05f" containerName="collect-profiles" Mar 20 09:02:00 crc kubenswrapper[4903]: I0320 09:02:00.143382 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566622-z9qqt" Mar 20 09:02:00 crc kubenswrapper[4903]: I0320 09:02:00.146386 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 09:02:00 crc kubenswrapper[4903]: I0320 09:02:00.146573 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-smc2q" Mar 20 09:02:00 crc kubenswrapper[4903]: I0320 09:02:00.146920 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 09:02:00 crc kubenswrapper[4903]: I0320 09:02:00.147512 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566622-z9qqt"] Mar 20 09:02:00 crc kubenswrapper[4903]: I0320 09:02:00.201528 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m7jkm\" (UniqueName: \"kubernetes.io/projected/7bbac0c1-d921-4c1e-b1f6-71a95106d7b2-kube-api-access-m7jkm\") pod \"auto-csr-approver-29566622-z9qqt\" (UID: \"7bbac0c1-d921-4c1e-b1f6-71a95106d7b2\") " pod="openshift-infra/auto-csr-approver-29566622-z9qqt" Mar 20 09:02:00 crc kubenswrapper[4903]: I0320 09:02:00.303157 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m7jkm\" (UniqueName: \"kubernetes.io/projected/7bbac0c1-d921-4c1e-b1f6-71a95106d7b2-kube-api-access-m7jkm\") pod \"auto-csr-approver-29566622-z9qqt\" (UID: \"7bbac0c1-d921-4c1e-b1f6-71a95106d7b2\") " pod="openshift-infra/auto-csr-approver-29566622-z9qqt" Mar 20 09:02:00 crc kubenswrapper[4903]: I0320 09:02:00.328967 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m7jkm\" (UniqueName: \"kubernetes.io/projected/7bbac0c1-d921-4c1e-b1f6-71a95106d7b2-kube-api-access-m7jkm\") pod \"auto-csr-approver-29566622-z9qqt\" (UID: \"7bbac0c1-d921-4c1e-b1f6-71a95106d7b2\") " pod="openshift-infra/auto-csr-approver-29566622-z9qqt" Mar 20 09:02:00 crc kubenswrapper[4903]: I0320 09:02:00.458353 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566622-z9qqt" Mar 20 09:02:00 crc kubenswrapper[4903]: I0320 09:02:00.912608 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566622-z9qqt"] Mar 20 09:02:01 crc kubenswrapper[4903]: I0320 09:02:01.396922 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566622-z9qqt" event={"ID":"7bbac0c1-d921-4c1e-b1f6-71a95106d7b2","Type":"ContainerStarted","Data":"70bf8a7667ae3a1e9b50eaaf869644c431a4881fd969b990d98107f241ad7a31"} Mar 20 09:02:03 crc kubenswrapper[4903]: I0320 09:02:03.420220 4903 generic.go:334] "Generic (PLEG): container finished" podID="7bbac0c1-d921-4c1e-b1f6-71a95106d7b2" containerID="b0a7e6581ddd2a0b15f8a575486f6fc52f00a4d4f9d16b9f725624a740a38b0d" exitCode=0 Mar 20 09:02:03 crc kubenswrapper[4903]: I0320 09:02:03.420454 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566622-z9qqt" event={"ID":"7bbac0c1-d921-4c1e-b1f6-71a95106d7b2","Type":"ContainerDied","Data":"b0a7e6581ddd2a0b15f8a575486f6fc52f00a4d4f9d16b9f725624a740a38b0d"} Mar 20 09:02:04 crc kubenswrapper[4903]: I0320 09:02:04.753197 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566622-z9qqt" Mar 20 09:02:04 crc kubenswrapper[4903]: I0320 09:02:04.871168 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m7jkm\" (UniqueName: \"kubernetes.io/projected/7bbac0c1-d921-4c1e-b1f6-71a95106d7b2-kube-api-access-m7jkm\") pod \"7bbac0c1-d921-4c1e-b1f6-71a95106d7b2\" (UID: \"7bbac0c1-d921-4c1e-b1f6-71a95106d7b2\") " Mar 20 09:02:04 crc kubenswrapper[4903]: I0320 09:02:04.878812 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bbac0c1-d921-4c1e-b1f6-71a95106d7b2-kube-api-access-m7jkm" (OuterVolumeSpecName: "kube-api-access-m7jkm") pod "7bbac0c1-d921-4c1e-b1f6-71a95106d7b2" (UID: "7bbac0c1-d921-4c1e-b1f6-71a95106d7b2"). InnerVolumeSpecName "kube-api-access-m7jkm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:02:04 crc kubenswrapper[4903]: I0320 09:02:04.973153 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m7jkm\" (UniqueName: \"kubernetes.io/projected/7bbac0c1-d921-4c1e-b1f6-71a95106d7b2-kube-api-access-m7jkm\") on node \"crc\" DevicePath \"\"" Mar 20 09:02:05 crc kubenswrapper[4903]: I0320 09:02:05.439025 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566622-z9qqt" event={"ID":"7bbac0c1-d921-4c1e-b1f6-71a95106d7b2","Type":"ContainerDied","Data":"70bf8a7667ae3a1e9b50eaaf869644c431a4881fd969b990d98107f241ad7a31"} Mar 20 09:02:05 crc kubenswrapper[4903]: I0320 09:02:05.439480 4903 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="70bf8a7667ae3a1e9b50eaaf869644c431a4881fd969b990d98107f241ad7a31" Mar 20 09:02:05 crc kubenswrapper[4903]: I0320 09:02:05.439559 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566622-z9qqt" Mar 20 09:02:05 crc kubenswrapper[4903]: I0320 09:02:05.823993 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566616-9b65m"] Mar 20 09:02:05 crc kubenswrapper[4903]: I0320 09:02:05.832263 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566616-9b65m"] Mar 20 09:02:07 crc kubenswrapper[4903]: I0320 09:02:07.499021 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="83d46482-f640-42c9-b3a5-19836260cfd7" path="/var/lib/kubelet/pods/83d46482-f640-42c9-b3a5-19836260cfd7/volumes" Mar 20 09:03:02 crc kubenswrapper[4903]: I0320 09:03:02.273957 4903 scope.go:117] "RemoveContainer" containerID="a876f92a06bbb4fd38107456e393a57f955a8e14b570d6905a4f4fd7d947e72c" Mar 20 09:03:50 crc kubenswrapper[4903]: I0320 09:03:50.276587 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-z84mc"] Mar 20 09:03:50 crc kubenswrapper[4903]: E0320 09:03:50.277770 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7bbac0c1-d921-4c1e-b1f6-71a95106d7b2" containerName="oc" Mar 20 09:03:50 crc kubenswrapper[4903]: I0320 09:03:50.277788 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="7bbac0c1-d921-4c1e-b1f6-71a95106d7b2" containerName="oc" Mar 20 09:03:50 crc kubenswrapper[4903]: I0320 09:03:50.277966 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="7bbac0c1-d921-4c1e-b1f6-71a95106d7b2" containerName="oc" Mar 20 09:03:50 crc kubenswrapper[4903]: I0320 09:03:50.278920 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-z84mc" Mar 20 09:03:50 crc kubenswrapper[4903]: I0320 09:03:50.295958 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-z84mc"] Mar 20 09:03:50 crc kubenswrapper[4903]: I0320 09:03:50.371176 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3127256c-aab4-4aa1-aa39-856c3cc7e2e9-catalog-content\") pod \"redhat-marketplace-z84mc\" (UID: \"3127256c-aab4-4aa1-aa39-856c3cc7e2e9\") " pod="openshift-marketplace/redhat-marketplace-z84mc" Mar 20 09:03:50 crc kubenswrapper[4903]: I0320 09:03:50.371275 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8sjq5\" (UniqueName: \"kubernetes.io/projected/3127256c-aab4-4aa1-aa39-856c3cc7e2e9-kube-api-access-8sjq5\") pod \"redhat-marketplace-z84mc\" (UID: \"3127256c-aab4-4aa1-aa39-856c3cc7e2e9\") " pod="openshift-marketplace/redhat-marketplace-z84mc" Mar 20 09:03:50 crc kubenswrapper[4903]: I0320 09:03:50.371314 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3127256c-aab4-4aa1-aa39-856c3cc7e2e9-utilities\") pod \"redhat-marketplace-z84mc\" (UID: \"3127256c-aab4-4aa1-aa39-856c3cc7e2e9\") " pod="openshift-marketplace/redhat-marketplace-z84mc" Mar 20 09:03:50 crc kubenswrapper[4903]: I0320 09:03:50.472767 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3127256c-aab4-4aa1-aa39-856c3cc7e2e9-catalog-content\") pod \"redhat-marketplace-z84mc\" (UID: \"3127256c-aab4-4aa1-aa39-856c3cc7e2e9\") " pod="openshift-marketplace/redhat-marketplace-z84mc" Mar 20 09:03:50 crc kubenswrapper[4903]: I0320 09:03:50.472841 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8sjq5\" (UniqueName: \"kubernetes.io/projected/3127256c-aab4-4aa1-aa39-856c3cc7e2e9-kube-api-access-8sjq5\") pod \"redhat-marketplace-z84mc\" (UID: \"3127256c-aab4-4aa1-aa39-856c3cc7e2e9\") " pod="openshift-marketplace/redhat-marketplace-z84mc" Mar 20 09:03:50 crc kubenswrapper[4903]: I0320 09:03:50.472875 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3127256c-aab4-4aa1-aa39-856c3cc7e2e9-utilities\") pod \"redhat-marketplace-z84mc\" (UID: \"3127256c-aab4-4aa1-aa39-856c3cc7e2e9\") " pod="openshift-marketplace/redhat-marketplace-z84mc" Mar 20 09:03:50 crc kubenswrapper[4903]: I0320 09:03:50.473437 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3127256c-aab4-4aa1-aa39-856c3cc7e2e9-utilities\") pod \"redhat-marketplace-z84mc\" (UID: \"3127256c-aab4-4aa1-aa39-856c3cc7e2e9\") " pod="openshift-marketplace/redhat-marketplace-z84mc" Mar 20 09:03:50 crc kubenswrapper[4903]: I0320 09:03:50.473564 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3127256c-aab4-4aa1-aa39-856c3cc7e2e9-catalog-content\") pod \"redhat-marketplace-z84mc\" (UID: \"3127256c-aab4-4aa1-aa39-856c3cc7e2e9\") " pod="openshift-marketplace/redhat-marketplace-z84mc" Mar 20 09:03:50 crc kubenswrapper[4903]: I0320 09:03:50.511764 4903 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-8sjq5\" (UniqueName: \"kubernetes.io/projected/3127256c-aab4-4aa1-aa39-856c3cc7e2e9-kube-api-access-8sjq5\") pod \"redhat-marketplace-z84mc\" (UID: \"3127256c-aab4-4aa1-aa39-856c3cc7e2e9\") " pod="openshift-marketplace/redhat-marketplace-z84mc" Mar 20 09:03:50 crc kubenswrapper[4903]: I0320 09:03:50.603393 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-z84mc" Mar 20 09:03:50 crc kubenswrapper[4903]: I0320 09:03:50.824203 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-z84mc"] Mar 20 09:03:50 crc kubenswrapper[4903]: I0320 09:03:50.833199 4903 patch_prober.go:28] interesting pod/machine-config-daemon-2ndsj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 09:03:50 crc kubenswrapper[4903]: I0320 09:03:50.833414 4903 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2ndsj" podUID="0e67af70-4211-4077-8f2b-0a00b8069e5a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 09:03:51 crc kubenswrapper[4903]: I0320 09:03:51.353679 4903 generic.go:334] "Generic (PLEG): container finished" podID="3127256c-aab4-4aa1-aa39-856c3cc7e2e9" containerID="b656f204ba3a626bce376db8e95e1aa19ba61e3cd9d3c4dde37c14ce74533c2e" exitCode=0 Mar 20 09:03:51 crc kubenswrapper[4903]: I0320 09:03:51.353740 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z84mc" event={"ID":"3127256c-aab4-4aa1-aa39-856c3cc7e2e9","Type":"ContainerDied","Data":"b656f204ba3a626bce376db8e95e1aa19ba61e3cd9d3c4dde37c14ce74533c2e"} Mar 20 09:03:51 crc kubenswrapper[4903]: I0320 09:03:51.353780 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z84mc" event={"ID":"3127256c-aab4-4aa1-aa39-856c3cc7e2e9","Type":"ContainerStarted","Data":"3602cbb1d99a81100aa1430242950b729727ebc04e36aa7d3b26f9b429ae9791"} Mar 20 09:03:52 crc kubenswrapper[4903]: I0320 09:03:52.367474 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z84mc" event={"ID":"3127256c-aab4-4aa1-aa39-856c3cc7e2e9","Type":"ContainerStarted","Data":"ecf878400bf03756aa9f8cbf32b0823afdd8dba4700a1413fbd31ebd791985d7"} Mar 20 09:03:53 crc kubenswrapper[4903]: I0320 09:03:53.395109 4903 generic.go:334] "Generic (PLEG): container finished" podID="3127256c-aab4-4aa1-aa39-856c3cc7e2e9" containerID="ecf878400bf03756aa9f8cbf32b0823afdd8dba4700a1413fbd31ebd791985d7" exitCode=0 Mar 20 09:03:53 crc kubenswrapper[4903]: I0320 09:03:53.395175 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z84mc" event={"ID":"3127256c-aab4-4aa1-aa39-856c3cc7e2e9","Type":"ContainerDied","Data":"ecf878400bf03756aa9f8cbf32b0823afdd8dba4700a1413fbd31ebd791985d7"} Mar 20 09:03:53 crc kubenswrapper[4903]: I0320 09:03:53.395518 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z84mc" event={"ID":"3127256c-aab4-4aa1-aa39-856c3cc7e2e9","Type":"ContainerStarted","Data":"23194ebfbe19d995c1ccc7e21042adf6927336f7a89b75170141b7687442a5db"} Mar 20 09:03:53 crc 
kubenswrapper[4903]: I0320 09:03:53.425322 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-z84mc" podStartSLOduration=1.993740412 podStartE2EDuration="3.425296801s" podCreationTimestamp="2026-03-20 09:03:50 +0000 UTC" firstStartedPulling="2026-03-20 09:03:51.355543684 +0000 UTC m=+2456.572444009" lastFinishedPulling="2026-03-20 09:03:52.787100073 +0000 UTC m=+2458.004000398" observedRunningTime="2026-03-20 09:03:53.418363602 +0000 UTC m=+2458.635263917" watchObservedRunningTime="2026-03-20 09:03:53.425296801 +0000 UTC m=+2458.642197126" Mar 20 09:04:00 crc kubenswrapper[4903]: I0320 09:04:00.149312 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566624-tdb9h"] Mar 20 09:04:00 crc kubenswrapper[4903]: I0320 09:04:00.150772 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566624-tdb9h" Mar 20 09:04:00 crc kubenswrapper[4903]: I0320 09:04:00.152493 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-smc2q" Mar 20 09:04:00 crc kubenswrapper[4903]: I0320 09:04:00.153850 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 09:04:00 crc kubenswrapper[4903]: I0320 09:04:00.154396 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 09:04:00 crc kubenswrapper[4903]: I0320 09:04:00.164409 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566624-tdb9h"] Mar 20 09:04:00 crc kubenswrapper[4903]: I0320 09:04:00.231082 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k5g8h\" (UniqueName: \"kubernetes.io/projected/f3d50331-fa87-4cf2-b383-9789dabee3ab-kube-api-access-k5g8h\") pod \"auto-csr-approver-29566624-tdb9h\" (UID: \"f3d50331-fa87-4cf2-b383-9789dabee3ab\") " pod="openshift-infra/auto-csr-approver-29566624-tdb9h" Mar 20 09:04:00 crc kubenswrapper[4903]: I0320 09:04:00.332219 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k5g8h\" (UniqueName: \"kubernetes.io/projected/f3d50331-fa87-4cf2-b383-9789dabee3ab-kube-api-access-k5g8h\") pod \"auto-csr-approver-29566624-tdb9h\" (UID: \"f3d50331-fa87-4cf2-b383-9789dabee3ab\") " pod="openshift-infra/auto-csr-approver-29566624-tdb9h" Mar 20 09:04:00 crc kubenswrapper[4903]: I0320 09:04:00.354235 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k5g8h\" (UniqueName: \"kubernetes.io/projected/f3d50331-fa87-4cf2-b383-9789dabee3ab-kube-api-access-k5g8h\") pod \"auto-csr-approver-29566624-tdb9h\" (UID: \"f3d50331-fa87-4cf2-b383-9789dabee3ab\") " pod="openshift-infra/auto-csr-approver-29566624-tdb9h" Mar 20 09:04:00 crc kubenswrapper[4903]: I0320 09:04:00.470003 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566624-tdb9h" Mar 20 09:04:00 crc kubenswrapper[4903]: I0320 09:04:00.605163 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-z84mc" Mar 20 09:04:00 crc kubenswrapper[4903]: I0320 09:04:00.605551 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-z84mc" Mar 20 09:04:00 crc kubenswrapper[4903]: I0320 09:04:00.670976 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-z84mc" Mar 20 09:04:00 crc kubenswrapper[4903]: I0320 09:04:00.895481 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566624-tdb9h"] Mar 20 09:04:01 crc kubenswrapper[4903]: I0320 09:04:01.461249 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566624-tdb9h" event={"ID":"f3d50331-fa87-4cf2-b383-9789dabee3ab","Type":"ContainerStarted","Data":"4202b4fd98714e4ca71e2c88917fd565a35320cd152e791d906d6b093411e4c2"} Mar 20 09:04:01 crc kubenswrapper[4903]: I0320 09:04:01.508257 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-z84mc" Mar 20 09:04:01 crc kubenswrapper[4903]: I0320 09:04:01.570902 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-z84mc"] Mar 20 09:04:02 crc kubenswrapper[4903]: I0320 09:04:02.471236 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566624-tdb9h" event={"ID":"f3d50331-fa87-4cf2-b383-9789dabee3ab","Type":"ContainerStarted","Data":"d16110c8ecb2e92935b34a56173810487ee5586dec579ac78037ba27b3c62e1f"} Mar 20 09:04:02 crc kubenswrapper[4903]: I0320 09:04:02.485061 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29566624-tdb9h" podStartSLOduration=1.3909037340000001 podStartE2EDuration="2.485043805s" podCreationTimestamp="2026-03-20 09:04:00 +0000 UTC" firstStartedPulling="2026-03-20 09:04:00.908750313 +0000 UTC m=+2466.125650628" lastFinishedPulling="2026-03-20 09:04:02.002890384 +0000 UTC m=+2467.219790699" observedRunningTime="2026-03-20 09:04:02.482488613 +0000 UTC m=+2467.699388938" watchObservedRunningTime="2026-03-20 09:04:02.485043805 +0000 UTC m=+2467.701944120" Mar 20 09:04:03 crc kubenswrapper[4903]: I0320 09:04:03.479476 4903 generic.go:334] "Generic (PLEG): container finished" podID="f3d50331-fa87-4cf2-b383-9789dabee3ab" containerID="d16110c8ecb2e92935b34a56173810487ee5586dec579ac78037ba27b3c62e1f" exitCode=0 Mar 20 09:04:03 crc kubenswrapper[4903]: I0320 09:04:03.479615 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566624-tdb9h" event={"ID":"f3d50331-fa87-4cf2-b383-9789dabee3ab","Type":"ContainerDied","Data":"d16110c8ecb2e92935b34a56173810487ee5586dec579ac78037ba27b3c62e1f"} Mar 20 09:04:03 crc kubenswrapper[4903]: I0320 09:04:03.479670 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-z84mc" podUID="3127256c-aab4-4aa1-aa39-856c3cc7e2e9" containerName="registry-server" containerID="cri-o://23194ebfbe19d995c1ccc7e21042adf6927336f7a89b75170141b7687442a5db" gracePeriod=2 Mar 20 09:04:03 crc kubenswrapper[4903]: I0320 09:04:03.881645 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-z84mc" Mar 20 09:04:03 crc kubenswrapper[4903]: I0320 09:04:03.986556 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3127256c-aab4-4aa1-aa39-856c3cc7e2e9-utilities\") pod \"3127256c-aab4-4aa1-aa39-856c3cc7e2e9\" (UID: \"3127256c-aab4-4aa1-aa39-856c3cc7e2e9\") " Mar 20 09:04:03 crc kubenswrapper[4903]: I0320 09:04:03.986796 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3127256c-aab4-4aa1-aa39-856c3cc7e2e9-catalog-content\") pod \"3127256c-aab4-4aa1-aa39-856c3cc7e2e9\" (UID: \"3127256c-aab4-4aa1-aa39-856c3cc7e2e9\") " Mar 20 09:04:03 crc kubenswrapper[4903]: I0320 09:04:03.986858 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8sjq5\" (UniqueName: \"kubernetes.io/projected/3127256c-aab4-4aa1-aa39-856c3cc7e2e9-kube-api-access-8sjq5\") pod \"3127256c-aab4-4aa1-aa39-856c3cc7e2e9\" (UID: \"3127256c-aab4-4aa1-aa39-856c3cc7e2e9\") " Mar 20 09:04:03 crc kubenswrapper[4903]: I0320 09:04:03.989829 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3127256c-aab4-4aa1-aa39-856c3cc7e2e9-utilities" (OuterVolumeSpecName: "utilities") pod "3127256c-aab4-4aa1-aa39-856c3cc7e2e9" (UID: "3127256c-aab4-4aa1-aa39-856c3cc7e2e9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 09:04:03 crc kubenswrapper[4903]: I0320 09:04:03.995053 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3127256c-aab4-4aa1-aa39-856c3cc7e2e9-kube-api-access-8sjq5" (OuterVolumeSpecName: "kube-api-access-8sjq5") pod "3127256c-aab4-4aa1-aa39-856c3cc7e2e9" (UID: "3127256c-aab4-4aa1-aa39-856c3cc7e2e9"). InnerVolumeSpecName "kube-api-access-8sjq5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:04:04 crc kubenswrapper[4903]: I0320 09:04:04.018693 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3127256c-aab4-4aa1-aa39-856c3cc7e2e9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3127256c-aab4-4aa1-aa39-856c3cc7e2e9" (UID: "3127256c-aab4-4aa1-aa39-856c3cc7e2e9"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 09:04:04 crc kubenswrapper[4903]: I0320 09:04:04.088462 4903 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3127256c-aab4-4aa1-aa39-856c3cc7e2e9-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 09:04:04 crc kubenswrapper[4903]: I0320 09:04:04.088498 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8sjq5\" (UniqueName: \"kubernetes.io/projected/3127256c-aab4-4aa1-aa39-856c3cc7e2e9-kube-api-access-8sjq5\") on node \"crc\" DevicePath \"\"" Mar 20 09:04:04 crc kubenswrapper[4903]: I0320 09:04:04.088508 4903 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3127256c-aab4-4aa1-aa39-856c3cc7e2e9-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 09:04:04 crc kubenswrapper[4903]: I0320 09:04:04.492936 4903 generic.go:334] "Generic (PLEG): container finished" podID="3127256c-aab4-4aa1-aa39-856c3cc7e2e9" containerID="23194ebfbe19d995c1ccc7e21042adf6927336f7a89b75170141b7687442a5db" exitCode=0 Mar 20 09:04:04 crc kubenswrapper[4903]: I0320 09:04:04.493855 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-z84mc" Mar 20 09:04:04 crc kubenswrapper[4903]: I0320 09:04:04.494181 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z84mc" event={"ID":"3127256c-aab4-4aa1-aa39-856c3cc7e2e9","Type":"ContainerDied","Data":"23194ebfbe19d995c1ccc7e21042adf6927336f7a89b75170141b7687442a5db"} Mar 20 09:04:04 crc kubenswrapper[4903]: I0320 09:04:04.494222 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z84mc" event={"ID":"3127256c-aab4-4aa1-aa39-856c3cc7e2e9","Type":"ContainerDied","Data":"3602cbb1d99a81100aa1430242950b729727ebc04e36aa7d3b26f9b429ae9791"} Mar 20 09:04:04 crc kubenswrapper[4903]: I0320 09:04:04.494239 4903 scope.go:117] "RemoveContainer" containerID="23194ebfbe19d995c1ccc7e21042adf6927336f7a89b75170141b7687442a5db" Mar 20 09:04:04 crc kubenswrapper[4903]: I0320 09:04:04.526948 4903 scope.go:117] "RemoveContainer" containerID="ecf878400bf03756aa9f8cbf32b0823afdd8dba4700a1413fbd31ebd791985d7" Mar 20 09:04:04 crc kubenswrapper[4903]: I0320 09:04:04.547666 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-z84mc"] Mar 20 09:04:04 crc kubenswrapper[4903]: I0320 09:04:04.555436 4903 scope.go:117] "RemoveContainer" containerID="b656f204ba3a626bce376db8e95e1aa19ba61e3cd9d3c4dde37c14ce74533c2e" Mar 20 09:04:04 crc kubenswrapper[4903]: I0320 09:04:04.557262 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-z84mc"] Mar 20 09:04:04 crc kubenswrapper[4903]: I0320 09:04:04.610980 4903 scope.go:117] "RemoveContainer" containerID="23194ebfbe19d995c1ccc7e21042adf6927336f7a89b75170141b7687442a5db" Mar 20 09:04:04 crc kubenswrapper[4903]: E0320 09:04:04.611680 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"23194ebfbe19d995c1ccc7e21042adf6927336f7a89b75170141b7687442a5db\": container with ID starting with 23194ebfbe19d995c1ccc7e21042adf6927336f7a89b75170141b7687442a5db not found: ID does not exist" containerID="23194ebfbe19d995c1ccc7e21042adf6927336f7a89b75170141b7687442a5db" Mar 20 09:04:04 crc kubenswrapper[4903]: I0320 09:04:04.611704 4903 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"23194ebfbe19d995c1ccc7e21042adf6927336f7a89b75170141b7687442a5db"} err="failed to get container status \"23194ebfbe19d995c1ccc7e21042adf6927336f7a89b75170141b7687442a5db\": rpc error: code = NotFound desc = could not find container \"23194ebfbe19d995c1ccc7e21042adf6927336f7a89b75170141b7687442a5db\": container with ID starting with 23194ebfbe19d995c1ccc7e21042adf6927336f7a89b75170141b7687442a5db not found: ID does not exist" Mar 20 09:04:04 crc kubenswrapper[4903]: I0320 09:04:04.611724 4903 scope.go:117] "RemoveContainer" containerID="ecf878400bf03756aa9f8cbf32b0823afdd8dba4700a1413fbd31ebd791985d7" Mar 20 09:04:04 crc kubenswrapper[4903]: E0320 09:04:04.612472 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ecf878400bf03756aa9f8cbf32b0823afdd8dba4700a1413fbd31ebd791985d7\": container with ID starting with ecf878400bf03756aa9f8cbf32b0823afdd8dba4700a1413fbd31ebd791985d7 not found: ID does not exist" containerID="ecf878400bf03756aa9f8cbf32b0823afdd8dba4700a1413fbd31ebd791985d7" Mar 20 09:04:04 crc kubenswrapper[4903]: I0320 09:04:04.612524 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ecf878400bf03756aa9f8cbf32b0823afdd8dba4700a1413fbd31ebd791985d7"} err="failed to get container status \"ecf878400bf03756aa9f8cbf32b0823afdd8dba4700a1413fbd31ebd791985d7\": rpc error: code = NotFound desc = could not find container \"ecf878400bf03756aa9f8cbf32b0823afdd8dba4700a1413fbd31ebd791985d7\": container with ID starting with ecf878400bf03756aa9f8cbf32b0823afdd8dba4700a1413fbd31ebd791985d7 not found: ID does not exist" Mar 20 09:04:04 crc kubenswrapper[4903]: I0320 09:04:04.612560 4903 scope.go:117] "RemoveContainer" containerID="b656f204ba3a626bce376db8e95e1aa19ba61e3cd9d3c4dde37c14ce74533c2e" Mar 20 09:04:04 crc kubenswrapper[4903]: E0320 09:04:04.613926 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b656f204ba3a626bce376db8e95e1aa19ba61e3cd9d3c4dde37c14ce74533c2e\": container with ID starting with b656f204ba3a626bce376db8e95e1aa19ba61e3cd9d3c4dde37c14ce74533c2e not found: ID does not exist" containerID="b656f204ba3a626bce376db8e95e1aa19ba61e3cd9d3c4dde37c14ce74533c2e" Mar 20 09:04:04 crc kubenswrapper[4903]: I0320 09:04:04.613967 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b656f204ba3a626bce376db8e95e1aa19ba61e3cd9d3c4dde37c14ce74533c2e"} err="failed to get container status \"b656f204ba3a626bce376db8e95e1aa19ba61e3cd9d3c4dde37c14ce74533c2e\": rpc error: code = NotFound desc = could not find container \"b656f204ba3a626bce376db8e95e1aa19ba61e3cd9d3c4dde37c14ce74533c2e\": container with ID starting with b656f204ba3a626bce376db8e95e1aa19ba61e3cd9d3c4dde37c14ce74533c2e not found: ID does not exist" Mar 20 09:04:04 crc kubenswrapper[4903]: I0320 09:04:04.805586 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566624-tdb9h" Mar 20 09:04:04 crc kubenswrapper[4903]: I0320 09:04:04.918346 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k5g8h\" (UniqueName: \"kubernetes.io/projected/f3d50331-fa87-4cf2-b383-9789dabee3ab-kube-api-access-k5g8h\") pod \"f3d50331-fa87-4cf2-b383-9789dabee3ab\" (UID: \"f3d50331-fa87-4cf2-b383-9789dabee3ab\") " Mar 20 09:04:04 crc kubenswrapper[4903]: I0320 09:04:04.923678 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f3d50331-fa87-4cf2-b383-9789dabee3ab-kube-api-access-k5g8h" (OuterVolumeSpecName: "kube-api-access-k5g8h") pod "f3d50331-fa87-4cf2-b383-9789dabee3ab" (UID: "f3d50331-fa87-4cf2-b383-9789dabee3ab"). InnerVolumeSpecName "kube-api-access-k5g8h". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:04:05 crc kubenswrapper[4903]: I0320 09:04:05.019672 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k5g8h\" (UniqueName: \"kubernetes.io/projected/f3d50331-fa87-4cf2-b383-9789dabee3ab-kube-api-access-k5g8h\") on node \"crc\" DevicePath \"\"" Mar 20 09:04:05 crc kubenswrapper[4903]: I0320 09:04:05.507096 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3127256c-aab4-4aa1-aa39-856c3cc7e2e9" path="/var/lib/kubelet/pods/3127256c-aab4-4aa1-aa39-856c3cc7e2e9/volumes" Mar 20 09:04:05 crc kubenswrapper[4903]: I0320 09:04:05.507189 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566624-tdb9h" Mar 20 09:04:05 crc kubenswrapper[4903]: I0320 09:04:05.510599 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566624-tdb9h" event={"ID":"f3d50331-fa87-4cf2-b383-9789dabee3ab","Type":"ContainerDied","Data":"4202b4fd98714e4ca71e2c88917fd565a35320cd152e791d906d6b093411e4c2"} Mar 20 09:04:05 crc kubenswrapper[4903]: I0320 09:04:05.510810 4903 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4202b4fd98714e4ca71e2c88917fd565a35320cd152e791d906d6b093411e4c2" Mar 20 09:04:05 crc kubenswrapper[4903]: I0320 09:04:05.578788 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566618-n2zwf"] Mar 20 09:04:05 crc kubenswrapper[4903]: I0320 09:04:05.584272 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566618-n2zwf"] Mar 20 09:04:07 crc kubenswrapper[4903]: I0320 09:04:07.501021 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="11f1777a-fc60-4e30-8eca-fff1aa4776b0" path="/var/lib/kubelet/pods/11f1777a-fc60-4e30-8eca-fff1aa4776b0/volumes" Mar 20 09:04:20 crc kubenswrapper[4903]: I0320 09:04:20.833470 4903 patch_prober.go:28] interesting pod/machine-config-daemon-2ndsj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 09:04:20 crc kubenswrapper[4903]: I0320 09:04:20.834419 4903 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2ndsj" podUID="0e67af70-4211-4077-8f2b-0a00b8069e5a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 09:04:50 crc 
kubenswrapper[4903]: I0320 09:04:50.834374 4903 patch_prober.go:28] interesting pod/machine-config-daemon-2ndsj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 09:04:50 crc kubenswrapper[4903]: I0320 09:04:50.834903 4903 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2ndsj" podUID="0e67af70-4211-4077-8f2b-0a00b8069e5a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 09:04:50 crc kubenswrapper[4903]: I0320 09:04:50.834956 4903 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2ndsj" Mar 20 09:04:50 crc kubenswrapper[4903]: I0320 09:04:50.870952 4903 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"05d843b29e30c16c9c72dc2383538466652c95246ca0c3fc36a2ea4621a2190e"} pod="openshift-machine-config-operator/machine-config-daemon-2ndsj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 09:04:50 crc kubenswrapper[4903]: I0320 09:04:50.871112 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2ndsj" podUID="0e67af70-4211-4077-8f2b-0a00b8069e5a" containerName="machine-config-daemon" containerID="cri-o://05d843b29e30c16c9c72dc2383538466652c95246ca0c3fc36a2ea4621a2190e" gracePeriod=600 Mar 20 09:04:51 crc kubenswrapper[4903]: E0320 09:04:51.000821 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2ndsj_openshift-machine-config-operator(0e67af70-4211-4077-8f2b-0a00b8069e5a)\"" pod="openshift-machine-config-operator/machine-config-daemon-2ndsj" podUID="0e67af70-4211-4077-8f2b-0a00b8069e5a" Mar 20 09:04:51 crc kubenswrapper[4903]: I0320 09:04:51.878703 4903 generic.go:334] "Generic (PLEG): container finished" podID="0e67af70-4211-4077-8f2b-0a00b8069e5a" containerID="05d843b29e30c16c9c72dc2383538466652c95246ca0c3fc36a2ea4621a2190e" exitCode=0 Mar 20 09:04:51 crc kubenswrapper[4903]: I0320 09:04:51.878744 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2ndsj" event={"ID":"0e67af70-4211-4077-8f2b-0a00b8069e5a","Type":"ContainerDied","Data":"05d843b29e30c16c9c72dc2383538466652c95246ca0c3fc36a2ea4621a2190e"} Mar 20 09:04:51 crc kubenswrapper[4903]: I0320 09:04:51.878823 4903 scope.go:117] "RemoveContainer" containerID="f6e6d4b852a87fa5948de40052e25925d5cd0b146939ba40c279384bb383c9f7" Mar 20 09:04:51 crc kubenswrapper[4903]: I0320 09:04:51.879675 4903 scope.go:117] "RemoveContainer" containerID="05d843b29e30c16c9c72dc2383538466652c95246ca0c3fc36a2ea4621a2190e" Mar 20 09:04:51 crc kubenswrapper[4903]: E0320 09:04:51.880231 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-2ndsj_openshift-machine-config-operator(0e67af70-4211-4077-8f2b-0a00b8069e5a)\"" pod="openshift-machine-config-operator/machine-config-daemon-2ndsj" podUID="0e67af70-4211-4077-8f2b-0a00b8069e5a" Mar 20 09:05:02 crc kubenswrapper[4903]: I0320 09:05:02.369575 4903 scope.go:117] "RemoveContainer" containerID="a95ac29f20207be00f5e9608ad1855f7e8d25dc8c89bbb819dd71adc02e2596b" Mar 20 09:05:03 crc kubenswrapper[4903]: I0320 09:05:03.491292 4903 scope.go:117] "RemoveContainer" containerID="05d843b29e30c16c9c72dc2383538466652c95246ca0c3fc36a2ea4621a2190e" Mar 20 09:05:03 crc kubenswrapper[4903]: E0320 09:05:03.491772 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2ndsj_openshift-machine-config-operator(0e67af70-4211-4077-8f2b-0a00b8069e5a)\"" pod="openshift-machine-config-operator/machine-config-daemon-2ndsj" podUID="0e67af70-4211-4077-8f2b-0a00b8069e5a" Mar 20 09:05:15 crc kubenswrapper[4903]: I0320 09:05:15.101154 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-m9mt9"] Mar 20 09:05:15 crc kubenswrapper[4903]: E0320 09:05:15.102413 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3127256c-aab4-4aa1-aa39-856c3cc7e2e9" containerName="extract-content" Mar 20 09:05:15 crc kubenswrapper[4903]: I0320 09:05:15.102442 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="3127256c-aab4-4aa1-aa39-856c3cc7e2e9" containerName="extract-content" Mar 20 09:05:15 crc kubenswrapper[4903]: E0320 09:05:15.102463 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3127256c-aab4-4aa1-aa39-856c3cc7e2e9" containerName="registry-server" Mar 20 09:05:15 crc kubenswrapper[4903]: I0320 09:05:15.102477 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="3127256c-aab4-4aa1-aa39-856c3cc7e2e9" containerName="registry-server" Mar 20 09:05:15 crc kubenswrapper[4903]: E0320 09:05:15.102520 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3127256c-aab4-4aa1-aa39-856c3cc7e2e9" containerName="extract-utilities" Mar 20 09:05:15 crc kubenswrapper[4903]: I0320 09:05:15.102542 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="3127256c-aab4-4aa1-aa39-856c3cc7e2e9" containerName="extract-utilities" Mar 20 09:05:15 crc kubenswrapper[4903]: E0320 09:05:15.102580 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3d50331-fa87-4cf2-b383-9789dabee3ab" containerName="oc" Mar 20 09:05:15 crc kubenswrapper[4903]: I0320 09:05:15.102597 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3d50331-fa87-4cf2-b383-9789dabee3ab" containerName="oc" Mar 20 09:05:15 crc kubenswrapper[4903]: I0320 09:05:15.102894 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="3127256c-aab4-4aa1-aa39-856c3cc7e2e9" containerName="registry-server" Mar 20 09:05:15 crc kubenswrapper[4903]: I0320 09:05:15.102929 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3d50331-fa87-4cf2-b383-9789dabee3ab" containerName="oc" Mar 20 09:05:15 crc kubenswrapper[4903]: I0320 09:05:15.105104 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-m9mt9" Mar 20 09:05:15 crc kubenswrapper[4903]: I0320 09:05:15.108962 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9b7ks\" (UniqueName: \"kubernetes.io/projected/bf22bc02-075a-400b-993e-1ef6b23f787b-kube-api-access-9b7ks\") pod \"redhat-operators-m9mt9\" (UID: \"bf22bc02-075a-400b-993e-1ef6b23f787b\") " pod="openshift-marketplace/redhat-operators-m9mt9" Mar 20 09:05:15 crc kubenswrapper[4903]: I0320 09:05:15.109151 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf22bc02-075a-400b-993e-1ef6b23f787b-catalog-content\") pod \"redhat-operators-m9mt9\" (UID: \"bf22bc02-075a-400b-993e-1ef6b23f787b\") " pod="openshift-marketplace/redhat-operators-m9mt9" Mar 20 09:05:15 crc kubenswrapper[4903]: I0320 09:05:15.109529 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf22bc02-075a-400b-993e-1ef6b23f787b-utilities\") pod \"redhat-operators-m9mt9\" (UID: \"bf22bc02-075a-400b-993e-1ef6b23f787b\") " pod="openshift-marketplace/redhat-operators-m9mt9" Mar 20 09:05:15 crc kubenswrapper[4903]: I0320 09:05:15.121777 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-m9mt9"] Mar 20 09:05:15 crc kubenswrapper[4903]: I0320 09:05:15.211228 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9b7ks\" (UniqueName: \"kubernetes.io/projected/bf22bc02-075a-400b-993e-1ef6b23f787b-kube-api-access-9b7ks\") pod \"redhat-operators-m9mt9\" (UID: \"bf22bc02-075a-400b-993e-1ef6b23f787b\") " pod="openshift-marketplace/redhat-operators-m9mt9" Mar 20 09:05:15 crc kubenswrapper[4903]: I0320 09:05:15.211295 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf22bc02-075a-400b-993e-1ef6b23f787b-catalog-content\") pod \"redhat-operators-m9mt9\" (UID: \"bf22bc02-075a-400b-993e-1ef6b23f787b\") " pod="openshift-marketplace/redhat-operators-m9mt9" Mar 20 09:05:15 crc kubenswrapper[4903]: I0320 09:05:15.211352 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf22bc02-075a-400b-993e-1ef6b23f787b-utilities\") pod \"redhat-operators-m9mt9\" (UID: \"bf22bc02-075a-400b-993e-1ef6b23f787b\") " pod="openshift-marketplace/redhat-operators-m9mt9" Mar 20 09:05:15 crc kubenswrapper[4903]: I0320 09:05:15.211873 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf22bc02-075a-400b-993e-1ef6b23f787b-utilities\") pod \"redhat-operators-m9mt9\" (UID: \"bf22bc02-075a-400b-993e-1ef6b23f787b\") " pod="openshift-marketplace/redhat-operators-m9mt9" Mar 20 09:05:15 crc kubenswrapper[4903]: I0320 09:05:15.211912 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf22bc02-075a-400b-993e-1ef6b23f787b-catalog-content\") pod \"redhat-operators-m9mt9\" (UID: \"bf22bc02-075a-400b-993e-1ef6b23f787b\") " pod="openshift-marketplace/redhat-operators-m9mt9" Mar 20 09:05:15 crc kubenswrapper[4903]: I0320 09:05:15.230663 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-9b7ks\" (UniqueName: \"kubernetes.io/projected/bf22bc02-075a-400b-993e-1ef6b23f787b-kube-api-access-9b7ks\") pod \"redhat-operators-m9mt9\" (UID: \"bf22bc02-075a-400b-993e-1ef6b23f787b\") " pod="openshift-marketplace/redhat-operators-m9mt9" Mar 20 09:05:15 crc kubenswrapper[4903]: I0320 09:05:15.444922 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-m9mt9" Mar 20 09:05:15 crc kubenswrapper[4903]: I0320 09:05:15.922875 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-m9mt9"] Mar 20 09:05:16 crc kubenswrapper[4903]: I0320 09:05:16.065896 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m9mt9" event={"ID":"bf22bc02-075a-400b-993e-1ef6b23f787b","Type":"ContainerStarted","Data":"0ec691fc09c9f679917e658e4fc78900e6c779373cfee05cf2d953d1bb72c70e"} Mar 20 09:05:17 crc kubenswrapper[4903]: I0320 09:05:17.097446 4903 generic.go:334] "Generic (PLEG): container finished" podID="bf22bc02-075a-400b-993e-1ef6b23f787b" containerID="5fe768fe988b9f986763567e90aa7dcb115587a65a8110df6e6e2604388544c7" exitCode=0 Mar 20 09:05:17 crc kubenswrapper[4903]: I0320 09:05:17.097600 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m9mt9" event={"ID":"bf22bc02-075a-400b-993e-1ef6b23f787b","Type":"ContainerDied","Data":"5fe768fe988b9f986763567e90aa7dcb115587a65a8110df6e6e2604388544c7"} Mar 20 09:05:17 crc kubenswrapper[4903]: I0320 09:05:17.099436 4903 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 09:05:18 crc kubenswrapper[4903]: I0320 09:05:18.109009 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m9mt9" event={"ID":"bf22bc02-075a-400b-993e-1ef6b23f787b","Type":"ContainerStarted","Data":"109df923d7c79383fb3700a93a3f0ae51c00ba706736128e3d802a92bb034b1a"} Mar 20 09:05:18 crc kubenswrapper[4903]: I0320 09:05:18.490939 4903 scope.go:117] "RemoveContainer" containerID="05d843b29e30c16c9c72dc2383538466652c95246ca0c3fc36a2ea4621a2190e" Mar 20 09:05:18 crc kubenswrapper[4903]: E0320 09:05:18.491368 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2ndsj_openshift-machine-config-operator(0e67af70-4211-4077-8f2b-0a00b8069e5a)\"" pod="openshift-machine-config-operator/machine-config-daemon-2ndsj" podUID="0e67af70-4211-4077-8f2b-0a00b8069e5a" Mar 20 09:05:19 crc kubenswrapper[4903]: I0320 09:05:19.118074 4903 generic.go:334] "Generic (PLEG): container finished" podID="bf22bc02-075a-400b-993e-1ef6b23f787b" containerID="109df923d7c79383fb3700a93a3f0ae51c00ba706736128e3d802a92bb034b1a" exitCode=0 Mar 20 09:05:19 crc kubenswrapper[4903]: I0320 09:05:19.118127 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m9mt9" event={"ID":"bf22bc02-075a-400b-993e-1ef6b23f787b","Type":"ContainerDied","Data":"109df923d7c79383fb3700a93a3f0ae51c00ba706736128e3d802a92bb034b1a"} Mar 20 09:05:20 crc kubenswrapper[4903]: I0320 09:05:20.126513 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m9mt9" 
event={"ID":"bf22bc02-075a-400b-993e-1ef6b23f787b","Type":"ContainerStarted","Data":"fbd215dd64b716cc81718e55a10b8ce62fc4dc6c461081c9b69739b443854391"} Mar 20 09:05:20 crc kubenswrapper[4903]: I0320 09:05:20.149909 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-m9mt9" podStartSLOduration=2.737534787 podStartE2EDuration="5.149892356s" podCreationTimestamp="2026-03-20 09:05:15 +0000 UTC" firstStartedPulling="2026-03-20 09:05:17.099164316 +0000 UTC m=+2542.316064631" lastFinishedPulling="2026-03-20 09:05:19.511521885 +0000 UTC m=+2544.728422200" observedRunningTime="2026-03-20 09:05:20.143747938 +0000 UTC m=+2545.360648293" watchObservedRunningTime="2026-03-20 09:05:20.149892356 +0000 UTC m=+2545.366792671" Mar 20 09:05:25 crc kubenswrapper[4903]: I0320 09:05:25.445284 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-m9mt9" Mar 20 09:05:25 crc kubenswrapper[4903]: I0320 09:05:25.445792 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-m9mt9" Mar 20 09:05:26 crc kubenswrapper[4903]: I0320 09:05:26.507154 4903 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-m9mt9" podUID="bf22bc02-075a-400b-993e-1ef6b23f787b" containerName="registry-server" probeResult="failure" output=< Mar 20 09:05:26 crc kubenswrapper[4903]: timeout: failed to connect service ":50051" within 1s Mar 20 09:05:26 crc kubenswrapper[4903]: > Mar 20 09:05:30 crc kubenswrapper[4903]: I0320 09:05:30.491415 4903 scope.go:117] "RemoveContainer" containerID="05d843b29e30c16c9c72dc2383538466652c95246ca0c3fc36a2ea4621a2190e" Mar 20 09:05:30 crc kubenswrapper[4903]: E0320 09:05:30.492068 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2ndsj_openshift-machine-config-operator(0e67af70-4211-4077-8f2b-0a00b8069e5a)\"" pod="openshift-machine-config-operator/machine-config-daemon-2ndsj" podUID="0e67af70-4211-4077-8f2b-0a00b8069e5a" Mar 20 09:05:35 crc kubenswrapper[4903]: I0320 09:05:35.508869 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-m9mt9" Mar 20 09:05:35 crc kubenswrapper[4903]: I0320 09:05:35.563494 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-m9mt9" Mar 20 09:05:35 crc kubenswrapper[4903]: I0320 09:05:35.758622 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-m9mt9"] Mar 20 09:05:37 crc kubenswrapper[4903]: I0320 09:05:37.257252 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-m9mt9" podUID="bf22bc02-075a-400b-993e-1ef6b23f787b" containerName="registry-server" containerID="cri-o://fbd215dd64b716cc81718e55a10b8ce62fc4dc6c461081c9b69739b443854391" gracePeriod=2 Mar 20 09:05:37 crc kubenswrapper[4903]: I0320 09:05:37.697055 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-m9mt9" Mar 20 09:05:37 crc kubenswrapper[4903]: I0320 09:05:37.854316 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf22bc02-075a-400b-993e-1ef6b23f787b-utilities\") pod \"bf22bc02-075a-400b-993e-1ef6b23f787b\" (UID: \"bf22bc02-075a-400b-993e-1ef6b23f787b\") " Mar 20 09:05:37 crc kubenswrapper[4903]: I0320 09:05:37.854395 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf22bc02-075a-400b-993e-1ef6b23f787b-catalog-content\") pod \"bf22bc02-075a-400b-993e-1ef6b23f787b\" (UID: \"bf22bc02-075a-400b-993e-1ef6b23f787b\") " Mar 20 09:05:37 crc kubenswrapper[4903]: I0320 09:05:37.854578 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9b7ks\" (UniqueName: \"kubernetes.io/projected/bf22bc02-075a-400b-993e-1ef6b23f787b-kube-api-access-9b7ks\") pod \"bf22bc02-075a-400b-993e-1ef6b23f787b\" (UID: \"bf22bc02-075a-400b-993e-1ef6b23f787b\") " Mar 20 09:05:37 crc kubenswrapper[4903]: I0320 09:05:37.857677 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bf22bc02-075a-400b-993e-1ef6b23f787b-utilities" (OuterVolumeSpecName: "utilities") pod "bf22bc02-075a-400b-993e-1ef6b23f787b" (UID: "bf22bc02-075a-400b-993e-1ef6b23f787b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 09:05:37 crc kubenswrapper[4903]: I0320 09:05:37.860545 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf22bc02-075a-400b-993e-1ef6b23f787b-kube-api-access-9b7ks" (OuterVolumeSpecName: "kube-api-access-9b7ks") pod "bf22bc02-075a-400b-993e-1ef6b23f787b" (UID: "bf22bc02-075a-400b-993e-1ef6b23f787b"). InnerVolumeSpecName "kube-api-access-9b7ks". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:05:37 crc kubenswrapper[4903]: I0320 09:05:37.955891 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9b7ks\" (UniqueName: \"kubernetes.io/projected/bf22bc02-075a-400b-993e-1ef6b23f787b-kube-api-access-9b7ks\") on node \"crc\" DevicePath \"\"" Mar 20 09:05:37 crc kubenswrapper[4903]: I0320 09:05:37.955920 4903 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf22bc02-075a-400b-993e-1ef6b23f787b-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 09:05:38 crc kubenswrapper[4903]: I0320 09:05:38.039023 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bf22bc02-075a-400b-993e-1ef6b23f787b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bf22bc02-075a-400b-993e-1ef6b23f787b" (UID: "bf22bc02-075a-400b-993e-1ef6b23f787b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 09:05:38 crc kubenswrapper[4903]: I0320 09:05:38.056924 4903 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf22bc02-075a-400b-993e-1ef6b23f787b-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 09:05:38 crc kubenswrapper[4903]: I0320 09:05:38.266892 4903 generic.go:334] "Generic (PLEG): container finished" podID="bf22bc02-075a-400b-993e-1ef6b23f787b" containerID="fbd215dd64b716cc81718e55a10b8ce62fc4dc6c461081c9b69739b443854391" exitCode=0 Mar 20 09:05:38 crc kubenswrapper[4903]: I0320 09:05:38.266951 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m9mt9" event={"ID":"bf22bc02-075a-400b-993e-1ef6b23f787b","Type":"ContainerDied","Data":"fbd215dd64b716cc81718e55a10b8ce62fc4dc6c461081c9b69739b443854391"} Mar 20 09:05:38 crc kubenswrapper[4903]: I0320 09:05:38.266978 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-m9mt9" Mar 20 09:05:38 crc kubenswrapper[4903]: I0320 09:05:38.267001 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m9mt9" event={"ID":"bf22bc02-075a-400b-993e-1ef6b23f787b","Type":"ContainerDied","Data":"0ec691fc09c9f679917e658e4fc78900e6c779373cfee05cf2d953d1bb72c70e"} Mar 20 09:05:38 crc kubenswrapper[4903]: I0320 09:05:38.267026 4903 scope.go:117] "RemoveContainer" containerID="fbd215dd64b716cc81718e55a10b8ce62fc4dc6c461081c9b69739b443854391" Mar 20 09:05:38 crc kubenswrapper[4903]: I0320 09:05:38.297451 4903 scope.go:117] "RemoveContainer" containerID="109df923d7c79383fb3700a93a3f0ae51c00ba706736128e3d802a92bb034b1a" Mar 20 09:05:38 crc kubenswrapper[4903]: I0320 09:05:38.313067 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-m9mt9"] Mar 20 09:05:38 crc kubenswrapper[4903]: I0320 09:05:38.321510 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-m9mt9"] Mar 20 09:05:38 crc kubenswrapper[4903]: I0320 09:05:38.335982 4903 scope.go:117] "RemoveContainer" containerID="5fe768fe988b9f986763567e90aa7dcb115587a65a8110df6e6e2604388544c7" Mar 20 09:05:38 crc kubenswrapper[4903]: I0320 09:05:38.358232 4903 scope.go:117] "RemoveContainer" containerID="fbd215dd64b716cc81718e55a10b8ce62fc4dc6c461081c9b69739b443854391" Mar 20 09:05:38 crc kubenswrapper[4903]: E0320 09:05:38.358657 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fbd215dd64b716cc81718e55a10b8ce62fc4dc6c461081c9b69739b443854391\": container with ID starting with fbd215dd64b716cc81718e55a10b8ce62fc4dc6c461081c9b69739b443854391 not found: ID does not exist" containerID="fbd215dd64b716cc81718e55a10b8ce62fc4dc6c461081c9b69739b443854391" Mar 20 09:05:38 crc kubenswrapper[4903]: I0320 09:05:38.358705 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fbd215dd64b716cc81718e55a10b8ce62fc4dc6c461081c9b69739b443854391"} err="failed to get container status \"fbd215dd64b716cc81718e55a10b8ce62fc4dc6c461081c9b69739b443854391\": rpc error: code = NotFound desc = could not find container \"fbd215dd64b716cc81718e55a10b8ce62fc4dc6c461081c9b69739b443854391\": container with ID starting with fbd215dd64b716cc81718e55a10b8ce62fc4dc6c461081c9b69739b443854391 not found: ID does not exist" Mar 20 09:05:38 crc 
kubenswrapper[4903]: I0320 09:05:38.358732 4903 scope.go:117] "RemoveContainer" containerID="109df923d7c79383fb3700a93a3f0ae51c00ba706736128e3d802a92bb034b1a" Mar 20 09:05:38 crc kubenswrapper[4903]: E0320 09:05:38.359274 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"109df923d7c79383fb3700a93a3f0ae51c00ba706736128e3d802a92bb034b1a\": container with ID starting with 109df923d7c79383fb3700a93a3f0ae51c00ba706736128e3d802a92bb034b1a not found: ID does not exist" containerID="109df923d7c79383fb3700a93a3f0ae51c00ba706736128e3d802a92bb034b1a" Mar 20 09:05:38 crc kubenswrapper[4903]: I0320 09:05:38.359296 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"109df923d7c79383fb3700a93a3f0ae51c00ba706736128e3d802a92bb034b1a"} err="failed to get container status \"109df923d7c79383fb3700a93a3f0ae51c00ba706736128e3d802a92bb034b1a\": rpc error: code = NotFound desc = could not find container \"109df923d7c79383fb3700a93a3f0ae51c00ba706736128e3d802a92bb034b1a\": container with ID starting with 109df923d7c79383fb3700a93a3f0ae51c00ba706736128e3d802a92bb034b1a not found: ID does not exist" Mar 20 09:05:38 crc kubenswrapper[4903]: I0320 09:05:38.359315 4903 scope.go:117] "RemoveContainer" containerID="5fe768fe988b9f986763567e90aa7dcb115587a65a8110df6e6e2604388544c7" Mar 20 09:05:38 crc kubenswrapper[4903]: E0320 09:05:38.359838 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5fe768fe988b9f986763567e90aa7dcb115587a65a8110df6e6e2604388544c7\": container with ID starting with 5fe768fe988b9f986763567e90aa7dcb115587a65a8110df6e6e2604388544c7 not found: ID does not exist" containerID="5fe768fe988b9f986763567e90aa7dcb115587a65a8110df6e6e2604388544c7" Mar 20 09:05:38 crc kubenswrapper[4903]: I0320 09:05:38.359870 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5fe768fe988b9f986763567e90aa7dcb115587a65a8110df6e6e2604388544c7"} err="failed to get container status \"5fe768fe988b9f986763567e90aa7dcb115587a65a8110df6e6e2604388544c7\": rpc error: code = NotFound desc = could not find container \"5fe768fe988b9f986763567e90aa7dcb115587a65a8110df6e6e2604388544c7\": container with ID starting with 5fe768fe988b9f986763567e90aa7dcb115587a65a8110df6e6e2604388544c7 not found: ID does not exist" Mar 20 09:05:39 crc kubenswrapper[4903]: I0320 09:05:39.503753 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf22bc02-075a-400b-993e-1ef6b23f787b" path="/var/lib/kubelet/pods/bf22bc02-075a-400b-993e-1ef6b23f787b/volumes" Mar 20 09:05:45 crc kubenswrapper[4903]: I0320 09:05:45.499399 4903 scope.go:117] "RemoveContainer" containerID="05d843b29e30c16c9c72dc2383538466652c95246ca0c3fc36a2ea4621a2190e" Mar 20 09:05:45 crc kubenswrapper[4903]: E0320 09:05:45.500661 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2ndsj_openshift-machine-config-operator(0e67af70-4211-4077-8f2b-0a00b8069e5a)\"" pod="openshift-machine-config-operator/machine-config-daemon-2ndsj" podUID="0e67af70-4211-4077-8f2b-0a00b8069e5a" Mar 20 09:05:58 crc kubenswrapper[4903]: I0320 09:05:58.491202 4903 scope.go:117] "RemoveContainer" containerID="05d843b29e30c16c9c72dc2383538466652c95246ca0c3fc36a2ea4621a2190e" 
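
The repeated "Error syncing pod, skipping ... CrashLoopBackOff: back-off 5m0s restarting failed container=machine-config-daemon" entries above show the kubelet declining to restart the machine-config-daemon container until its restart back-off expires; by this point the delay has already grown to its 5-minute cap. The snippet below is a minimal illustrative sketch (plain Python, not the kubelet's own code) of how such an exponential back-off reaches that cap, assuming the commonly documented kubelet defaults of a 10-second initial delay that doubles after each failed restart and is capped at 5 minutes; the constant values and the helper name are illustrative assumptions, not values read from this log.

    # Illustrative sketch of a CrashLoopBackOff-style restart delay schedule.
    # Assumed defaults (not taken from this log): 10s initial delay, doubling
    # per failed restart, capped at 300s ("back-off 5m0s" in the messages above).
    INITIAL_BACKOFF_S = 10
    BACKOFF_FACTOR = 2
    MAX_BACKOFF_S = 300

    def crashloop_backoff_schedule(restarts: int) -> list[int]:
        """Return the delay, in seconds, applied before each of the first `restarts` restarts."""
        delays = []
        delay = INITIAL_BACKOFF_S
        for _ in range(restarts):
            delays.append(min(delay, MAX_BACKOFF_S))
            delay *= BACKOFF_FACTOR
        return delays

    if __name__ == "__main__":
        # By about the sixth failed restart the delay reaches the 5-minute cap,
        # after which every sync attempt logs the same "back-off 5m0s" error.
        print(crashloop_backoff_schedule(8))  # [10, 20, 40, 80, 160, 300, 300, 300]

Under these assumed defaults, the periodic "RemoveContainer"/"Error syncing pod" pairs in the surrounding entries are sync-loop retries hitting that capped back-off gate, not the back-off interval itself.
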
Mar 20 09:05:58 crc kubenswrapper[4903]: E0320 09:05:58.492565 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2ndsj_openshift-machine-config-operator(0e67af70-4211-4077-8f2b-0a00b8069e5a)\"" pod="openshift-machine-config-operator/machine-config-daemon-2ndsj" podUID="0e67af70-4211-4077-8f2b-0a00b8069e5a" Mar 20 09:06:00 crc kubenswrapper[4903]: I0320 09:06:00.153580 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566626-srgr2"] Mar 20 09:06:00 crc kubenswrapper[4903]: E0320 09:06:00.154295 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf22bc02-075a-400b-993e-1ef6b23f787b" containerName="registry-server" Mar 20 09:06:00 crc kubenswrapper[4903]: I0320 09:06:00.154310 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf22bc02-075a-400b-993e-1ef6b23f787b" containerName="registry-server" Mar 20 09:06:00 crc kubenswrapper[4903]: E0320 09:06:00.154328 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf22bc02-075a-400b-993e-1ef6b23f787b" containerName="extract-content" Mar 20 09:06:00 crc kubenswrapper[4903]: I0320 09:06:00.154334 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf22bc02-075a-400b-993e-1ef6b23f787b" containerName="extract-content" Mar 20 09:06:00 crc kubenswrapper[4903]: E0320 09:06:00.154357 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf22bc02-075a-400b-993e-1ef6b23f787b" containerName="extract-utilities" Mar 20 09:06:00 crc kubenswrapper[4903]: I0320 09:06:00.154363 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf22bc02-075a-400b-993e-1ef6b23f787b" containerName="extract-utilities" Mar 20 09:06:00 crc kubenswrapper[4903]: I0320 09:06:00.154500 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf22bc02-075a-400b-993e-1ef6b23f787b" containerName="registry-server" Mar 20 09:06:00 crc kubenswrapper[4903]: I0320 09:06:00.154967 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566626-srgr2" Mar 20 09:06:00 crc kubenswrapper[4903]: I0320 09:06:00.160158 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-smc2q" Mar 20 09:06:00 crc kubenswrapper[4903]: I0320 09:06:00.160595 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 09:06:00 crc kubenswrapper[4903]: I0320 09:06:00.161103 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 09:06:00 crc kubenswrapper[4903]: I0320 09:06:00.171680 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566626-srgr2"] Mar 20 09:06:00 crc kubenswrapper[4903]: I0320 09:06:00.190046 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4wpbv\" (UniqueName: \"kubernetes.io/projected/7726b069-cc88-4bf4-9e9f-f101d82832bf-kube-api-access-4wpbv\") pod \"auto-csr-approver-29566626-srgr2\" (UID: \"7726b069-cc88-4bf4-9e9f-f101d82832bf\") " pod="openshift-infra/auto-csr-approver-29566626-srgr2" Mar 20 09:06:00 crc kubenswrapper[4903]: I0320 09:06:00.290966 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4wpbv\" (UniqueName: \"kubernetes.io/projected/7726b069-cc88-4bf4-9e9f-f101d82832bf-kube-api-access-4wpbv\") pod \"auto-csr-approver-29566626-srgr2\" (UID: \"7726b069-cc88-4bf4-9e9f-f101d82832bf\") " pod="openshift-infra/auto-csr-approver-29566626-srgr2" Mar 20 09:06:00 crc kubenswrapper[4903]: I0320 09:06:00.326081 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4wpbv\" (UniqueName: \"kubernetes.io/projected/7726b069-cc88-4bf4-9e9f-f101d82832bf-kube-api-access-4wpbv\") pod \"auto-csr-approver-29566626-srgr2\" (UID: \"7726b069-cc88-4bf4-9e9f-f101d82832bf\") " pod="openshift-infra/auto-csr-approver-29566626-srgr2" Mar 20 09:06:00 crc kubenswrapper[4903]: I0320 09:06:00.476100 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566626-srgr2" Mar 20 09:06:00 crc kubenswrapper[4903]: I0320 09:06:00.909119 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566626-srgr2"] Mar 20 09:06:01 crc kubenswrapper[4903]: I0320 09:06:01.484672 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566626-srgr2" event={"ID":"7726b069-cc88-4bf4-9e9f-f101d82832bf","Type":"ContainerStarted","Data":"683e3aa3921908ee3305d03f79f7c71bd25a808e327ad48aa425cfa626bc1252"} Mar 20 09:06:02 crc kubenswrapper[4903]: I0320 09:06:02.492733 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566626-srgr2" event={"ID":"7726b069-cc88-4bf4-9e9f-f101d82832bf","Type":"ContainerStarted","Data":"7bdfefe0f6b0876d8f983c2b3efb6d418b2a7de448d7509c621f602eb3328cbf"} Mar 20 09:06:02 crc kubenswrapper[4903]: I0320 09:06:02.510199 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29566626-srgr2" podStartSLOduration=1.3425361740000001 podStartE2EDuration="2.510180062s" podCreationTimestamp="2026-03-20 09:06:00 +0000 UTC" firstStartedPulling="2026-03-20 09:06:00.920228299 +0000 UTC m=+2586.137128654" lastFinishedPulling="2026-03-20 09:06:02.087872217 +0000 UTC m=+2587.304772542" observedRunningTime="2026-03-20 09:06:02.505054429 +0000 UTC m=+2587.721954744" watchObservedRunningTime="2026-03-20 09:06:02.510180062 +0000 UTC m=+2587.727080377" Mar 20 09:06:03 crc kubenswrapper[4903]: I0320 09:06:03.501793 4903 generic.go:334] "Generic (PLEG): container finished" podID="7726b069-cc88-4bf4-9e9f-f101d82832bf" containerID="7bdfefe0f6b0876d8f983c2b3efb6d418b2a7de448d7509c621f602eb3328cbf" exitCode=0 Mar 20 09:06:03 crc kubenswrapper[4903]: I0320 09:06:03.501833 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566626-srgr2" event={"ID":"7726b069-cc88-4bf4-9e9f-f101d82832bf","Type":"ContainerDied","Data":"7bdfefe0f6b0876d8f983c2b3efb6d418b2a7de448d7509c621f602eb3328cbf"} Mar 20 09:06:04 crc kubenswrapper[4903]: I0320 09:06:04.812733 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566626-srgr2" Mar 20 09:06:04 crc kubenswrapper[4903]: I0320 09:06:04.962850 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4wpbv\" (UniqueName: \"kubernetes.io/projected/7726b069-cc88-4bf4-9e9f-f101d82832bf-kube-api-access-4wpbv\") pod \"7726b069-cc88-4bf4-9e9f-f101d82832bf\" (UID: \"7726b069-cc88-4bf4-9e9f-f101d82832bf\") " Mar 20 09:06:04 crc kubenswrapper[4903]: I0320 09:06:04.972774 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7726b069-cc88-4bf4-9e9f-f101d82832bf-kube-api-access-4wpbv" (OuterVolumeSpecName: "kube-api-access-4wpbv") pod "7726b069-cc88-4bf4-9e9f-f101d82832bf" (UID: "7726b069-cc88-4bf4-9e9f-f101d82832bf"). InnerVolumeSpecName "kube-api-access-4wpbv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:06:05 crc kubenswrapper[4903]: I0320 09:06:05.064466 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4wpbv\" (UniqueName: \"kubernetes.io/projected/7726b069-cc88-4bf4-9e9f-f101d82832bf-kube-api-access-4wpbv\") on node \"crc\" DevicePath \"\"" Mar 20 09:06:05 crc kubenswrapper[4903]: I0320 09:06:05.521280 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566626-srgr2" event={"ID":"7726b069-cc88-4bf4-9e9f-f101d82832bf","Type":"ContainerDied","Data":"683e3aa3921908ee3305d03f79f7c71bd25a808e327ad48aa425cfa626bc1252"} Mar 20 09:06:05 crc kubenswrapper[4903]: I0320 09:06:05.521329 4903 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="683e3aa3921908ee3305d03f79f7c71bd25a808e327ad48aa425cfa626bc1252" Mar 20 09:06:05 crc kubenswrapper[4903]: I0320 09:06:05.521364 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566626-srgr2" Mar 20 09:06:05 crc kubenswrapper[4903]: I0320 09:06:05.584835 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566620-l94tv"] Mar 20 09:06:05 crc kubenswrapper[4903]: I0320 09:06:05.591918 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566620-l94tv"] Mar 20 09:06:07 crc kubenswrapper[4903]: I0320 09:06:07.505586 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="72c92018-7fab-4cfa-a34f-9f6fb2dce3f5" path="/var/lib/kubelet/pods/72c92018-7fab-4cfa-a34f-9f6fb2dce3f5/volumes" Mar 20 09:06:10 crc kubenswrapper[4903]: I0320 09:06:10.490965 4903 scope.go:117] "RemoveContainer" containerID="05d843b29e30c16c9c72dc2383538466652c95246ca0c3fc36a2ea4621a2190e" Mar 20 09:06:10 crc kubenswrapper[4903]: E0320 09:06:10.491415 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2ndsj_openshift-machine-config-operator(0e67af70-4211-4077-8f2b-0a00b8069e5a)\"" pod="openshift-machine-config-operator/machine-config-daemon-2ndsj" podUID="0e67af70-4211-4077-8f2b-0a00b8069e5a" Mar 20 09:06:23 crc kubenswrapper[4903]: I0320 09:06:23.490905 4903 scope.go:117] "RemoveContainer" containerID="05d843b29e30c16c9c72dc2383538466652c95246ca0c3fc36a2ea4621a2190e" Mar 20 09:06:23 crc kubenswrapper[4903]: E0320 09:06:23.491493 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2ndsj_openshift-machine-config-operator(0e67af70-4211-4077-8f2b-0a00b8069e5a)\"" pod="openshift-machine-config-operator/machine-config-daemon-2ndsj" podUID="0e67af70-4211-4077-8f2b-0a00b8069e5a" Mar 20 09:06:38 crc kubenswrapper[4903]: I0320 09:06:38.490461 4903 scope.go:117] "RemoveContainer" containerID="05d843b29e30c16c9c72dc2383538466652c95246ca0c3fc36a2ea4621a2190e" Mar 20 09:06:38 crc kubenswrapper[4903]: E0320 09:06:38.491223 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-2ndsj_openshift-machine-config-operator(0e67af70-4211-4077-8f2b-0a00b8069e5a)\"" pod="openshift-machine-config-operator/machine-config-daemon-2ndsj" podUID="0e67af70-4211-4077-8f2b-0a00b8069e5a" Mar 20 09:06:50 crc kubenswrapper[4903]: I0320 09:06:50.495897 4903 scope.go:117] "RemoveContainer" containerID="05d843b29e30c16c9c72dc2383538466652c95246ca0c3fc36a2ea4621a2190e" Mar 20 09:06:50 crc kubenswrapper[4903]: E0320 09:06:50.498751 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2ndsj_openshift-machine-config-operator(0e67af70-4211-4077-8f2b-0a00b8069e5a)\"" pod="openshift-machine-config-operator/machine-config-daemon-2ndsj" podUID="0e67af70-4211-4077-8f2b-0a00b8069e5a" Mar 20 09:06:51 crc kubenswrapper[4903]: I0320 09:06:51.178205 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-phxhv"] Mar 20 09:06:51 crc kubenswrapper[4903]: E0320 09:06:51.178595 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7726b069-cc88-4bf4-9e9f-f101d82832bf" containerName="oc" Mar 20 09:06:51 crc kubenswrapper[4903]: I0320 09:06:51.178614 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="7726b069-cc88-4bf4-9e9f-f101d82832bf" containerName="oc" Mar 20 09:06:51 crc kubenswrapper[4903]: I0320 09:06:51.178775 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="7726b069-cc88-4bf4-9e9f-f101d82832bf" containerName="oc" Mar 20 09:06:51 crc kubenswrapper[4903]: I0320 09:06:51.179909 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-phxhv" Mar 20 09:06:51 crc kubenswrapper[4903]: I0320 09:06:51.185344 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-phxhv"] Mar 20 09:06:51 crc kubenswrapper[4903]: I0320 09:06:51.217688 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6780e0f4-0393-40ce-889d-5d84a04d5983-catalog-content\") pod \"certified-operators-phxhv\" (UID: \"6780e0f4-0393-40ce-889d-5d84a04d5983\") " pod="openshift-marketplace/certified-operators-phxhv" Mar 20 09:06:51 crc kubenswrapper[4903]: I0320 09:06:51.217750 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6780e0f4-0393-40ce-889d-5d84a04d5983-utilities\") pod \"certified-operators-phxhv\" (UID: \"6780e0f4-0393-40ce-889d-5d84a04d5983\") " pod="openshift-marketplace/certified-operators-phxhv" Mar 20 09:06:51 crc kubenswrapper[4903]: I0320 09:06:51.217818 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pxtwp\" (UniqueName: \"kubernetes.io/projected/6780e0f4-0393-40ce-889d-5d84a04d5983-kube-api-access-pxtwp\") pod \"certified-operators-phxhv\" (UID: \"6780e0f4-0393-40ce-889d-5d84a04d5983\") " pod="openshift-marketplace/certified-operators-phxhv" Mar 20 09:06:51 crc kubenswrapper[4903]: I0320 09:06:51.319205 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pxtwp\" (UniqueName: \"kubernetes.io/projected/6780e0f4-0393-40ce-889d-5d84a04d5983-kube-api-access-pxtwp\") pod \"certified-operators-phxhv\" (UID: 
\"6780e0f4-0393-40ce-889d-5d84a04d5983\") " pod="openshift-marketplace/certified-operators-phxhv" Mar 20 09:06:51 crc kubenswrapper[4903]: I0320 09:06:51.319301 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6780e0f4-0393-40ce-889d-5d84a04d5983-catalog-content\") pod \"certified-operators-phxhv\" (UID: \"6780e0f4-0393-40ce-889d-5d84a04d5983\") " pod="openshift-marketplace/certified-operators-phxhv" Mar 20 09:06:51 crc kubenswrapper[4903]: I0320 09:06:51.319322 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6780e0f4-0393-40ce-889d-5d84a04d5983-utilities\") pod \"certified-operators-phxhv\" (UID: \"6780e0f4-0393-40ce-889d-5d84a04d5983\") " pod="openshift-marketplace/certified-operators-phxhv" Mar 20 09:06:51 crc kubenswrapper[4903]: I0320 09:06:51.319742 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6780e0f4-0393-40ce-889d-5d84a04d5983-utilities\") pod \"certified-operators-phxhv\" (UID: \"6780e0f4-0393-40ce-889d-5d84a04d5983\") " pod="openshift-marketplace/certified-operators-phxhv" Mar 20 09:06:51 crc kubenswrapper[4903]: I0320 09:06:51.319878 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6780e0f4-0393-40ce-889d-5d84a04d5983-catalog-content\") pod \"certified-operators-phxhv\" (UID: \"6780e0f4-0393-40ce-889d-5d84a04d5983\") " pod="openshift-marketplace/certified-operators-phxhv" Mar 20 09:06:51 crc kubenswrapper[4903]: I0320 09:06:51.348984 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pxtwp\" (UniqueName: \"kubernetes.io/projected/6780e0f4-0393-40ce-889d-5d84a04d5983-kube-api-access-pxtwp\") pod \"certified-operators-phxhv\" (UID: \"6780e0f4-0393-40ce-889d-5d84a04d5983\") " pod="openshift-marketplace/certified-operators-phxhv" Mar 20 09:06:51 crc kubenswrapper[4903]: I0320 09:06:51.548025 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-phxhv" Mar 20 09:06:52 crc kubenswrapper[4903]: I0320 09:06:52.052135 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-phxhv"] Mar 20 09:06:52 crc kubenswrapper[4903]: I0320 09:06:52.880530 4903 generic.go:334] "Generic (PLEG): container finished" podID="6780e0f4-0393-40ce-889d-5d84a04d5983" containerID="524ff609eec8ef8520c0d863377af85144270b7438ca733db7df1b61fee022a2" exitCode=0 Mar 20 09:06:52 crc kubenswrapper[4903]: I0320 09:06:52.880670 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-phxhv" event={"ID":"6780e0f4-0393-40ce-889d-5d84a04d5983","Type":"ContainerDied","Data":"524ff609eec8ef8520c0d863377af85144270b7438ca733db7df1b61fee022a2"} Mar 20 09:06:52 crc kubenswrapper[4903]: I0320 09:06:52.880958 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-phxhv" event={"ID":"6780e0f4-0393-40ce-889d-5d84a04d5983","Type":"ContainerStarted","Data":"5f61b286b99b23727eb6d4db73433eba7e945fd1b0bcb32ee44e7e0a599c2ea0"} Mar 20 09:06:53 crc kubenswrapper[4903]: I0320 09:06:53.891544 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-phxhv" event={"ID":"6780e0f4-0393-40ce-889d-5d84a04d5983","Type":"ContainerStarted","Data":"f1a896c6a487fbae0713cdeaa9c37bc96f078fa51027adea157c8688abe5f02c"} Mar 20 09:06:54 crc kubenswrapper[4903]: I0320 09:06:54.909227 4903 generic.go:334] "Generic (PLEG): container finished" podID="6780e0f4-0393-40ce-889d-5d84a04d5983" containerID="f1a896c6a487fbae0713cdeaa9c37bc96f078fa51027adea157c8688abe5f02c" exitCode=0 Mar 20 09:06:54 crc kubenswrapper[4903]: I0320 09:06:54.909288 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-phxhv" event={"ID":"6780e0f4-0393-40ce-889d-5d84a04d5983","Type":"ContainerDied","Data":"f1a896c6a487fbae0713cdeaa9c37bc96f078fa51027adea157c8688abe5f02c"} Mar 20 09:06:55 crc kubenswrapper[4903]: I0320 09:06:55.917422 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-phxhv" event={"ID":"6780e0f4-0393-40ce-889d-5d84a04d5983","Type":"ContainerStarted","Data":"83c7374ce631d29fd322836d117b37a87cadcd2ae20cc4c7ef419968ce2810ce"} Mar 20 09:06:55 crc kubenswrapper[4903]: I0320 09:06:55.937677 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-phxhv" podStartSLOduration=2.466551182 podStartE2EDuration="4.937653304s" podCreationTimestamp="2026-03-20 09:06:51 +0000 UTC" firstStartedPulling="2026-03-20 09:06:52.882439676 +0000 UTC m=+2638.099339981" lastFinishedPulling="2026-03-20 09:06:55.353541778 +0000 UTC m=+2640.570442103" observedRunningTime="2026-03-20 09:06:55.933073434 +0000 UTC m=+2641.149973759" watchObservedRunningTime="2026-03-20 09:06:55.937653304 +0000 UTC m=+2641.154553619" Mar 20 09:07:01 crc kubenswrapper[4903]: I0320 09:07:01.548321 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-phxhv" Mar 20 09:07:01 crc kubenswrapper[4903]: I0320 09:07:01.549099 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-phxhv" Mar 20 09:07:01 crc kubenswrapper[4903]: I0320 09:07:01.602011 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/certified-operators-phxhv" Mar 20 09:07:02 crc kubenswrapper[4903]: I0320 09:07:02.005713 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-phxhv" Mar 20 09:07:02 crc kubenswrapper[4903]: I0320 09:07:02.052299 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-phxhv"] Mar 20 09:07:02 crc kubenswrapper[4903]: I0320 09:07:02.490517 4903 scope.go:117] "RemoveContainer" containerID="05d843b29e30c16c9c72dc2383538466652c95246ca0c3fc36a2ea4621a2190e" Mar 20 09:07:02 crc kubenswrapper[4903]: E0320 09:07:02.490804 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2ndsj_openshift-machine-config-operator(0e67af70-4211-4077-8f2b-0a00b8069e5a)\"" pod="openshift-machine-config-operator/machine-config-daemon-2ndsj" podUID="0e67af70-4211-4077-8f2b-0a00b8069e5a" Mar 20 09:07:02 crc kubenswrapper[4903]: I0320 09:07:02.491223 4903 scope.go:117] "RemoveContainer" containerID="0ec0855e7e6e5912b10786185a650793736a63be48953dfc0702ae2a2e65d8d2" Mar 20 09:07:03 crc kubenswrapper[4903]: I0320 09:07:03.976688 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-phxhv" podUID="6780e0f4-0393-40ce-889d-5d84a04d5983" containerName="registry-server" containerID="cri-o://83c7374ce631d29fd322836d117b37a87cadcd2ae20cc4c7ef419968ce2810ce" gracePeriod=2 Mar 20 09:07:04 crc kubenswrapper[4903]: I0320 09:07:04.986350 4903 generic.go:334] "Generic (PLEG): container finished" podID="6780e0f4-0393-40ce-889d-5d84a04d5983" containerID="83c7374ce631d29fd322836d117b37a87cadcd2ae20cc4c7ef419968ce2810ce" exitCode=0 Mar 20 09:07:04 crc kubenswrapper[4903]: I0320 09:07:04.986393 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-phxhv" event={"ID":"6780e0f4-0393-40ce-889d-5d84a04d5983","Type":"ContainerDied","Data":"83c7374ce631d29fd322836d117b37a87cadcd2ae20cc4c7ef419968ce2810ce"} Mar 20 09:07:05 crc kubenswrapper[4903]: I0320 09:07:05.204986 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-phxhv" Mar 20 09:07:05 crc kubenswrapper[4903]: I0320 09:07:05.342068 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pxtwp\" (UniqueName: \"kubernetes.io/projected/6780e0f4-0393-40ce-889d-5d84a04d5983-kube-api-access-pxtwp\") pod \"6780e0f4-0393-40ce-889d-5d84a04d5983\" (UID: \"6780e0f4-0393-40ce-889d-5d84a04d5983\") " Mar 20 09:07:05 crc kubenswrapper[4903]: I0320 09:07:05.342192 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6780e0f4-0393-40ce-889d-5d84a04d5983-utilities\") pod \"6780e0f4-0393-40ce-889d-5d84a04d5983\" (UID: \"6780e0f4-0393-40ce-889d-5d84a04d5983\") " Mar 20 09:07:05 crc kubenswrapper[4903]: I0320 09:07:05.342309 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6780e0f4-0393-40ce-889d-5d84a04d5983-catalog-content\") pod \"6780e0f4-0393-40ce-889d-5d84a04d5983\" (UID: \"6780e0f4-0393-40ce-889d-5d84a04d5983\") " Mar 20 09:07:05 crc kubenswrapper[4903]: I0320 09:07:05.343154 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6780e0f4-0393-40ce-889d-5d84a04d5983-utilities" (OuterVolumeSpecName: "utilities") pod "6780e0f4-0393-40ce-889d-5d84a04d5983" (UID: "6780e0f4-0393-40ce-889d-5d84a04d5983"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 09:07:05 crc kubenswrapper[4903]: I0320 09:07:05.347242 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6780e0f4-0393-40ce-889d-5d84a04d5983-kube-api-access-pxtwp" (OuterVolumeSpecName: "kube-api-access-pxtwp") pod "6780e0f4-0393-40ce-889d-5d84a04d5983" (UID: "6780e0f4-0393-40ce-889d-5d84a04d5983"). InnerVolumeSpecName "kube-api-access-pxtwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:07:05 crc kubenswrapper[4903]: I0320 09:07:05.395885 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6780e0f4-0393-40ce-889d-5d84a04d5983-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6780e0f4-0393-40ce-889d-5d84a04d5983" (UID: "6780e0f4-0393-40ce-889d-5d84a04d5983"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 09:07:05 crc kubenswrapper[4903]: I0320 09:07:05.444157 4903 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6780e0f4-0393-40ce-889d-5d84a04d5983-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 09:07:05 crc kubenswrapper[4903]: I0320 09:07:05.444206 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pxtwp\" (UniqueName: \"kubernetes.io/projected/6780e0f4-0393-40ce-889d-5d84a04d5983-kube-api-access-pxtwp\") on node \"crc\" DevicePath \"\"" Mar 20 09:07:05 crc kubenswrapper[4903]: I0320 09:07:05.444228 4903 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6780e0f4-0393-40ce-889d-5d84a04d5983-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 09:07:06 crc kubenswrapper[4903]: I0320 09:07:06.001472 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-phxhv" event={"ID":"6780e0f4-0393-40ce-889d-5d84a04d5983","Type":"ContainerDied","Data":"5f61b286b99b23727eb6d4db73433eba7e945fd1b0bcb32ee44e7e0a599c2ea0"} Mar 20 09:07:06 crc kubenswrapper[4903]: I0320 09:07:06.001600 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-phxhv" Mar 20 09:07:06 crc kubenswrapper[4903]: I0320 09:07:06.001859 4903 scope.go:117] "RemoveContainer" containerID="83c7374ce631d29fd322836d117b37a87cadcd2ae20cc4c7ef419968ce2810ce" Mar 20 09:07:06 crc kubenswrapper[4903]: I0320 09:07:06.032336 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-phxhv"] Mar 20 09:07:06 crc kubenswrapper[4903]: I0320 09:07:06.036436 4903 scope.go:117] "RemoveContainer" containerID="f1a896c6a487fbae0713cdeaa9c37bc96f078fa51027adea157c8688abe5f02c" Mar 20 09:07:06 crc kubenswrapper[4903]: I0320 09:07:06.041213 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-phxhv"] Mar 20 09:07:06 crc kubenswrapper[4903]: I0320 09:07:06.062247 4903 scope.go:117] "RemoveContainer" containerID="524ff609eec8ef8520c0d863377af85144270b7438ca733db7df1b61fee022a2" Mar 20 09:07:07 crc kubenswrapper[4903]: I0320 09:07:07.504061 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6780e0f4-0393-40ce-889d-5d84a04d5983" path="/var/lib/kubelet/pods/6780e0f4-0393-40ce-889d-5d84a04d5983/volumes" Mar 20 09:07:13 crc kubenswrapper[4903]: I0320 09:07:13.491279 4903 scope.go:117] "RemoveContainer" containerID="05d843b29e30c16c9c72dc2383538466652c95246ca0c3fc36a2ea4621a2190e" Mar 20 09:07:13 crc kubenswrapper[4903]: E0320 09:07:13.492125 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2ndsj_openshift-machine-config-operator(0e67af70-4211-4077-8f2b-0a00b8069e5a)\"" pod="openshift-machine-config-operator/machine-config-daemon-2ndsj" podUID="0e67af70-4211-4077-8f2b-0a00b8069e5a" Mar 20 09:07:25 crc kubenswrapper[4903]: I0320 09:07:25.495254 4903 scope.go:117] "RemoveContainer" containerID="05d843b29e30c16c9c72dc2383538466652c95246ca0c3fc36a2ea4621a2190e" Mar 20 09:07:25 crc kubenswrapper[4903]: E0320 09:07:25.496125 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" 
with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2ndsj_openshift-machine-config-operator(0e67af70-4211-4077-8f2b-0a00b8069e5a)\"" pod="openshift-machine-config-operator/machine-config-daemon-2ndsj" podUID="0e67af70-4211-4077-8f2b-0a00b8069e5a" Mar 20 09:07:36 crc kubenswrapper[4903]: I0320 09:07:36.491294 4903 scope.go:117] "RemoveContainer" containerID="05d843b29e30c16c9c72dc2383538466652c95246ca0c3fc36a2ea4621a2190e" Mar 20 09:07:36 crc kubenswrapper[4903]: E0320 09:07:36.492007 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2ndsj_openshift-machine-config-operator(0e67af70-4211-4077-8f2b-0a00b8069e5a)\"" pod="openshift-machine-config-operator/machine-config-daemon-2ndsj" podUID="0e67af70-4211-4077-8f2b-0a00b8069e5a" Mar 20 09:07:50 crc kubenswrapper[4903]: I0320 09:07:50.491249 4903 scope.go:117] "RemoveContainer" containerID="05d843b29e30c16c9c72dc2383538466652c95246ca0c3fc36a2ea4621a2190e" Mar 20 09:07:50 crc kubenswrapper[4903]: E0320 09:07:50.492215 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2ndsj_openshift-machine-config-operator(0e67af70-4211-4077-8f2b-0a00b8069e5a)\"" pod="openshift-machine-config-operator/machine-config-daemon-2ndsj" podUID="0e67af70-4211-4077-8f2b-0a00b8069e5a" Mar 20 09:08:00 crc kubenswrapper[4903]: I0320 09:08:00.150168 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566628-c8zdd"] Mar 20 09:08:00 crc kubenswrapper[4903]: E0320 09:08:00.151003 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6780e0f4-0393-40ce-889d-5d84a04d5983" containerName="extract-utilities" Mar 20 09:08:00 crc kubenswrapper[4903]: I0320 09:08:00.151017 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="6780e0f4-0393-40ce-889d-5d84a04d5983" containerName="extract-utilities" Mar 20 09:08:00 crc kubenswrapper[4903]: E0320 09:08:00.151915 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6780e0f4-0393-40ce-889d-5d84a04d5983" containerName="extract-content" Mar 20 09:08:00 crc kubenswrapper[4903]: I0320 09:08:00.151930 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="6780e0f4-0393-40ce-889d-5d84a04d5983" containerName="extract-content" Mar 20 09:08:00 crc kubenswrapper[4903]: E0320 09:08:00.151957 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6780e0f4-0393-40ce-889d-5d84a04d5983" containerName="registry-server" Mar 20 09:08:00 crc kubenswrapper[4903]: I0320 09:08:00.151966 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="6780e0f4-0393-40ce-889d-5d84a04d5983" containerName="registry-server" Mar 20 09:08:00 crc kubenswrapper[4903]: I0320 09:08:00.152183 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="6780e0f4-0393-40ce-889d-5d84a04d5983" containerName="registry-server" Mar 20 09:08:00 crc kubenswrapper[4903]: I0320 09:08:00.152724 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566628-c8zdd" Mar 20 09:08:00 crc kubenswrapper[4903]: I0320 09:08:00.154883 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-smc2q" Mar 20 09:08:00 crc kubenswrapper[4903]: I0320 09:08:00.155307 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 09:08:00 crc kubenswrapper[4903]: I0320 09:08:00.155971 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 09:08:00 crc kubenswrapper[4903]: I0320 09:08:00.172231 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566628-c8zdd"] Mar 20 09:08:00 crc kubenswrapper[4903]: I0320 09:08:00.244713 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6k5lj\" (UniqueName: \"kubernetes.io/projected/1e4d191c-018d-4682-96a4-0b3d308c9381-kube-api-access-6k5lj\") pod \"auto-csr-approver-29566628-c8zdd\" (UID: \"1e4d191c-018d-4682-96a4-0b3d308c9381\") " pod="openshift-infra/auto-csr-approver-29566628-c8zdd" Mar 20 09:08:00 crc kubenswrapper[4903]: I0320 09:08:00.346298 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6k5lj\" (UniqueName: \"kubernetes.io/projected/1e4d191c-018d-4682-96a4-0b3d308c9381-kube-api-access-6k5lj\") pod \"auto-csr-approver-29566628-c8zdd\" (UID: \"1e4d191c-018d-4682-96a4-0b3d308c9381\") " pod="openshift-infra/auto-csr-approver-29566628-c8zdd" Mar 20 09:08:00 crc kubenswrapper[4903]: I0320 09:08:00.374888 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6k5lj\" (UniqueName: \"kubernetes.io/projected/1e4d191c-018d-4682-96a4-0b3d308c9381-kube-api-access-6k5lj\") pod \"auto-csr-approver-29566628-c8zdd\" (UID: \"1e4d191c-018d-4682-96a4-0b3d308c9381\") " pod="openshift-infra/auto-csr-approver-29566628-c8zdd" Mar 20 09:08:00 crc kubenswrapper[4903]: I0320 09:08:00.475193 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566628-c8zdd" Mar 20 09:08:00 crc kubenswrapper[4903]: I0320 09:08:00.955808 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566628-c8zdd"] Mar 20 09:08:01 crc kubenswrapper[4903]: I0320 09:08:01.478402 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566628-c8zdd" event={"ID":"1e4d191c-018d-4682-96a4-0b3d308c9381","Type":"ContainerStarted","Data":"ff5660530681e1fbc0d62fb5e1c260c230daa1784fd6982a88b9c36029a25c85"} Mar 20 09:08:03 crc kubenswrapper[4903]: I0320 09:08:03.491497 4903 scope.go:117] "RemoveContainer" containerID="05d843b29e30c16c9c72dc2383538466652c95246ca0c3fc36a2ea4621a2190e" Mar 20 09:08:03 crc kubenswrapper[4903]: E0320 09:08:03.492383 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2ndsj_openshift-machine-config-operator(0e67af70-4211-4077-8f2b-0a00b8069e5a)\"" pod="openshift-machine-config-operator/machine-config-daemon-2ndsj" podUID="0e67af70-4211-4077-8f2b-0a00b8069e5a" Mar 20 09:08:05 crc kubenswrapper[4903]: I0320 09:08:05.512807 4903 generic.go:334] "Generic (PLEG): container finished" podID="1e4d191c-018d-4682-96a4-0b3d308c9381" containerID="c4c94bc6ed94a6a2562ff30fa634fca21d8ee563a312943a99b1d6594eaa28fe" exitCode=0 Mar 20 09:08:05 crc kubenswrapper[4903]: I0320 09:08:05.512955 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566628-c8zdd" event={"ID":"1e4d191c-018d-4682-96a4-0b3d308c9381","Type":"ContainerDied","Data":"c4c94bc6ed94a6a2562ff30fa634fca21d8ee563a312943a99b1d6594eaa28fe"} Mar 20 09:08:06 crc kubenswrapper[4903]: I0320 09:08:06.858523 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566628-c8zdd" Mar 20 09:08:06 crc kubenswrapper[4903]: I0320 09:08:06.975850 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6k5lj\" (UniqueName: \"kubernetes.io/projected/1e4d191c-018d-4682-96a4-0b3d308c9381-kube-api-access-6k5lj\") pod \"1e4d191c-018d-4682-96a4-0b3d308c9381\" (UID: \"1e4d191c-018d-4682-96a4-0b3d308c9381\") " Mar 20 09:08:06 crc kubenswrapper[4903]: I0320 09:08:06.980490 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e4d191c-018d-4682-96a4-0b3d308c9381-kube-api-access-6k5lj" (OuterVolumeSpecName: "kube-api-access-6k5lj") pod "1e4d191c-018d-4682-96a4-0b3d308c9381" (UID: "1e4d191c-018d-4682-96a4-0b3d308c9381"). InnerVolumeSpecName "kube-api-access-6k5lj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:08:07 crc kubenswrapper[4903]: I0320 09:08:07.077994 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6k5lj\" (UniqueName: \"kubernetes.io/projected/1e4d191c-018d-4682-96a4-0b3d308c9381-kube-api-access-6k5lj\") on node \"crc\" DevicePath \"\"" Mar 20 09:08:07 crc kubenswrapper[4903]: I0320 09:08:07.532101 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566628-c8zdd" event={"ID":"1e4d191c-018d-4682-96a4-0b3d308c9381","Type":"ContainerDied","Data":"ff5660530681e1fbc0d62fb5e1c260c230daa1784fd6982a88b9c36029a25c85"} Mar 20 09:08:07 crc kubenswrapper[4903]: I0320 09:08:07.532561 4903 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ff5660530681e1fbc0d62fb5e1c260c230daa1784fd6982a88b9c36029a25c85" Mar 20 09:08:07 crc kubenswrapper[4903]: I0320 09:08:07.532281 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566628-c8zdd" Mar 20 09:08:07 crc kubenswrapper[4903]: I0320 09:08:07.925942 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566622-z9qqt"] Mar 20 09:08:07 crc kubenswrapper[4903]: I0320 09:08:07.931027 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566622-z9qqt"] Mar 20 09:08:09 crc kubenswrapper[4903]: I0320 09:08:09.501578 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bbac0c1-d921-4c1e-b1f6-71a95106d7b2" path="/var/lib/kubelet/pods/7bbac0c1-d921-4c1e-b1f6-71a95106d7b2/volumes" Mar 20 09:08:16 crc kubenswrapper[4903]: I0320 09:08:16.491860 4903 scope.go:117] "RemoveContainer" containerID="05d843b29e30c16c9c72dc2383538466652c95246ca0c3fc36a2ea4621a2190e" Mar 20 09:08:16 crc kubenswrapper[4903]: E0320 09:08:16.492661 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2ndsj_openshift-machine-config-operator(0e67af70-4211-4077-8f2b-0a00b8069e5a)\"" pod="openshift-machine-config-operator/machine-config-daemon-2ndsj" podUID="0e67af70-4211-4077-8f2b-0a00b8069e5a" Mar 20 09:08:28 crc kubenswrapper[4903]: I0320 09:08:28.491555 4903 scope.go:117] "RemoveContainer" containerID="05d843b29e30c16c9c72dc2383538466652c95246ca0c3fc36a2ea4621a2190e" Mar 20 09:08:28 crc kubenswrapper[4903]: E0320 09:08:28.492375 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2ndsj_openshift-machine-config-operator(0e67af70-4211-4077-8f2b-0a00b8069e5a)\"" pod="openshift-machine-config-operator/machine-config-daemon-2ndsj" podUID="0e67af70-4211-4077-8f2b-0a00b8069e5a" Mar 20 09:08:31 crc kubenswrapper[4903]: I0320 09:08:31.576371 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-brh8v"] Mar 20 09:08:31 crc kubenswrapper[4903]: E0320 09:08:31.577078 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e4d191c-018d-4682-96a4-0b3d308c9381" containerName="oc" Mar 20 09:08:31 crc kubenswrapper[4903]: I0320 09:08:31.577097 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e4d191c-018d-4682-96a4-0b3d308c9381" 
containerName="oc" Mar 20 09:08:31 crc kubenswrapper[4903]: I0320 09:08:31.577283 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e4d191c-018d-4682-96a4-0b3d308c9381" containerName="oc" Mar 20 09:08:31 crc kubenswrapper[4903]: I0320 09:08:31.578479 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-brh8v" Mar 20 09:08:31 crc kubenswrapper[4903]: I0320 09:08:31.586931 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-brh8v"] Mar 20 09:08:31 crc kubenswrapper[4903]: I0320 09:08:31.683535 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zs82t\" (UniqueName: \"kubernetes.io/projected/6d437cb8-3777-4972-840a-ea6f11fe6660-kube-api-access-zs82t\") pod \"community-operators-brh8v\" (UID: \"6d437cb8-3777-4972-840a-ea6f11fe6660\") " pod="openshift-marketplace/community-operators-brh8v" Mar 20 09:08:31 crc kubenswrapper[4903]: I0320 09:08:31.683602 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6d437cb8-3777-4972-840a-ea6f11fe6660-utilities\") pod \"community-operators-brh8v\" (UID: \"6d437cb8-3777-4972-840a-ea6f11fe6660\") " pod="openshift-marketplace/community-operators-brh8v" Mar 20 09:08:31 crc kubenswrapper[4903]: I0320 09:08:31.683734 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6d437cb8-3777-4972-840a-ea6f11fe6660-catalog-content\") pod \"community-operators-brh8v\" (UID: \"6d437cb8-3777-4972-840a-ea6f11fe6660\") " pod="openshift-marketplace/community-operators-brh8v" Mar 20 09:08:31 crc kubenswrapper[4903]: I0320 09:08:31.784891 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zs82t\" (UniqueName: \"kubernetes.io/projected/6d437cb8-3777-4972-840a-ea6f11fe6660-kube-api-access-zs82t\") pod \"community-operators-brh8v\" (UID: \"6d437cb8-3777-4972-840a-ea6f11fe6660\") " pod="openshift-marketplace/community-operators-brh8v" Mar 20 09:08:31 crc kubenswrapper[4903]: I0320 09:08:31.784963 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6d437cb8-3777-4972-840a-ea6f11fe6660-utilities\") pod \"community-operators-brh8v\" (UID: \"6d437cb8-3777-4972-840a-ea6f11fe6660\") " pod="openshift-marketplace/community-operators-brh8v" Mar 20 09:08:31 crc kubenswrapper[4903]: I0320 09:08:31.785008 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6d437cb8-3777-4972-840a-ea6f11fe6660-catalog-content\") pod \"community-operators-brh8v\" (UID: \"6d437cb8-3777-4972-840a-ea6f11fe6660\") " pod="openshift-marketplace/community-operators-brh8v" Mar 20 09:08:31 crc kubenswrapper[4903]: I0320 09:08:31.785681 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6d437cb8-3777-4972-840a-ea6f11fe6660-catalog-content\") pod \"community-operators-brh8v\" (UID: \"6d437cb8-3777-4972-840a-ea6f11fe6660\") " pod="openshift-marketplace/community-operators-brh8v" Mar 20 09:08:31 crc kubenswrapper[4903]: I0320 09:08:31.786170 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" 
(UniqueName: \"kubernetes.io/empty-dir/6d437cb8-3777-4972-840a-ea6f11fe6660-utilities\") pod \"community-operators-brh8v\" (UID: \"6d437cb8-3777-4972-840a-ea6f11fe6660\") " pod="openshift-marketplace/community-operators-brh8v" Mar 20 09:08:31 crc kubenswrapper[4903]: I0320 09:08:31.807448 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zs82t\" (UniqueName: \"kubernetes.io/projected/6d437cb8-3777-4972-840a-ea6f11fe6660-kube-api-access-zs82t\") pod \"community-operators-brh8v\" (UID: \"6d437cb8-3777-4972-840a-ea6f11fe6660\") " pod="openshift-marketplace/community-operators-brh8v" Mar 20 09:08:31 crc kubenswrapper[4903]: I0320 09:08:31.913163 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-brh8v" Mar 20 09:08:32 crc kubenswrapper[4903]: I0320 09:08:32.486022 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-brh8v"] Mar 20 09:08:32 crc kubenswrapper[4903]: I0320 09:08:32.720738 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-brh8v" event={"ID":"6d437cb8-3777-4972-840a-ea6f11fe6660","Type":"ContainerStarted","Data":"f0c899d0f5ce1c1372f70baed661c8167f65ba9e5522f5627e4a25dcd29cf8fa"} Mar 20 09:08:33 crc kubenswrapper[4903]: I0320 09:08:33.730306 4903 generic.go:334] "Generic (PLEG): container finished" podID="6d437cb8-3777-4972-840a-ea6f11fe6660" containerID="219c4592a88acca9d45d17a5901096ad90e56f2e842530a2953928188e4e1ded" exitCode=0 Mar 20 09:08:33 crc kubenswrapper[4903]: I0320 09:08:33.730374 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-brh8v" event={"ID":"6d437cb8-3777-4972-840a-ea6f11fe6660","Type":"ContainerDied","Data":"219c4592a88acca9d45d17a5901096ad90e56f2e842530a2953928188e4e1ded"} Mar 20 09:08:35 crc kubenswrapper[4903]: I0320 09:08:35.748613 4903 generic.go:334] "Generic (PLEG): container finished" podID="6d437cb8-3777-4972-840a-ea6f11fe6660" containerID="f58ac3abd51a337848638bd57b44f8c3b2cf99bb111ccb9d3bfefd86a3859607" exitCode=0 Mar 20 09:08:35 crc kubenswrapper[4903]: I0320 09:08:35.748690 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-brh8v" event={"ID":"6d437cb8-3777-4972-840a-ea6f11fe6660","Type":"ContainerDied","Data":"f58ac3abd51a337848638bd57b44f8c3b2cf99bb111ccb9d3bfefd86a3859607"} Mar 20 09:08:36 crc kubenswrapper[4903]: I0320 09:08:36.770654 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-brh8v" event={"ID":"6d437cb8-3777-4972-840a-ea6f11fe6660","Type":"ContainerStarted","Data":"0e0ec8eadfc8e04a0e0208f5d93109e1668d27dc3c2af59e1a69cef00efd269a"} Mar 20 09:08:36 crc kubenswrapper[4903]: I0320 09:08:36.792695 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-brh8v" podStartSLOduration=3.254831708 podStartE2EDuration="5.792668995s" podCreationTimestamp="2026-03-20 09:08:31 +0000 UTC" firstStartedPulling="2026-03-20 09:08:33.737705923 +0000 UTC m=+2738.954606238" lastFinishedPulling="2026-03-20 09:08:36.27554321 +0000 UTC m=+2741.492443525" observedRunningTime="2026-03-20 09:08:36.791708541 +0000 UTC m=+2742.008608866" watchObservedRunningTime="2026-03-20 09:08:36.792668995 +0000 UTC m=+2742.009569320" Mar 20 09:08:39 crc kubenswrapper[4903]: I0320 09:08:39.491008 4903 scope.go:117] "RemoveContainer" 
containerID="05d843b29e30c16c9c72dc2383538466652c95246ca0c3fc36a2ea4621a2190e" Mar 20 09:08:39 crc kubenswrapper[4903]: E0320 09:08:39.491731 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2ndsj_openshift-machine-config-operator(0e67af70-4211-4077-8f2b-0a00b8069e5a)\"" pod="openshift-machine-config-operator/machine-config-daemon-2ndsj" podUID="0e67af70-4211-4077-8f2b-0a00b8069e5a" Mar 20 09:08:41 crc kubenswrapper[4903]: I0320 09:08:41.913725 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-brh8v" Mar 20 09:08:41 crc kubenswrapper[4903]: I0320 09:08:41.914054 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-brh8v" Mar 20 09:08:41 crc kubenswrapper[4903]: I0320 09:08:41.963927 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-brh8v" Mar 20 09:08:42 crc kubenswrapper[4903]: I0320 09:08:42.857783 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-brh8v" Mar 20 09:08:42 crc kubenswrapper[4903]: I0320 09:08:42.903433 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-brh8v"] Mar 20 09:08:44 crc kubenswrapper[4903]: I0320 09:08:44.826167 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-brh8v" podUID="6d437cb8-3777-4972-840a-ea6f11fe6660" containerName="registry-server" containerID="cri-o://0e0ec8eadfc8e04a0e0208f5d93109e1668d27dc3c2af59e1a69cef00efd269a" gracePeriod=2 Mar 20 09:08:45 crc kubenswrapper[4903]: I0320 09:08:45.236758 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-brh8v" Mar 20 09:08:45 crc kubenswrapper[4903]: I0320 09:08:45.404518 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zs82t\" (UniqueName: \"kubernetes.io/projected/6d437cb8-3777-4972-840a-ea6f11fe6660-kube-api-access-zs82t\") pod \"6d437cb8-3777-4972-840a-ea6f11fe6660\" (UID: \"6d437cb8-3777-4972-840a-ea6f11fe6660\") " Mar 20 09:08:45 crc kubenswrapper[4903]: I0320 09:08:45.404619 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6d437cb8-3777-4972-840a-ea6f11fe6660-catalog-content\") pod \"6d437cb8-3777-4972-840a-ea6f11fe6660\" (UID: \"6d437cb8-3777-4972-840a-ea6f11fe6660\") " Mar 20 09:08:45 crc kubenswrapper[4903]: I0320 09:08:45.404687 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6d437cb8-3777-4972-840a-ea6f11fe6660-utilities\") pod \"6d437cb8-3777-4972-840a-ea6f11fe6660\" (UID: \"6d437cb8-3777-4972-840a-ea6f11fe6660\") " Mar 20 09:08:45 crc kubenswrapper[4903]: I0320 09:08:45.406908 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6d437cb8-3777-4972-840a-ea6f11fe6660-utilities" (OuterVolumeSpecName: "utilities") pod "6d437cb8-3777-4972-840a-ea6f11fe6660" (UID: "6d437cb8-3777-4972-840a-ea6f11fe6660"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 09:08:45 crc kubenswrapper[4903]: I0320 09:08:45.410078 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d437cb8-3777-4972-840a-ea6f11fe6660-kube-api-access-zs82t" (OuterVolumeSpecName: "kube-api-access-zs82t") pod "6d437cb8-3777-4972-840a-ea6f11fe6660" (UID: "6d437cb8-3777-4972-840a-ea6f11fe6660"). InnerVolumeSpecName "kube-api-access-zs82t". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:08:45 crc kubenswrapper[4903]: I0320 09:08:45.489634 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6d437cb8-3777-4972-840a-ea6f11fe6660-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6d437cb8-3777-4972-840a-ea6f11fe6660" (UID: "6d437cb8-3777-4972-840a-ea6f11fe6660"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 09:08:45 crc kubenswrapper[4903]: I0320 09:08:45.506080 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zs82t\" (UniqueName: \"kubernetes.io/projected/6d437cb8-3777-4972-840a-ea6f11fe6660-kube-api-access-zs82t\") on node \"crc\" DevicePath \"\"" Mar 20 09:08:45 crc kubenswrapper[4903]: I0320 09:08:45.506117 4903 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6d437cb8-3777-4972-840a-ea6f11fe6660-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 09:08:45 crc kubenswrapper[4903]: I0320 09:08:45.506130 4903 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6d437cb8-3777-4972-840a-ea6f11fe6660-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 09:08:45 crc kubenswrapper[4903]: I0320 09:08:45.835679 4903 generic.go:334] "Generic (PLEG): container finished" podID="6d437cb8-3777-4972-840a-ea6f11fe6660" containerID="0e0ec8eadfc8e04a0e0208f5d93109e1668d27dc3c2af59e1a69cef00efd269a" exitCode=0 Mar 20 09:08:45 crc kubenswrapper[4903]: I0320 09:08:45.835724 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-brh8v" event={"ID":"6d437cb8-3777-4972-840a-ea6f11fe6660","Type":"ContainerDied","Data":"0e0ec8eadfc8e04a0e0208f5d93109e1668d27dc3c2af59e1a69cef00efd269a"} Mar 20 09:08:45 crc kubenswrapper[4903]: I0320 09:08:45.835757 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-brh8v" event={"ID":"6d437cb8-3777-4972-840a-ea6f11fe6660","Type":"ContainerDied","Data":"f0c899d0f5ce1c1372f70baed661c8167f65ba9e5522f5627e4a25dcd29cf8fa"} Mar 20 09:08:45 crc kubenswrapper[4903]: I0320 09:08:45.835778 4903 scope.go:117] "RemoveContainer" containerID="0e0ec8eadfc8e04a0e0208f5d93109e1668d27dc3c2af59e1a69cef00efd269a" Mar 20 09:08:45 crc kubenswrapper[4903]: I0320 09:08:45.835860 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-brh8v" Mar 20 09:08:45 crc kubenswrapper[4903]: I0320 09:08:45.873150 4903 scope.go:117] "RemoveContainer" containerID="f58ac3abd51a337848638bd57b44f8c3b2cf99bb111ccb9d3bfefd86a3859607" Mar 20 09:08:45 crc kubenswrapper[4903]: I0320 09:08:45.877634 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-brh8v"] Mar 20 09:08:45 crc kubenswrapper[4903]: I0320 09:08:45.883364 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-brh8v"] Mar 20 09:08:45 crc kubenswrapper[4903]: I0320 09:08:45.895529 4903 scope.go:117] "RemoveContainer" containerID="219c4592a88acca9d45d17a5901096ad90e56f2e842530a2953928188e4e1ded" Mar 20 09:08:45 crc kubenswrapper[4903]: I0320 09:08:45.922276 4903 scope.go:117] "RemoveContainer" containerID="0e0ec8eadfc8e04a0e0208f5d93109e1668d27dc3c2af59e1a69cef00efd269a" Mar 20 09:08:45 crc kubenswrapper[4903]: E0320 09:08:45.922837 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0e0ec8eadfc8e04a0e0208f5d93109e1668d27dc3c2af59e1a69cef00efd269a\": container with ID starting with 0e0ec8eadfc8e04a0e0208f5d93109e1668d27dc3c2af59e1a69cef00efd269a not found: ID does not exist" containerID="0e0ec8eadfc8e04a0e0208f5d93109e1668d27dc3c2af59e1a69cef00efd269a" Mar 20 09:08:45 crc kubenswrapper[4903]: I0320 09:08:45.922881 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e0ec8eadfc8e04a0e0208f5d93109e1668d27dc3c2af59e1a69cef00efd269a"} err="failed to get container status \"0e0ec8eadfc8e04a0e0208f5d93109e1668d27dc3c2af59e1a69cef00efd269a\": rpc error: code = NotFound desc = could not find container \"0e0ec8eadfc8e04a0e0208f5d93109e1668d27dc3c2af59e1a69cef00efd269a\": container with ID starting with 0e0ec8eadfc8e04a0e0208f5d93109e1668d27dc3c2af59e1a69cef00efd269a not found: ID does not exist" Mar 20 09:08:45 crc kubenswrapper[4903]: I0320 09:08:45.922916 4903 scope.go:117] "RemoveContainer" containerID="f58ac3abd51a337848638bd57b44f8c3b2cf99bb111ccb9d3bfefd86a3859607" Mar 20 09:08:45 crc kubenswrapper[4903]: E0320 09:08:45.923412 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f58ac3abd51a337848638bd57b44f8c3b2cf99bb111ccb9d3bfefd86a3859607\": container with ID starting with f58ac3abd51a337848638bd57b44f8c3b2cf99bb111ccb9d3bfefd86a3859607 not found: ID does not exist" containerID="f58ac3abd51a337848638bd57b44f8c3b2cf99bb111ccb9d3bfefd86a3859607" Mar 20 09:08:45 crc kubenswrapper[4903]: I0320 09:08:45.923469 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f58ac3abd51a337848638bd57b44f8c3b2cf99bb111ccb9d3bfefd86a3859607"} err="failed to get container status \"f58ac3abd51a337848638bd57b44f8c3b2cf99bb111ccb9d3bfefd86a3859607\": rpc error: code = NotFound desc = could not find container \"f58ac3abd51a337848638bd57b44f8c3b2cf99bb111ccb9d3bfefd86a3859607\": container with ID starting with f58ac3abd51a337848638bd57b44f8c3b2cf99bb111ccb9d3bfefd86a3859607 not found: ID does not exist" Mar 20 09:08:45 crc kubenswrapper[4903]: I0320 09:08:45.923530 4903 scope.go:117] "RemoveContainer" containerID="219c4592a88acca9d45d17a5901096ad90e56f2e842530a2953928188e4e1ded" Mar 20 09:08:45 crc kubenswrapper[4903]: E0320 09:08:45.923906 4903 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"219c4592a88acca9d45d17a5901096ad90e56f2e842530a2953928188e4e1ded\": container with ID starting with 219c4592a88acca9d45d17a5901096ad90e56f2e842530a2953928188e4e1ded not found: ID does not exist" containerID="219c4592a88acca9d45d17a5901096ad90e56f2e842530a2953928188e4e1ded" Mar 20 09:08:45 crc kubenswrapper[4903]: I0320 09:08:45.923937 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"219c4592a88acca9d45d17a5901096ad90e56f2e842530a2953928188e4e1ded"} err="failed to get container status \"219c4592a88acca9d45d17a5901096ad90e56f2e842530a2953928188e4e1ded\": rpc error: code = NotFound desc = could not find container \"219c4592a88acca9d45d17a5901096ad90e56f2e842530a2953928188e4e1ded\": container with ID starting with 219c4592a88acca9d45d17a5901096ad90e56f2e842530a2953928188e4e1ded not found: ID does not exist" Mar 20 09:08:47 crc kubenswrapper[4903]: I0320 09:08:47.500325 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6d437cb8-3777-4972-840a-ea6f11fe6660" path="/var/lib/kubelet/pods/6d437cb8-3777-4972-840a-ea6f11fe6660/volumes" Mar 20 09:08:54 crc kubenswrapper[4903]: I0320 09:08:54.491222 4903 scope.go:117] "RemoveContainer" containerID="05d843b29e30c16c9c72dc2383538466652c95246ca0c3fc36a2ea4621a2190e" Mar 20 09:08:54 crc kubenswrapper[4903]: E0320 09:08:54.492270 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2ndsj_openshift-machine-config-operator(0e67af70-4211-4077-8f2b-0a00b8069e5a)\"" pod="openshift-machine-config-operator/machine-config-daemon-2ndsj" podUID="0e67af70-4211-4077-8f2b-0a00b8069e5a" Mar 20 09:09:02 crc kubenswrapper[4903]: I0320 09:09:02.603594 4903 scope.go:117] "RemoveContainer" containerID="b0a7e6581ddd2a0b15f8a575486f6fc52f00a4d4f9d16b9f725624a740a38b0d" Mar 20 09:09:09 crc kubenswrapper[4903]: I0320 09:09:09.491333 4903 scope.go:117] "RemoveContainer" containerID="05d843b29e30c16c9c72dc2383538466652c95246ca0c3fc36a2ea4621a2190e" Mar 20 09:09:09 crc kubenswrapper[4903]: E0320 09:09:09.494006 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2ndsj_openshift-machine-config-operator(0e67af70-4211-4077-8f2b-0a00b8069e5a)\"" pod="openshift-machine-config-operator/machine-config-daemon-2ndsj" podUID="0e67af70-4211-4077-8f2b-0a00b8069e5a" Mar 20 09:09:21 crc kubenswrapper[4903]: I0320 09:09:21.491402 4903 scope.go:117] "RemoveContainer" containerID="05d843b29e30c16c9c72dc2383538466652c95246ca0c3fc36a2ea4621a2190e" Mar 20 09:09:21 crc kubenswrapper[4903]: E0320 09:09:21.492065 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2ndsj_openshift-machine-config-operator(0e67af70-4211-4077-8f2b-0a00b8069e5a)\"" pod="openshift-machine-config-operator/machine-config-daemon-2ndsj" podUID="0e67af70-4211-4077-8f2b-0a00b8069e5a" Mar 20 09:09:32 crc kubenswrapper[4903]: I0320 09:09:32.491664 4903 scope.go:117] "RemoveContainer" 
containerID="05d843b29e30c16c9c72dc2383538466652c95246ca0c3fc36a2ea4621a2190e" Mar 20 09:09:32 crc kubenswrapper[4903]: E0320 09:09:32.492501 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2ndsj_openshift-machine-config-operator(0e67af70-4211-4077-8f2b-0a00b8069e5a)\"" pod="openshift-machine-config-operator/machine-config-daemon-2ndsj" podUID="0e67af70-4211-4077-8f2b-0a00b8069e5a" Mar 20 09:09:44 crc kubenswrapper[4903]: I0320 09:09:44.492270 4903 scope.go:117] "RemoveContainer" containerID="05d843b29e30c16c9c72dc2383538466652c95246ca0c3fc36a2ea4621a2190e" Mar 20 09:09:44 crc kubenswrapper[4903]: E0320 09:09:44.493110 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2ndsj_openshift-machine-config-operator(0e67af70-4211-4077-8f2b-0a00b8069e5a)\"" pod="openshift-machine-config-operator/machine-config-daemon-2ndsj" podUID="0e67af70-4211-4077-8f2b-0a00b8069e5a" Mar 20 09:09:57 crc kubenswrapper[4903]: I0320 09:09:57.491654 4903 scope.go:117] "RemoveContainer" containerID="05d843b29e30c16c9c72dc2383538466652c95246ca0c3fc36a2ea4621a2190e" Mar 20 09:09:58 crc kubenswrapper[4903]: I0320 09:09:58.447174 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2ndsj" event={"ID":"0e67af70-4211-4077-8f2b-0a00b8069e5a","Type":"ContainerStarted","Data":"ab88589d91d0147a568e8b91cd68b845b771b73e7c864440e34828f1fbe98ceb"} Mar 20 09:10:00 crc kubenswrapper[4903]: I0320 09:10:00.148743 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566630-7glnd"] Mar 20 09:10:00 crc kubenswrapper[4903]: E0320 09:10:00.149761 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d437cb8-3777-4972-840a-ea6f11fe6660" containerName="registry-server" Mar 20 09:10:00 crc kubenswrapper[4903]: I0320 09:10:00.149782 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d437cb8-3777-4972-840a-ea6f11fe6660" containerName="registry-server" Mar 20 09:10:00 crc kubenswrapper[4903]: E0320 09:10:00.149804 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d437cb8-3777-4972-840a-ea6f11fe6660" containerName="extract-utilities" Mar 20 09:10:00 crc kubenswrapper[4903]: I0320 09:10:00.149815 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d437cb8-3777-4972-840a-ea6f11fe6660" containerName="extract-utilities" Mar 20 09:10:00 crc kubenswrapper[4903]: E0320 09:10:00.149846 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d437cb8-3777-4972-840a-ea6f11fe6660" containerName="extract-content" Mar 20 09:10:00 crc kubenswrapper[4903]: I0320 09:10:00.149858 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d437cb8-3777-4972-840a-ea6f11fe6660" containerName="extract-content" Mar 20 09:10:00 crc kubenswrapper[4903]: I0320 09:10:00.150106 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d437cb8-3777-4972-840a-ea6f11fe6660" containerName="registry-server" Mar 20 09:10:00 crc kubenswrapper[4903]: I0320 09:10:00.150659 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566630-7glnd" Mar 20 09:10:00 crc kubenswrapper[4903]: I0320 09:10:00.152850 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 09:10:00 crc kubenswrapper[4903]: I0320 09:10:00.154908 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 09:10:00 crc kubenswrapper[4903]: I0320 09:10:00.158110 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-smc2q" Mar 20 09:10:00 crc kubenswrapper[4903]: I0320 09:10:00.158490 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566630-7glnd"] Mar 20 09:10:00 crc kubenswrapper[4903]: I0320 09:10:00.326416 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xm2w4\" (UniqueName: \"kubernetes.io/projected/72102004-4421-42c4-8325-cb2e927d45fd-kube-api-access-xm2w4\") pod \"auto-csr-approver-29566630-7glnd\" (UID: \"72102004-4421-42c4-8325-cb2e927d45fd\") " pod="openshift-infra/auto-csr-approver-29566630-7glnd" Mar 20 09:10:00 crc kubenswrapper[4903]: I0320 09:10:00.427808 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xm2w4\" (UniqueName: \"kubernetes.io/projected/72102004-4421-42c4-8325-cb2e927d45fd-kube-api-access-xm2w4\") pod \"auto-csr-approver-29566630-7glnd\" (UID: \"72102004-4421-42c4-8325-cb2e927d45fd\") " pod="openshift-infra/auto-csr-approver-29566630-7glnd" Mar 20 09:10:00 crc kubenswrapper[4903]: I0320 09:10:00.446113 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xm2w4\" (UniqueName: \"kubernetes.io/projected/72102004-4421-42c4-8325-cb2e927d45fd-kube-api-access-xm2w4\") pod \"auto-csr-approver-29566630-7glnd\" (UID: \"72102004-4421-42c4-8325-cb2e927d45fd\") " pod="openshift-infra/auto-csr-approver-29566630-7glnd" Mar 20 09:10:00 crc kubenswrapper[4903]: I0320 09:10:00.475979 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566630-7glnd" Mar 20 09:10:00 crc kubenswrapper[4903]: I0320 09:10:00.894446 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566630-7glnd"] Mar 20 09:10:01 crc kubenswrapper[4903]: I0320 09:10:01.470603 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566630-7glnd" event={"ID":"72102004-4421-42c4-8325-cb2e927d45fd","Type":"ContainerStarted","Data":"3deb59a33b7e8f94dbbb0bf914eaed34e8cb4947e749fa2912a8131c88221663"} Mar 20 09:10:03 crc kubenswrapper[4903]: I0320 09:10:03.515853 4903 generic.go:334] "Generic (PLEG): container finished" podID="72102004-4421-42c4-8325-cb2e927d45fd" containerID="42b0a222e747a5287375a9fcf148b31d4b25740f3bb3ece1812c269d07e24ac6" exitCode=0 Mar 20 09:10:03 crc kubenswrapper[4903]: I0320 09:10:03.519617 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566630-7glnd" event={"ID":"72102004-4421-42c4-8325-cb2e927d45fd","Type":"ContainerDied","Data":"42b0a222e747a5287375a9fcf148b31d4b25740f3bb3ece1812c269d07e24ac6"} Mar 20 09:10:04 crc kubenswrapper[4903]: I0320 09:10:04.798213 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566630-7glnd" Mar 20 09:10:04 crc kubenswrapper[4903]: I0320 09:10:04.991331 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xm2w4\" (UniqueName: \"kubernetes.io/projected/72102004-4421-42c4-8325-cb2e927d45fd-kube-api-access-xm2w4\") pod \"72102004-4421-42c4-8325-cb2e927d45fd\" (UID: \"72102004-4421-42c4-8325-cb2e927d45fd\") " Mar 20 09:10:04 crc kubenswrapper[4903]: I0320 09:10:04.997800 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/72102004-4421-42c4-8325-cb2e927d45fd-kube-api-access-xm2w4" (OuterVolumeSpecName: "kube-api-access-xm2w4") pod "72102004-4421-42c4-8325-cb2e927d45fd" (UID: "72102004-4421-42c4-8325-cb2e927d45fd"). InnerVolumeSpecName "kube-api-access-xm2w4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:10:05 crc kubenswrapper[4903]: I0320 09:10:05.092784 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xm2w4\" (UniqueName: \"kubernetes.io/projected/72102004-4421-42c4-8325-cb2e927d45fd-kube-api-access-xm2w4\") on node \"crc\" DevicePath \"\"" Mar 20 09:10:05 crc kubenswrapper[4903]: I0320 09:10:05.537387 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566630-7glnd" event={"ID":"72102004-4421-42c4-8325-cb2e927d45fd","Type":"ContainerDied","Data":"3deb59a33b7e8f94dbbb0bf914eaed34e8cb4947e749fa2912a8131c88221663"} Mar 20 09:10:05 crc kubenswrapper[4903]: I0320 09:10:05.537434 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566630-7glnd" Mar 20 09:10:05 crc kubenswrapper[4903]: I0320 09:10:05.537440 4903 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3deb59a33b7e8f94dbbb0bf914eaed34e8cb4947e749fa2912a8131c88221663" Mar 20 09:10:05 crc kubenswrapper[4903]: I0320 09:10:05.874767 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566624-tdb9h"] Mar 20 09:10:05 crc kubenswrapper[4903]: I0320 09:10:05.881892 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566624-tdb9h"] Mar 20 09:10:07 crc kubenswrapper[4903]: I0320 09:10:07.500297 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f3d50331-fa87-4cf2-b383-9789dabee3ab" path="/var/lib/kubelet/pods/f3d50331-fa87-4cf2-b383-9789dabee3ab/volumes" Mar 20 09:10:49 crc kubenswrapper[4903]: I0320 09:10:49.679629 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-4rxwf/must-gather-tq8xw"] Mar 20 09:10:49 crc kubenswrapper[4903]: E0320 09:10:49.680747 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72102004-4421-42c4-8325-cb2e927d45fd" containerName="oc" Mar 20 09:10:49 crc kubenswrapper[4903]: I0320 09:10:49.680770 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="72102004-4421-42c4-8325-cb2e927d45fd" containerName="oc" Mar 20 09:10:49 crc kubenswrapper[4903]: I0320 09:10:49.681015 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="72102004-4421-42c4-8325-cb2e927d45fd" containerName="oc" Mar 20 09:10:49 crc kubenswrapper[4903]: I0320 09:10:49.682208 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-4rxwf/must-gather-tq8xw" Mar 20 09:10:49 crc kubenswrapper[4903]: I0320 09:10:49.684309 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-4rxwf"/"kube-root-ca.crt" Mar 20 09:10:49 crc kubenswrapper[4903]: I0320 09:10:49.685942 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-4rxwf"/"openshift-service-ca.crt" Mar 20 09:10:49 crc kubenswrapper[4903]: I0320 09:10:49.690239 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-4rxwf/must-gather-tq8xw"] Mar 20 09:10:49 crc kubenswrapper[4903]: I0320 09:10:49.803939 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ttsmz\" (UniqueName: \"kubernetes.io/projected/cfe75078-eb5a-474d-8e0c-1911d1fedcf1-kube-api-access-ttsmz\") pod \"must-gather-tq8xw\" (UID: \"cfe75078-eb5a-474d-8e0c-1911d1fedcf1\") " pod="openshift-must-gather-4rxwf/must-gather-tq8xw" Mar 20 09:10:49 crc kubenswrapper[4903]: I0320 09:10:49.804088 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/cfe75078-eb5a-474d-8e0c-1911d1fedcf1-must-gather-output\") pod \"must-gather-tq8xw\" (UID: \"cfe75078-eb5a-474d-8e0c-1911d1fedcf1\") " pod="openshift-must-gather-4rxwf/must-gather-tq8xw" Mar 20 09:10:49 crc kubenswrapper[4903]: I0320 09:10:49.905685 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/cfe75078-eb5a-474d-8e0c-1911d1fedcf1-must-gather-output\") pod \"must-gather-tq8xw\" (UID: \"cfe75078-eb5a-474d-8e0c-1911d1fedcf1\") " pod="openshift-must-gather-4rxwf/must-gather-tq8xw" Mar 20 09:10:49 crc kubenswrapper[4903]: I0320 09:10:49.905809 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ttsmz\" (UniqueName: \"kubernetes.io/projected/cfe75078-eb5a-474d-8e0c-1911d1fedcf1-kube-api-access-ttsmz\") pod \"must-gather-tq8xw\" (UID: \"cfe75078-eb5a-474d-8e0c-1911d1fedcf1\") " pod="openshift-must-gather-4rxwf/must-gather-tq8xw" Mar 20 09:10:49 crc kubenswrapper[4903]: I0320 09:10:49.906295 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/cfe75078-eb5a-474d-8e0c-1911d1fedcf1-must-gather-output\") pod \"must-gather-tq8xw\" (UID: \"cfe75078-eb5a-474d-8e0c-1911d1fedcf1\") " pod="openshift-must-gather-4rxwf/must-gather-tq8xw" Mar 20 09:10:49 crc kubenswrapper[4903]: I0320 09:10:49.936764 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ttsmz\" (UniqueName: \"kubernetes.io/projected/cfe75078-eb5a-474d-8e0c-1911d1fedcf1-kube-api-access-ttsmz\") pod \"must-gather-tq8xw\" (UID: \"cfe75078-eb5a-474d-8e0c-1911d1fedcf1\") " pod="openshift-must-gather-4rxwf/must-gather-tq8xw" Mar 20 09:10:50 crc kubenswrapper[4903]: I0320 09:10:50.006196 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-4rxwf/must-gather-tq8xw" Mar 20 09:10:50 crc kubenswrapper[4903]: I0320 09:10:50.436985 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-4rxwf/must-gather-tq8xw"] Mar 20 09:10:50 crc kubenswrapper[4903]: I0320 09:10:50.440732 4903 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 09:10:50 crc kubenswrapper[4903]: I0320 09:10:50.915006 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4rxwf/must-gather-tq8xw" event={"ID":"cfe75078-eb5a-474d-8e0c-1911d1fedcf1","Type":"ContainerStarted","Data":"b9d90df6632ef49131caacfd9a03a4bc59e636b4b54fbd842d0a5674ff26ab0a"} Mar 20 09:10:56 crc kubenswrapper[4903]: I0320 09:10:56.963718 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4rxwf/must-gather-tq8xw" event={"ID":"cfe75078-eb5a-474d-8e0c-1911d1fedcf1","Type":"ContainerStarted","Data":"15feaccfbe6e557bbb66da93c2a6d2ee0fd2a0ab67ae9a4a9f0970437462310d"} Mar 20 09:10:56 crc kubenswrapper[4903]: I0320 09:10:56.964176 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4rxwf/must-gather-tq8xw" event={"ID":"cfe75078-eb5a-474d-8e0c-1911d1fedcf1","Type":"ContainerStarted","Data":"8f58ace3463cd3ffc399526ac12d05eb1dd065c642bda87e7fd2127880f66d56"} Mar 20 09:10:56 crc kubenswrapper[4903]: I0320 09:10:56.982968 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-4rxwf/must-gather-tq8xw" podStartSLOduration=2.013727527 podStartE2EDuration="7.982949085s" podCreationTimestamp="2026-03-20 09:10:49 +0000 UTC" firstStartedPulling="2026-03-20 09:10:50.440667902 +0000 UTC m=+2875.657568237" lastFinishedPulling="2026-03-20 09:10:56.40988948 +0000 UTC m=+2881.626789795" observedRunningTime="2026-03-20 09:10:56.977365399 +0000 UTC m=+2882.194265714" watchObservedRunningTime="2026-03-20 09:10:56.982949085 +0000 UTC m=+2882.199849400" Mar 20 09:11:02 crc kubenswrapper[4903]: I0320 09:11:02.722953 4903 scope.go:117] "RemoveContainer" containerID="d16110c8ecb2e92935b34a56173810487ee5586dec579ac78037ba27b3c62e1f" Mar 20 09:11:56 crc kubenswrapper[4903]: I0320 09:11:56.521696 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-59bc569d95-kzflj_ce3b9536-0686-4544-9ddb-c8e197b5d24a/manager/0.log" Mar 20 09:11:56 crc kubenswrapper[4903]: I0320 09:11:56.615710 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_bba83ce9f42a3265d43191818486e54b74f63b211e9887ec74e7259e5c9tncl_34f42916-3cdf-413e-92df-7066282621a4/util/0.log" Mar 20 09:11:56 crc kubenswrapper[4903]: I0320 09:11:56.781373 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_bba83ce9f42a3265d43191818486e54b74f63b211e9887ec74e7259e5c9tncl_34f42916-3cdf-413e-92df-7066282621a4/util/0.log" Mar 20 09:11:56 crc kubenswrapper[4903]: I0320 09:11:56.788228 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_bba83ce9f42a3265d43191818486e54b74f63b211e9887ec74e7259e5c9tncl_34f42916-3cdf-413e-92df-7066282621a4/pull/0.log" Mar 20 09:11:56 crc kubenswrapper[4903]: I0320 09:11:56.823736 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_bba83ce9f42a3265d43191818486e54b74f63b211e9887ec74e7259e5c9tncl_34f42916-3cdf-413e-92df-7066282621a4/pull/0.log" Mar 20 09:11:56 crc kubenswrapper[4903]: I0320 09:11:56.980647 4903 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_bba83ce9f42a3265d43191818486e54b74f63b211e9887ec74e7259e5c9tncl_34f42916-3cdf-413e-92df-7066282621a4/extract/0.log" Mar 20 09:11:56 crc kubenswrapper[4903]: I0320 09:11:56.989265 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_bba83ce9f42a3265d43191818486e54b74f63b211e9887ec74e7259e5c9tncl_34f42916-3cdf-413e-92df-7066282621a4/pull/0.log" Mar 20 09:11:56 crc kubenswrapper[4903]: I0320 09:11:56.990867 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_bba83ce9f42a3265d43191818486e54b74f63b211e9887ec74e7259e5c9tncl_34f42916-3cdf-413e-92df-7066282621a4/util/0.log" Mar 20 09:11:57 crc kubenswrapper[4903]: I0320 09:11:57.164134 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-588d4d986b-rwzl9_e45265ca-9523-407d-b93a-16fc26817060/manager/0.log" Mar 20 09:11:57 crc kubenswrapper[4903]: I0320 09:11:57.503476 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-79df6bcc97-f4b4f_41774efc-3c12-43fb-b4a3-023e5e4811f5/manager/0.log" Mar 20 09:11:57 crc kubenswrapper[4903]: I0320 09:11:57.579704 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-67dd5f86f5-cnjpn_dbc3863b-0f31-47c1-af79-58e6387d5a18/manager/0.log" Mar 20 09:11:57 crc kubenswrapper[4903]: I0320 09:11:57.853956 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-8464cc45fb-z9fqs_c9fa68a8-9b69-40dc-a614-a7d85a9473f8/manager/0.log" Mar 20 09:11:57 crc kubenswrapper[4903]: I0320 09:11:57.885641 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-8d58dc466-9pxrs_246db1f4-c0bc-4152-9275-dec8e8ca6233/manager/0.log" Mar 20 09:11:58 crc kubenswrapper[4903]: I0320 09:11:58.086118 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6f787dddc9-bbz45_d103b53c-f076-4441-8ff4-c6a3be6ac200/manager/0.log" Mar 20 09:11:58 crc kubenswrapper[4903]: I0320 09:11:58.138994 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-7b9c774f96-65sn9_c938f9a1-4273-4d7a-91f1-e430e43ef704/manager/0.log" Mar 20 09:11:58 crc kubenswrapper[4903]: I0320 09:11:58.314436 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-768b96df4c-pgssm_fbb60c8b-9933-40bd-9a01-3463fa38fd41/manager/0.log" Mar 20 09:11:58 crc kubenswrapper[4903]: I0320 09:11:58.380555 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-55f864c847-7hg7s_52c17e50-83c9-46ae-8804-aba50e3ff916/manager/0.log" Mar 20 09:11:58 crc kubenswrapper[4903]: I0320 09:11:58.526655 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-67ccfc9778-lhp57_a560c084-9049-431f-94bc-60bd2639b801/manager/0.log" Mar 20 09:11:58 crc kubenswrapper[4903]: I0320 09:11:58.615746 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-767865f676-gccx7_b5fca5e8-3b5c-49f5-aae9-f13e1fef0111/manager/0.log" Mar 20 09:11:58 crc kubenswrapper[4903]: I0320 09:11:58.801628 4903 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-5b9f45d989-9sv94_73cea0b7-43b8-491a-b6f3-c8ec8563583f/manager/0.log" Mar 20 09:11:58 crc kubenswrapper[4903]: I0320 09:11:58.816321 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-5d488d59fb-rgwvj_db9e032d-63e4-44e3-99d6-55c13c900127/manager/0.log" Mar 20 09:11:58 crc kubenswrapper[4903]: I0320 09:11:58.939787 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-89d64c458-bsq62_dee70365-edd2-44fe-b49e-5b0cd67dd6df/manager/0.log" Mar 20 09:11:59 crc kubenswrapper[4903]: I0320 09:11:59.148815 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-7f6c6d49c4-j9tg6_15cbed57-b491-4c0b-94d3-cfb6a3c7a624/operator/0.log" Mar 20 09:11:59 crc kubenswrapper[4903]: I0320 09:11:59.381621 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-gwb86_22a4229d-fa0a-4948-ab60-3f2c5d1c72df/registry-server/0.log" Mar 20 09:11:59 crc kubenswrapper[4903]: I0320 09:11:59.773937 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-884679f54-fp9lr_ae26b736-2e3e-4a77-83e1-7df0a04cd02b/manager/0.log" Mar 20 09:11:59 crc kubenswrapper[4903]: I0320 09:11:59.854356 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-85788d4595-jxnj2_34a31f63-98e3-445f-a11c-92a0fb057a4b/manager/0.log" Mar 20 09:11:59 crc kubenswrapper[4903]: I0320 09:11:59.883257 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-5784578c99-9n5gb_84b169f4-3d59-46de-b955-c8b2de1045f4/manager/0.log" Mar 20 09:11:59 crc kubenswrapper[4903]: I0320 09:11:59.962882 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-6hjsx_fc915c58-af04-4aac-81d9-43d88136f7df/operator/0.log" Mar 20 09:12:00 crc kubenswrapper[4903]: I0320 09:12:00.106509 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-c674c5965-fxsc4_c60316c6-cc32-497c-9955-d38de3103fdc/manager/0.log" Mar 20 09:12:00 crc kubenswrapper[4903]: I0320 09:12:00.142403 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566632-m9l4z"] Mar 20 09:12:00 crc kubenswrapper[4903]: I0320 09:12:00.143226 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566632-m9l4z" Mar 20 09:12:00 crc kubenswrapper[4903]: I0320 09:12:00.147619 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 09:12:00 crc kubenswrapper[4903]: I0320 09:12:00.147969 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-smc2q" Mar 20 09:12:00 crc kubenswrapper[4903]: I0320 09:12:00.148405 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 09:12:00 crc kubenswrapper[4903]: I0320 09:12:00.155875 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566632-m9l4z"] Mar 20 09:12:00 crc kubenswrapper[4903]: I0320 09:12:00.297059 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5c5cb9c4d7-lntlm_79ef28ec-b069-4ee3-947b-92a5605c8d73/manager/0.log" Mar 20 09:12:00 crc kubenswrapper[4903]: I0320 09:12:00.311514 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q6k6l\" (UniqueName: \"kubernetes.io/projected/ac7501a5-11ea-42f6-99b2-5730d7ae6935-kube-api-access-q6k6l\") pod \"auto-csr-approver-29566632-m9l4z\" (UID: \"ac7501a5-11ea-42f6-99b2-5730d7ae6935\") " pod="openshift-infra/auto-csr-approver-29566632-m9l4z" Mar 20 09:12:00 crc kubenswrapper[4903]: I0320 09:12:00.350274 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-d6b694c5-77bp7_8ed79f48-8d64-4133-9a1f-1aad870f1767/manager/0.log" Mar 20 09:12:00 crc kubenswrapper[4903]: I0320 09:12:00.413262 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q6k6l\" (UniqueName: \"kubernetes.io/projected/ac7501a5-11ea-42f6-99b2-5730d7ae6935-kube-api-access-q6k6l\") pod \"auto-csr-approver-29566632-m9l4z\" (UID: \"ac7501a5-11ea-42f6-99b2-5730d7ae6935\") " pod="openshift-infra/auto-csr-approver-29566632-m9l4z" Mar 20 09:12:00 crc kubenswrapper[4903]: I0320 09:12:00.439378 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q6k6l\" (UniqueName: \"kubernetes.io/projected/ac7501a5-11ea-42f6-99b2-5730d7ae6935-kube-api-access-q6k6l\") pod \"auto-csr-approver-29566632-m9l4z\" (UID: \"ac7501a5-11ea-42f6-99b2-5730d7ae6935\") " pod="openshift-infra/auto-csr-approver-29566632-m9l4z" Mar 20 09:12:00 crc kubenswrapper[4903]: I0320 09:12:00.458726 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566632-m9l4z" Mar 20 09:12:00 crc kubenswrapper[4903]: I0320 09:12:00.519764 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-6c4d75f7f9-56pwc_a213486d-f613-4f65-a866-9e6bc349a1a9/manager/0.log" Mar 20 09:12:00 crc kubenswrapper[4903]: I0320 09:12:00.908105 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566632-m9l4z"] Mar 20 09:12:01 crc kubenswrapper[4903]: I0320 09:12:01.434419 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566632-m9l4z" event={"ID":"ac7501a5-11ea-42f6-99b2-5730d7ae6935","Type":"ContainerStarted","Data":"6f33274be37340674cc9caa44313430dc6e051d59bd775136822e4e825912329"} Mar 20 09:12:02 crc kubenswrapper[4903]: I0320 09:12:02.445781 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566632-m9l4z" event={"ID":"ac7501a5-11ea-42f6-99b2-5730d7ae6935","Type":"ContainerStarted","Data":"7b2b8776c3f4fb3320369ece6c09347c151f26cc0aa23676e0f229ce220571bd"} Mar 20 09:12:02 crc kubenswrapper[4903]: I0320 09:12:02.459857 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29566632-m9l4z" podStartSLOduration=1.486085569 podStartE2EDuration="2.459836508s" podCreationTimestamp="2026-03-20 09:12:00 +0000 UTC" firstStartedPulling="2026-03-20 09:12:00.916412497 +0000 UTC m=+2946.133312812" lastFinishedPulling="2026-03-20 09:12:01.890163436 +0000 UTC m=+2947.107063751" observedRunningTime="2026-03-20 09:12:02.457902772 +0000 UTC m=+2947.674803087" watchObservedRunningTime="2026-03-20 09:12:02.459836508 +0000 UTC m=+2947.676736823" Mar 20 09:12:03 crc kubenswrapper[4903]: I0320 09:12:03.454292 4903 generic.go:334] "Generic (PLEG): container finished" podID="ac7501a5-11ea-42f6-99b2-5730d7ae6935" containerID="7b2b8776c3f4fb3320369ece6c09347c151f26cc0aa23676e0f229ce220571bd" exitCode=0 Mar 20 09:12:03 crc kubenswrapper[4903]: I0320 09:12:03.454337 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566632-m9l4z" event={"ID":"ac7501a5-11ea-42f6-99b2-5730d7ae6935","Type":"ContainerDied","Data":"7b2b8776c3f4fb3320369ece6c09347c151f26cc0aa23676e0f229ce220571bd"} Mar 20 09:12:04 crc kubenswrapper[4903]: I0320 09:12:04.775283 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566632-m9l4z" Mar 20 09:12:04 crc kubenswrapper[4903]: I0320 09:12:04.900333 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q6k6l\" (UniqueName: \"kubernetes.io/projected/ac7501a5-11ea-42f6-99b2-5730d7ae6935-kube-api-access-q6k6l\") pod \"ac7501a5-11ea-42f6-99b2-5730d7ae6935\" (UID: \"ac7501a5-11ea-42f6-99b2-5730d7ae6935\") " Mar 20 09:12:04 crc kubenswrapper[4903]: I0320 09:12:04.906804 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac7501a5-11ea-42f6-99b2-5730d7ae6935-kube-api-access-q6k6l" (OuterVolumeSpecName: "kube-api-access-q6k6l") pod "ac7501a5-11ea-42f6-99b2-5730d7ae6935" (UID: "ac7501a5-11ea-42f6-99b2-5730d7ae6935"). InnerVolumeSpecName "kube-api-access-q6k6l". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:12:05 crc kubenswrapper[4903]: I0320 09:12:05.001985 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q6k6l\" (UniqueName: \"kubernetes.io/projected/ac7501a5-11ea-42f6-99b2-5730d7ae6935-kube-api-access-q6k6l\") on node \"crc\" DevicePath \"\"" Mar 20 09:12:05 crc kubenswrapper[4903]: I0320 09:12:05.470235 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566632-m9l4z" event={"ID":"ac7501a5-11ea-42f6-99b2-5730d7ae6935","Type":"ContainerDied","Data":"6f33274be37340674cc9caa44313430dc6e051d59bd775136822e4e825912329"} Mar 20 09:12:05 crc kubenswrapper[4903]: I0320 09:12:05.470590 4903 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6f33274be37340674cc9caa44313430dc6e051d59bd775136822e4e825912329" Mar 20 09:12:05 crc kubenswrapper[4903]: I0320 09:12:05.470303 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566632-m9l4z" Mar 20 09:12:05 crc kubenswrapper[4903]: I0320 09:12:05.535393 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566626-srgr2"] Mar 20 09:12:05 crc kubenswrapper[4903]: I0320 09:12:05.541740 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566626-srgr2"] Mar 20 09:12:07 crc kubenswrapper[4903]: I0320 09:12:07.500110 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7726b069-cc88-4bf4-9e9f-f101d82832bf" path="/var/lib/kubelet/pods/7726b069-cc88-4bf4-9e9f-f101d82832bf/volumes" Mar 20 09:12:19 crc kubenswrapper[4903]: I0320 09:12:19.952991 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-9zwz2_faa861a0-c46b-46d2-8f2f-6c8e70f403ec/control-plane-machine-set-operator/0.log" Mar 20 09:12:20 crc kubenswrapper[4903]: I0320 09:12:20.078128 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-mx9nb_572cb149-e6e4-4d1b-ab27-145239b82d1c/kube-rbac-proxy/0.log" Mar 20 09:12:20 crc kubenswrapper[4903]: I0320 09:12:20.114535 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-mx9nb_572cb149-e6e4-4d1b-ab27-145239b82d1c/machine-api-operator/0.log" Mar 20 09:12:20 crc kubenswrapper[4903]: I0320 09:12:20.833690 4903 patch_prober.go:28] interesting pod/machine-config-daemon-2ndsj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 09:12:20 crc kubenswrapper[4903]: I0320 09:12:20.833768 4903 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2ndsj" podUID="0e67af70-4211-4077-8f2b-0a00b8069e5a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 09:12:32 crc kubenswrapper[4903]: I0320 09:12:32.928476 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-545d4d4674-pnf4r_4fd09484-0087-4f85-a1e3-a67036f4cbca/cert-manager-controller/0.log" Mar 20 09:12:33 crc kubenswrapper[4903]: I0320 09:12:33.096856 4903 log.go:25] "Finished parsing log file" 
path="/var/log/pods/cert-manager_cert-manager-cainjector-5545bd876-c22w9_f9bcdff8-0f0a-4797-80e0-c3a7893223dd/cert-manager-cainjector/0.log" Mar 20 09:12:33 crc kubenswrapper[4903]: I0320 09:12:33.111643 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-6888856db4-2bhwr_ffbf5e1f-ab1e-47fe-9171-a63130e38dec/cert-manager-webhook/0.log" Mar 20 09:12:45 crc kubenswrapper[4903]: I0320 09:12:45.711179 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-86f58fcf4-f8dnk_75a183a7-68ce-40f4-b663-8d8845fedb36/nmstate-console-plugin/0.log" Mar 20 09:12:45 crc kubenswrapper[4903]: I0320 09:12:45.833215 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-nmw5s_4eb953be-dfbf-4a30-bed9-90abab5fb73c/nmstate-handler/0.log" Mar 20 09:12:45 crc kubenswrapper[4903]: I0320 09:12:45.899182 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-df94z_82dbfa21-7bc0-4311-bcb5-3f746b288130/kube-rbac-proxy/0.log" Mar 20 09:12:46 crc kubenswrapper[4903]: I0320 09:12:46.009951 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-df94z_82dbfa21-7bc0-4311-bcb5-3f746b288130/nmstate-metrics/0.log" Mar 20 09:12:46 crc kubenswrapper[4903]: I0320 09:12:46.026463 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-796d4cfff4-h2c5r_737e090b-af32-421c-badd-b1decc1ace3c/nmstate-operator/0.log" Mar 20 09:12:46 crc kubenswrapper[4903]: I0320 09:12:46.195649 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f558f5558-mxxc9_20f0bdd3-2ffc-46fa-bd7c-ed3644379f08/nmstate-webhook/0.log" Mar 20 09:12:50 crc kubenswrapper[4903]: I0320 09:12:50.833598 4903 patch_prober.go:28] interesting pod/machine-config-daemon-2ndsj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 09:12:50 crc kubenswrapper[4903]: I0320 09:12:50.833959 4903 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2ndsj" podUID="0e67af70-4211-4077-8f2b-0a00b8069e5a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 09:13:02 crc kubenswrapper[4903]: I0320 09:13:02.791609 4903 scope.go:117] "RemoveContainer" containerID="7bdfefe0f6b0876d8f983c2b3efb6d418b2a7de448d7509c621f602eb3328cbf" Mar 20 09:13:12 crc kubenswrapper[4903]: I0320 09:13:12.556770 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-kqcq5_0b93abcf-779f-44af-b566-b66a7052ceef/kube-rbac-proxy/0.log" Mar 20 09:13:12 crc kubenswrapper[4903]: I0320 09:13:12.785957 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lds62_af40b4fc-a05d-4df1-a47e-0d316a679275/cp-frr-files/0.log" Mar 20 09:13:12 crc kubenswrapper[4903]: I0320 09:13:12.859770 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-kqcq5_0b93abcf-779f-44af-b566-b66a7052ceef/controller/0.log" Mar 20 09:13:12 crc kubenswrapper[4903]: I0320 09:13:12.991979 4903 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-lds62_af40b4fc-a05d-4df1-a47e-0d316a679275/cp-metrics/0.log" Mar 20 09:13:13 crc kubenswrapper[4903]: I0320 09:13:13.003978 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lds62_af40b4fc-a05d-4df1-a47e-0d316a679275/cp-reloader/0.log" Mar 20 09:13:13 crc kubenswrapper[4903]: I0320 09:13:13.016298 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lds62_af40b4fc-a05d-4df1-a47e-0d316a679275/cp-frr-files/0.log" Mar 20 09:13:13 crc kubenswrapper[4903]: I0320 09:13:13.038216 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lds62_af40b4fc-a05d-4df1-a47e-0d316a679275/cp-reloader/0.log" Mar 20 09:13:13 crc kubenswrapper[4903]: I0320 09:13:13.180042 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lds62_af40b4fc-a05d-4df1-a47e-0d316a679275/cp-frr-files/0.log" Mar 20 09:13:13 crc kubenswrapper[4903]: I0320 09:13:13.231912 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lds62_af40b4fc-a05d-4df1-a47e-0d316a679275/cp-reloader/0.log" Mar 20 09:13:13 crc kubenswrapper[4903]: I0320 09:13:13.240767 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lds62_af40b4fc-a05d-4df1-a47e-0d316a679275/cp-metrics/0.log" Mar 20 09:13:13 crc kubenswrapper[4903]: I0320 09:13:13.251983 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lds62_af40b4fc-a05d-4df1-a47e-0d316a679275/cp-metrics/0.log" Mar 20 09:13:13 crc kubenswrapper[4903]: I0320 09:13:13.371221 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lds62_af40b4fc-a05d-4df1-a47e-0d316a679275/cp-frr-files/0.log" Mar 20 09:13:13 crc kubenswrapper[4903]: I0320 09:13:13.395846 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lds62_af40b4fc-a05d-4df1-a47e-0d316a679275/cp-reloader/0.log" Mar 20 09:13:13 crc kubenswrapper[4903]: I0320 09:13:13.397543 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lds62_af40b4fc-a05d-4df1-a47e-0d316a679275/cp-metrics/0.log" Mar 20 09:13:13 crc kubenswrapper[4903]: I0320 09:13:13.431521 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lds62_af40b4fc-a05d-4df1-a47e-0d316a679275/controller/0.log" Mar 20 09:13:13 crc kubenswrapper[4903]: I0320 09:13:13.627848 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lds62_af40b4fc-a05d-4df1-a47e-0d316a679275/kube-rbac-proxy/0.log" Mar 20 09:13:13 crc kubenswrapper[4903]: I0320 09:13:13.638730 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lds62_af40b4fc-a05d-4df1-a47e-0d316a679275/frr-metrics/0.log" Mar 20 09:13:13 crc kubenswrapper[4903]: I0320 09:13:13.654304 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lds62_af40b4fc-a05d-4df1-a47e-0d316a679275/kube-rbac-proxy-frr/0.log" Mar 20 09:13:13 crc kubenswrapper[4903]: I0320 09:13:13.796058 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lds62_af40b4fc-a05d-4df1-a47e-0d316a679275/reloader/0.log" Mar 20 09:13:13 crc kubenswrapper[4903]: I0320 09:13:13.853848 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-bcc4b6f68-k9569_f09db154-41d6-4b8f-a224-727c22b90f78/frr-k8s-webhook-server/0.log" Mar 20 
09:13:14 crc kubenswrapper[4903]: I0320 09:13:14.053234 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-64f5dc6cf7-q7h6d_a8ee6e81-8c56-4c20-a9cf-ca2b4eb7650d/manager/0.log" Mar 20 09:13:14 crc kubenswrapper[4903]: I0320 09:13:14.193849 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-5f5d9d49d6-jzxvx_a7fb8f81-902b-4e48-b0d3-a7404373c5af/webhook-server/0.log" Mar 20 09:13:14 crc kubenswrapper[4903]: I0320 09:13:14.302362 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-jws2v_5ea82220-7527-4132-bd06-3db8c79850d3/kube-rbac-proxy/0.log" Mar 20 09:13:14 crc kubenswrapper[4903]: I0320 09:13:14.794802 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lds62_af40b4fc-a05d-4df1-a47e-0d316a679275/frr/0.log" Mar 20 09:13:14 crc kubenswrapper[4903]: I0320 09:13:14.795021 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-jws2v_5ea82220-7527-4132-bd06-3db8c79850d3/speaker/0.log" Mar 20 09:13:20 crc kubenswrapper[4903]: I0320 09:13:20.834224 4903 patch_prober.go:28] interesting pod/machine-config-daemon-2ndsj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 09:13:20 crc kubenswrapper[4903]: I0320 09:13:20.835152 4903 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2ndsj" podUID="0e67af70-4211-4077-8f2b-0a00b8069e5a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 09:13:20 crc kubenswrapper[4903]: I0320 09:13:20.835245 4903 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2ndsj" Mar 20 09:13:20 crc kubenswrapper[4903]: I0320 09:13:20.836442 4903 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ab88589d91d0147a568e8b91cd68b845b771b73e7c864440e34828f1fbe98ceb"} pod="openshift-machine-config-operator/machine-config-daemon-2ndsj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 09:13:20 crc kubenswrapper[4903]: I0320 09:13:20.836566 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2ndsj" podUID="0e67af70-4211-4077-8f2b-0a00b8069e5a" containerName="machine-config-daemon" containerID="cri-o://ab88589d91d0147a568e8b91cd68b845b771b73e7c864440e34828f1fbe98ceb" gracePeriod=600 Mar 20 09:13:21 crc kubenswrapper[4903]: I0320 09:13:21.004862 4903 generic.go:334] "Generic (PLEG): container finished" podID="0e67af70-4211-4077-8f2b-0a00b8069e5a" containerID="ab88589d91d0147a568e8b91cd68b845b771b73e7c864440e34828f1fbe98ceb" exitCode=0 Mar 20 09:13:21 crc kubenswrapper[4903]: I0320 09:13:21.004969 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2ndsj" event={"ID":"0e67af70-4211-4077-8f2b-0a00b8069e5a","Type":"ContainerDied","Data":"ab88589d91d0147a568e8b91cd68b845b771b73e7c864440e34828f1fbe98ceb"} Mar 20 09:13:21 crc kubenswrapper[4903]: I0320 09:13:21.005267 
4903 scope.go:117] "RemoveContainer" containerID="05d843b29e30c16c9c72dc2383538466652c95246ca0c3fc36a2ea4621a2190e" Mar 20 09:13:22 crc kubenswrapper[4903]: I0320 09:13:22.013854 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2ndsj" event={"ID":"0e67af70-4211-4077-8f2b-0a00b8069e5a","Type":"ContainerStarted","Data":"545e88d8f513e78763e5a71448fe99f0796ece0e27da70384e801502f7a18375"} Mar 20 09:13:27 crc kubenswrapper[4903]: I0320 09:13:27.854565 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874pv47v_b982e00f-fa44-456d-8551-79654a44bfce/util/0.log" Mar 20 09:13:28 crc kubenswrapper[4903]: I0320 09:13:28.064333 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874pv47v_b982e00f-fa44-456d-8551-79654a44bfce/util/0.log" Mar 20 09:13:28 crc kubenswrapper[4903]: I0320 09:13:28.093736 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874pv47v_b982e00f-fa44-456d-8551-79654a44bfce/pull/0.log" Mar 20 09:13:28 crc kubenswrapper[4903]: I0320 09:13:28.147862 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874pv47v_b982e00f-fa44-456d-8551-79654a44bfce/pull/0.log" Mar 20 09:13:28 crc kubenswrapper[4903]: I0320 09:13:28.293307 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874pv47v_b982e00f-fa44-456d-8551-79654a44bfce/util/0.log" Mar 20 09:13:28 crc kubenswrapper[4903]: I0320 09:13:28.303337 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874pv47v_b982e00f-fa44-456d-8551-79654a44bfce/extract/0.log" Mar 20 09:13:28 crc kubenswrapper[4903]: I0320 09:13:28.303707 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874pv47v_b982e00f-fa44-456d-8551-79654a44bfce/pull/0.log" Mar 20 09:13:28 crc kubenswrapper[4903]: I0320 09:13:28.445401 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1shdbr_d285db6b-c23c-4ad7-90e8-833756b96ec5/util/0.log" Mar 20 09:13:28 crc kubenswrapper[4903]: I0320 09:13:28.681965 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1shdbr_d285db6b-c23c-4ad7-90e8-833756b96ec5/util/0.log" Mar 20 09:13:28 crc kubenswrapper[4903]: I0320 09:13:28.684892 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1shdbr_d285db6b-c23c-4ad7-90e8-833756b96ec5/pull/0.log" Mar 20 09:13:28 crc kubenswrapper[4903]: I0320 09:13:28.726661 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1shdbr_d285db6b-c23c-4ad7-90e8-833756b96ec5/pull/0.log" Mar 20 09:13:28 crc kubenswrapper[4903]: I0320 09:13:28.855732 4903 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1shdbr_d285db6b-c23c-4ad7-90e8-833756b96ec5/util/0.log" Mar 20 09:13:28 crc kubenswrapper[4903]: I0320 09:13:28.885611 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1shdbr_d285db6b-c23c-4ad7-90e8-833756b96ec5/extract/0.log" Mar 20 09:13:28 crc kubenswrapper[4903]: I0320 09:13:28.895051 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1shdbr_d285db6b-c23c-4ad7-90e8-833756b96ec5/pull/0.log" Mar 20 09:13:29 crc kubenswrapper[4903]: I0320 09:13:29.078529 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5rk9fc_00f81c35-6107-4e09-982a-ad82eef8735b/util/0.log" Mar 20 09:13:29 crc kubenswrapper[4903]: I0320 09:13:29.213051 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5rk9fc_00f81c35-6107-4e09-982a-ad82eef8735b/util/0.log" Mar 20 09:13:29 crc kubenswrapper[4903]: I0320 09:13:29.276005 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5rk9fc_00f81c35-6107-4e09-982a-ad82eef8735b/pull/0.log" Mar 20 09:13:29 crc kubenswrapper[4903]: I0320 09:13:29.276396 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5rk9fc_00f81c35-6107-4e09-982a-ad82eef8735b/pull/0.log" Mar 20 09:13:29 crc kubenswrapper[4903]: I0320 09:13:29.447810 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5rk9fc_00f81c35-6107-4e09-982a-ad82eef8735b/pull/0.log" Mar 20 09:13:29 crc kubenswrapper[4903]: I0320 09:13:29.471772 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5rk9fc_00f81c35-6107-4e09-982a-ad82eef8735b/extract/0.log" Mar 20 09:13:29 crc kubenswrapper[4903]: I0320 09:13:29.473256 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5rk9fc_00f81c35-6107-4e09-982a-ad82eef8735b/util/0.log" Mar 20 09:13:29 crc kubenswrapper[4903]: I0320 09:13:29.848893 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-bwc4v_275937e3-335c-40d7-83b3-1e8ddf7d5c2d/extract-utilities/0.log" Mar 20 09:13:29 crc kubenswrapper[4903]: I0320 09:13:29.938158 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-bwc4v_275937e3-335c-40d7-83b3-1e8ddf7d5c2d/extract-content/0.log" Mar 20 09:13:29 crc kubenswrapper[4903]: I0320 09:13:29.962330 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-bwc4v_275937e3-335c-40d7-83b3-1e8ddf7d5c2d/extract-utilities/0.log" Mar 20 09:13:29 crc kubenswrapper[4903]: I0320 09:13:29.983599 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-bwc4v_275937e3-335c-40d7-83b3-1e8ddf7d5c2d/extract-content/0.log" Mar 20 09:13:30 crc kubenswrapper[4903]: I0320 09:13:30.099529 4903 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-bwc4v_275937e3-335c-40d7-83b3-1e8ddf7d5c2d/extract-content/0.log" Mar 20 09:13:30 crc kubenswrapper[4903]: I0320 09:13:30.112421 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-bwc4v_275937e3-335c-40d7-83b3-1e8ddf7d5c2d/extract-utilities/0.log" Mar 20 09:13:30 crc kubenswrapper[4903]: I0320 09:13:30.338023 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-6lpf9_e75c6b5b-3812-4cd2-b473-b8e8bdf823f1/extract-utilities/0.log" Mar 20 09:13:30 crc kubenswrapper[4903]: I0320 09:13:30.493675 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-6lpf9_e75c6b5b-3812-4cd2-b473-b8e8bdf823f1/extract-content/0.log" Mar 20 09:13:30 crc kubenswrapper[4903]: I0320 09:13:30.553883 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-6lpf9_e75c6b5b-3812-4cd2-b473-b8e8bdf823f1/extract-content/0.log" Mar 20 09:13:30 crc kubenswrapper[4903]: I0320 09:13:30.554319 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-bwc4v_275937e3-335c-40d7-83b3-1e8ddf7d5c2d/registry-server/0.log" Mar 20 09:13:30 crc kubenswrapper[4903]: I0320 09:13:30.611051 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-6lpf9_e75c6b5b-3812-4cd2-b473-b8e8bdf823f1/extract-utilities/0.log" Mar 20 09:13:30 crc kubenswrapper[4903]: I0320 09:13:30.752273 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-6lpf9_e75c6b5b-3812-4cd2-b473-b8e8bdf823f1/extract-utilities/0.log" Mar 20 09:13:30 crc kubenswrapper[4903]: I0320 09:13:30.754404 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-6lpf9_e75c6b5b-3812-4cd2-b473-b8e8bdf823f1/extract-content/0.log" Mar 20 09:13:30 crc kubenswrapper[4903]: I0320 09:13:30.965174 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-6lpf9_e75c6b5b-3812-4cd2-b473-b8e8bdf823f1/registry-server/0.log" Mar 20 09:13:31 crc kubenswrapper[4903]: I0320 09:13:31.025592 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-4mktd_60e4db67-4f24-43be-a77c-bbf913fa9f4a/marketplace-operator/0.log" Mar 20 09:13:31 crc kubenswrapper[4903]: I0320 09:13:31.060789 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-m8hxr_9d874d53-f61f-48ae-96d8-dfab83476392/extract-utilities/0.log" Mar 20 09:13:31 crc kubenswrapper[4903]: I0320 09:13:31.241617 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-m8hxr_9d874d53-f61f-48ae-96d8-dfab83476392/extract-content/0.log" Mar 20 09:13:31 crc kubenswrapper[4903]: I0320 09:13:31.242898 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-m8hxr_9d874d53-f61f-48ae-96d8-dfab83476392/extract-content/0.log" Mar 20 09:13:31 crc kubenswrapper[4903]: I0320 09:13:31.243183 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-m8hxr_9d874d53-f61f-48ae-96d8-dfab83476392/extract-utilities/0.log" Mar 20 09:13:31 crc kubenswrapper[4903]: I0320 09:13:31.485678 4903 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-m8hxr_9d874d53-f61f-48ae-96d8-dfab83476392/extract-content/0.log" Mar 20 09:13:31 crc kubenswrapper[4903]: I0320 09:13:31.505936 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-m8hxr_9d874d53-f61f-48ae-96d8-dfab83476392/extract-utilities/0.log" Mar 20 09:13:31 crc kubenswrapper[4903]: I0320 09:13:31.587960 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-m8hxr_9d874d53-f61f-48ae-96d8-dfab83476392/registry-server/0.log" Mar 20 09:13:31 crc kubenswrapper[4903]: I0320 09:13:31.602477 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-lhxhb_0135d9df-1d61-42f8-9efe-0eb2c81e5a23/extract-utilities/0.log" Mar 20 09:13:31 crc kubenswrapper[4903]: I0320 09:13:31.789807 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-lhxhb_0135d9df-1d61-42f8-9efe-0eb2c81e5a23/extract-utilities/0.log" Mar 20 09:13:31 crc kubenswrapper[4903]: I0320 09:13:31.794918 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-lhxhb_0135d9df-1d61-42f8-9efe-0eb2c81e5a23/extract-content/0.log" Mar 20 09:13:31 crc kubenswrapper[4903]: I0320 09:13:31.845795 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-lhxhb_0135d9df-1d61-42f8-9efe-0eb2c81e5a23/extract-content/0.log" Mar 20 09:13:31 crc kubenswrapper[4903]: I0320 09:13:31.955890 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-lhxhb_0135d9df-1d61-42f8-9efe-0eb2c81e5a23/extract-content/0.log" Mar 20 09:13:31 crc kubenswrapper[4903]: I0320 09:13:31.983447 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-lhxhb_0135d9df-1d61-42f8-9efe-0eb2c81e5a23/extract-utilities/0.log" Mar 20 09:13:32 crc kubenswrapper[4903]: I0320 09:13:32.304936 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-lhxhb_0135d9df-1d61-42f8-9efe-0eb2c81e5a23/registry-server/0.log" Mar 20 09:14:00 crc kubenswrapper[4903]: I0320 09:14:00.159835 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566634-phkds"] Mar 20 09:14:00 crc kubenswrapper[4903]: E0320 09:14:00.166511 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac7501a5-11ea-42f6-99b2-5730d7ae6935" containerName="oc" Mar 20 09:14:00 crc kubenswrapper[4903]: I0320 09:14:00.166523 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac7501a5-11ea-42f6-99b2-5730d7ae6935" containerName="oc" Mar 20 09:14:00 crc kubenswrapper[4903]: I0320 09:14:00.166669 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac7501a5-11ea-42f6-99b2-5730d7ae6935" containerName="oc" Mar 20 09:14:00 crc kubenswrapper[4903]: I0320 09:14:00.167001 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566634-phkds"] Mar 20 09:14:00 crc kubenswrapper[4903]: I0320 09:14:00.167084 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566634-phkds" Mar 20 09:14:00 crc kubenswrapper[4903]: I0320 09:14:00.169762 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 09:14:00 crc kubenswrapper[4903]: I0320 09:14:00.170006 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-smc2q" Mar 20 09:14:00 crc kubenswrapper[4903]: I0320 09:14:00.170865 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 09:14:00 crc kubenswrapper[4903]: I0320 09:14:00.235781 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nkbg9\" (UniqueName: \"kubernetes.io/projected/7a8e8b94-2ebe-42b9-b0a6-49a2af403062-kube-api-access-nkbg9\") pod \"auto-csr-approver-29566634-phkds\" (UID: \"7a8e8b94-2ebe-42b9-b0a6-49a2af403062\") " pod="openshift-infra/auto-csr-approver-29566634-phkds" Mar 20 09:14:00 crc kubenswrapper[4903]: I0320 09:14:00.336902 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nkbg9\" (UniqueName: \"kubernetes.io/projected/7a8e8b94-2ebe-42b9-b0a6-49a2af403062-kube-api-access-nkbg9\") pod \"auto-csr-approver-29566634-phkds\" (UID: \"7a8e8b94-2ebe-42b9-b0a6-49a2af403062\") " pod="openshift-infra/auto-csr-approver-29566634-phkds" Mar 20 09:14:00 crc kubenswrapper[4903]: I0320 09:14:00.363942 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nkbg9\" (UniqueName: \"kubernetes.io/projected/7a8e8b94-2ebe-42b9-b0a6-49a2af403062-kube-api-access-nkbg9\") pod \"auto-csr-approver-29566634-phkds\" (UID: \"7a8e8b94-2ebe-42b9-b0a6-49a2af403062\") " pod="openshift-infra/auto-csr-approver-29566634-phkds" Mar 20 09:14:00 crc kubenswrapper[4903]: I0320 09:14:00.493499 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566634-phkds" Mar 20 09:14:00 crc kubenswrapper[4903]: I0320 09:14:00.962897 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566634-phkds"] Mar 20 09:14:01 crc kubenswrapper[4903]: I0320 09:14:01.321700 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566634-phkds" event={"ID":"7a8e8b94-2ebe-42b9-b0a6-49a2af403062","Type":"ContainerStarted","Data":"2f8358ef8a1e56c0b210c63793840d2e7d2c8a4f1c00ae4c8709ab028ad54e93"} Mar 20 09:14:02 crc kubenswrapper[4903]: I0320 09:14:02.333077 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566634-phkds" event={"ID":"7a8e8b94-2ebe-42b9-b0a6-49a2af403062","Type":"ContainerStarted","Data":"1855c9f64a5264a5de4a968f7c4e9690aa231e23dd87120b09403cccb5dc672b"} Mar 20 09:14:02 crc kubenswrapper[4903]: I0320 09:14:02.358917 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29566634-phkds" podStartSLOduration=1.470506291 podStartE2EDuration="2.358892277s" podCreationTimestamp="2026-03-20 09:14:00 +0000 UTC" firstStartedPulling="2026-03-20 09:14:00.95631505 +0000 UTC m=+3066.173215405" lastFinishedPulling="2026-03-20 09:14:01.844701066 +0000 UTC m=+3067.061601391" observedRunningTime="2026-03-20 09:14:02.34830531 +0000 UTC m=+3067.565205625" watchObservedRunningTime="2026-03-20 09:14:02.358892277 +0000 UTC m=+3067.575792632" Mar 20 09:14:03 crc kubenswrapper[4903]: I0320 09:14:03.364888 4903 generic.go:334] "Generic (PLEG): container finished" podID="7a8e8b94-2ebe-42b9-b0a6-49a2af403062" containerID="1855c9f64a5264a5de4a968f7c4e9690aa231e23dd87120b09403cccb5dc672b" exitCode=0 Mar 20 09:14:03 crc kubenswrapper[4903]: I0320 09:14:03.364961 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566634-phkds" event={"ID":"7a8e8b94-2ebe-42b9-b0a6-49a2af403062","Type":"ContainerDied","Data":"1855c9f64a5264a5de4a968f7c4e9690aa231e23dd87120b09403cccb5dc672b"} Mar 20 09:14:04 crc kubenswrapper[4903]: I0320 09:14:04.692332 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566634-phkds" Mar 20 09:14:04 crc kubenswrapper[4903]: I0320 09:14:04.821462 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nkbg9\" (UniqueName: \"kubernetes.io/projected/7a8e8b94-2ebe-42b9-b0a6-49a2af403062-kube-api-access-nkbg9\") pod \"7a8e8b94-2ebe-42b9-b0a6-49a2af403062\" (UID: \"7a8e8b94-2ebe-42b9-b0a6-49a2af403062\") " Mar 20 09:14:04 crc kubenswrapper[4903]: I0320 09:14:04.830136 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a8e8b94-2ebe-42b9-b0a6-49a2af403062-kube-api-access-nkbg9" (OuterVolumeSpecName: "kube-api-access-nkbg9") pod "7a8e8b94-2ebe-42b9-b0a6-49a2af403062" (UID: "7a8e8b94-2ebe-42b9-b0a6-49a2af403062"). InnerVolumeSpecName "kube-api-access-nkbg9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:14:04 crc kubenswrapper[4903]: I0320 09:14:04.923694 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nkbg9\" (UniqueName: \"kubernetes.io/projected/7a8e8b94-2ebe-42b9-b0a6-49a2af403062-kube-api-access-nkbg9\") on node \"crc\" DevicePath \"\"" Mar 20 09:14:05 crc kubenswrapper[4903]: I0320 09:14:05.383007 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566634-phkds" event={"ID":"7a8e8b94-2ebe-42b9-b0a6-49a2af403062","Type":"ContainerDied","Data":"2f8358ef8a1e56c0b210c63793840d2e7d2c8a4f1c00ae4c8709ab028ad54e93"} Mar 20 09:14:05 crc kubenswrapper[4903]: I0320 09:14:05.383103 4903 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2f8358ef8a1e56c0b210c63793840d2e7d2c8a4f1c00ae4c8709ab028ad54e93" Mar 20 09:14:05 crc kubenswrapper[4903]: I0320 09:14:05.383230 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566634-phkds" Mar 20 09:14:05 crc kubenswrapper[4903]: I0320 09:14:05.429396 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566628-c8zdd"] Mar 20 09:14:05 crc kubenswrapper[4903]: I0320 09:14:05.434606 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566628-c8zdd"] Mar 20 09:14:05 crc kubenswrapper[4903]: I0320 09:14:05.499429 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1e4d191c-018d-4682-96a4-0b3d308c9381" path="/var/lib/kubelet/pods/1e4d191c-018d-4682-96a4-0b3d308c9381/volumes" Mar 20 09:14:45 crc kubenswrapper[4903]: I0320 09:14:45.712134 4903 generic.go:334] "Generic (PLEG): container finished" podID="cfe75078-eb5a-474d-8e0c-1911d1fedcf1" containerID="8f58ace3463cd3ffc399526ac12d05eb1dd065c642bda87e7fd2127880f66d56" exitCode=0 Mar 20 09:14:45 crc kubenswrapper[4903]: I0320 09:14:45.712231 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4rxwf/must-gather-tq8xw" event={"ID":"cfe75078-eb5a-474d-8e0c-1911d1fedcf1","Type":"ContainerDied","Data":"8f58ace3463cd3ffc399526ac12d05eb1dd065c642bda87e7fd2127880f66d56"} Mar 20 09:14:45 crc kubenswrapper[4903]: I0320 09:14:45.713468 4903 scope.go:117] "RemoveContainer" containerID="8f58ace3463cd3ffc399526ac12d05eb1dd065c642bda87e7fd2127880f66d56" Mar 20 09:14:45 crc kubenswrapper[4903]: I0320 09:14:45.920106 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-4rxwf_must-gather-tq8xw_cfe75078-eb5a-474d-8e0c-1911d1fedcf1/gather/0.log" Mar 20 09:14:53 crc kubenswrapper[4903]: I0320 09:14:53.150622 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-4rxwf/must-gather-tq8xw"] Mar 20 09:14:53 crc kubenswrapper[4903]: I0320 09:14:53.151434 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-4rxwf/must-gather-tq8xw" podUID="cfe75078-eb5a-474d-8e0c-1911d1fedcf1" containerName="copy" containerID="cri-o://15feaccfbe6e557bbb66da93c2a6d2ee0fd2a0ab67ae9a4a9f0970437462310d" gracePeriod=2 Mar 20 09:14:53 crc kubenswrapper[4903]: I0320 09:14:53.157761 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-4rxwf/must-gather-tq8xw"] Mar 20 09:14:53 crc kubenswrapper[4903]: I0320 09:14:53.534228 4903 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-must-gather-4rxwf_must-gather-tq8xw_cfe75078-eb5a-474d-8e0c-1911d1fedcf1/copy/0.log" Mar 20 09:14:53 crc kubenswrapper[4903]: I0320 09:14:53.534996 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-4rxwf/must-gather-tq8xw" Mar 20 09:14:53 crc kubenswrapper[4903]: I0320 09:14:53.607835 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/cfe75078-eb5a-474d-8e0c-1911d1fedcf1-must-gather-output\") pod \"cfe75078-eb5a-474d-8e0c-1911d1fedcf1\" (UID: \"cfe75078-eb5a-474d-8e0c-1911d1fedcf1\") " Mar 20 09:14:53 crc kubenswrapper[4903]: I0320 09:14:53.607954 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ttsmz\" (UniqueName: \"kubernetes.io/projected/cfe75078-eb5a-474d-8e0c-1911d1fedcf1-kube-api-access-ttsmz\") pod \"cfe75078-eb5a-474d-8e0c-1911d1fedcf1\" (UID: \"cfe75078-eb5a-474d-8e0c-1911d1fedcf1\") " Mar 20 09:14:53 crc kubenswrapper[4903]: I0320 09:14:53.620261 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cfe75078-eb5a-474d-8e0c-1911d1fedcf1-kube-api-access-ttsmz" (OuterVolumeSpecName: "kube-api-access-ttsmz") pod "cfe75078-eb5a-474d-8e0c-1911d1fedcf1" (UID: "cfe75078-eb5a-474d-8e0c-1911d1fedcf1"). InnerVolumeSpecName "kube-api-access-ttsmz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:14:53 crc kubenswrapper[4903]: I0320 09:14:53.709828 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ttsmz\" (UniqueName: \"kubernetes.io/projected/cfe75078-eb5a-474d-8e0c-1911d1fedcf1-kube-api-access-ttsmz\") on node \"crc\" DevicePath \"\"" Mar 20 09:14:53 crc kubenswrapper[4903]: I0320 09:14:53.712308 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cfe75078-eb5a-474d-8e0c-1911d1fedcf1-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "cfe75078-eb5a-474d-8e0c-1911d1fedcf1" (UID: "cfe75078-eb5a-474d-8e0c-1911d1fedcf1"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 09:14:53 crc kubenswrapper[4903]: I0320 09:14:53.761293 4903 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-4rxwf_must-gather-tq8xw_cfe75078-eb5a-474d-8e0c-1911d1fedcf1/copy/0.log" Mar 20 09:14:53 crc kubenswrapper[4903]: I0320 09:14:53.761583 4903 generic.go:334] "Generic (PLEG): container finished" podID="cfe75078-eb5a-474d-8e0c-1911d1fedcf1" containerID="15feaccfbe6e557bbb66da93c2a6d2ee0fd2a0ab67ae9a4a9f0970437462310d" exitCode=143 Mar 20 09:14:53 crc kubenswrapper[4903]: I0320 09:14:53.761644 4903 scope.go:117] "RemoveContainer" containerID="15feaccfbe6e557bbb66da93c2a6d2ee0fd2a0ab67ae9a4a9f0970437462310d" Mar 20 09:14:53 crc kubenswrapper[4903]: I0320 09:14:53.761644 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-4rxwf/must-gather-tq8xw" Mar 20 09:14:53 crc kubenswrapper[4903]: I0320 09:14:53.781530 4903 scope.go:117] "RemoveContainer" containerID="8f58ace3463cd3ffc399526ac12d05eb1dd065c642bda87e7fd2127880f66d56" Mar 20 09:14:53 crc kubenswrapper[4903]: I0320 09:14:53.811094 4903 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/cfe75078-eb5a-474d-8e0c-1911d1fedcf1-must-gather-output\") on node \"crc\" DevicePath \"\"" Mar 20 09:14:53 crc kubenswrapper[4903]: I0320 09:14:53.843467 4903 scope.go:117] "RemoveContainer" containerID="15feaccfbe6e557bbb66da93c2a6d2ee0fd2a0ab67ae9a4a9f0970437462310d" Mar 20 09:14:53 crc kubenswrapper[4903]: E0320 09:14:53.843937 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"15feaccfbe6e557bbb66da93c2a6d2ee0fd2a0ab67ae9a4a9f0970437462310d\": container with ID starting with 15feaccfbe6e557bbb66da93c2a6d2ee0fd2a0ab67ae9a4a9f0970437462310d not found: ID does not exist" containerID="15feaccfbe6e557bbb66da93c2a6d2ee0fd2a0ab67ae9a4a9f0970437462310d" Mar 20 09:14:53 crc kubenswrapper[4903]: I0320 09:14:53.844008 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"15feaccfbe6e557bbb66da93c2a6d2ee0fd2a0ab67ae9a4a9f0970437462310d"} err="failed to get container status \"15feaccfbe6e557bbb66da93c2a6d2ee0fd2a0ab67ae9a4a9f0970437462310d\": rpc error: code = NotFound desc = could not find container \"15feaccfbe6e557bbb66da93c2a6d2ee0fd2a0ab67ae9a4a9f0970437462310d\": container with ID starting with 15feaccfbe6e557bbb66da93c2a6d2ee0fd2a0ab67ae9a4a9f0970437462310d not found: ID does not exist" Mar 20 09:14:53 crc kubenswrapper[4903]: I0320 09:14:53.844132 4903 scope.go:117] "RemoveContainer" containerID="8f58ace3463cd3ffc399526ac12d05eb1dd065c642bda87e7fd2127880f66d56" Mar 20 09:14:53 crc kubenswrapper[4903]: E0320 09:14:53.844464 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8f58ace3463cd3ffc399526ac12d05eb1dd065c642bda87e7fd2127880f66d56\": container with ID starting with 8f58ace3463cd3ffc399526ac12d05eb1dd065c642bda87e7fd2127880f66d56 not found: ID does not exist" containerID="8f58ace3463cd3ffc399526ac12d05eb1dd065c642bda87e7fd2127880f66d56" Mar 20 09:14:53 crc kubenswrapper[4903]: I0320 09:14:53.844498 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f58ace3463cd3ffc399526ac12d05eb1dd065c642bda87e7fd2127880f66d56"} err="failed to get container status \"8f58ace3463cd3ffc399526ac12d05eb1dd065c642bda87e7fd2127880f66d56\": rpc error: code = NotFound desc = could not find container \"8f58ace3463cd3ffc399526ac12d05eb1dd065c642bda87e7fd2127880f66d56\": container with ID starting with 8f58ace3463cd3ffc399526ac12d05eb1dd065c642bda87e7fd2127880f66d56 not found: ID does not exist" Mar 20 09:14:55 crc kubenswrapper[4903]: I0320 09:14:55.505196 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cfe75078-eb5a-474d-8e0c-1911d1fedcf1" path="/var/lib/kubelet/pods/cfe75078-eb5a-474d-8e0c-1911d1fedcf1/volumes" Mar 20 09:15:00 crc kubenswrapper[4903]: I0320 09:15:00.143263 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566635-v9wmm"] Mar 20 09:15:00 crc kubenswrapper[4903]: E0320 09:15:00.144010 4903 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="cfe75078-eb5a-474d-8e0c-1911d1fedcf1" containerName="gather" Mar 20 09:15:00 crc kubenswrapper[4903]: I0320 09:15:00.144023 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfe75078-eb5a-474d-8e0c-1911d1fedcf1" containerName="gather" Mar 20 09:15:00 crc kubenswrapper[4903]: E0320 09:15:00.144080 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a8e8b94-2ebe-42b9-b0a6-49a2af403062" containerName="oc" Mar 20 09:15:00 crc kubenswrapper[4903]: I0320 09:15:00.144087 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a8e8b94-2ebe-42b9-b0a6-49a2af403062" containerName="oc" Mar 20 09:15:00 crc kubenswrapper[4903]: E0320 09:15:00.144099 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfe75078-eb5a-474d-8e0c-1911d1fedcf1" containerName="copy" Mar 20 09:15:00 crc kubenswrapper[4903]: I0320 09:15:00.144106 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfe75078-eb5a-474d-8e0c-1911d1fedcf1" containerName="copy" Mar 20 09:15:00 crc kubenswrapper[4903]: I0320 09:15:00.144222 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="cfe75078-eb5a-474d-8e0c-1911d1fedcf1" containerName="gather" Mar 20 09:15:00 crc kubenswrapper[4903]: I0320 09:15:00.144236 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a8e8b94-2ebe-42b9-b0a6-49a2af403062" containerName="oc" Mar 20 09:15:00 crc kubenswrapper[4903]: I0320 09:15:00.144245 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="cfe75078-eb5a-474d-8e0c-1911d1fedcf1" containerName="copy" Mar 20 09:15:00 crc kubenswrapper[4903]: I0320 09:15:00.144688 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566635-v9wmm" Mar 20 09:15:00 crc kubenswrapper[4903]: I0320 09:15:00.146995 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 20 09:15:00 crc kubenswrapper[4903]: I0320 09:15:00.147278 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 20 09:15:00 crc kubenswrapper[4903]: I0320 09:15:00.157230 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566635-v9wmm"] Mar 20 09:15:00 crc kubenswrapper[4903]: I0320 09:15:00.201675 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gvzvt\" (UniqueName: \"kubernetes.io/projected/1deac451-e2e8-415b-8c88-6c28e7922bd8-kube-api-access-gvzvt\") pod \"collect-profiles-29566635-v9wmm\" (UID: \"1deac451-e2e8-415b-8c88-6c28e7922bd8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566635-v9wmm" Mar 20 09:15:00 crc kubenswrapper[4903]: I0320 09:15:00.201743 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1deac451-e2e8-415b-8c88-6c28e7922bd8-secret-volume\") pod \"collect-profiles-29566635-v9wmm\" (UID: \"1deac451-e2e8-415b-8c88-6c28e7922bd8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566635-v9wmm" Mar 20 09:15:00 crc kubenswrapper[4903]: I0320 09:15:00.201859 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/1deac451-e2e8-415b-8c88-6c28e7922bd8-config-volume\") pod \"collect-profiles-29566635-v9wmm\" (UID: \"1deac451-e2e8-415b-8c88-6c28e7922bd8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566635-v9wmm" Mar 20 09:15:00 crc kubenswrapper[4903]: I0320 09:15:00.303460 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gvzvt\" (UniqueName: \"kubernetes.io/projected/1deac451-e2e8-415b-8c88-6c28e7922bd8-kube-api-access-gvzvt\") pod \"collect-profiles-29566635-v9wmm\" (UID: \"1deac451-e2e8-415b-8c88-6c28e7922bd8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566635-v9wmm" Mar 20 09:15:00 crc kubenswrapper[4903]: I0320 09:15:00.303540 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1deac451-e2e8-415b-8c88-6c28e7922bd8-secret-volume\") pod \"collect-profiles-29566635-v9wmm\" (UID: \"1deac451-e2e8-415b-8c88-6c28e7922bd8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566635-v9wmm" Mar 20 09:15:00 crc kubenswrapper[4903]: I0320 09:15:00.303603 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1deac451-e2e8-415b-8c88-6c28e7922bd8-config-volume\") pod \"collect-profiles-29566635-v9wmm\" (UID: \"1deac451-e2e8-415b-8c88-6c28e7922bd8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566635-v9wmm" Mar 20 09:15:00 crc kubenswrapper[4903]: I0320 09:15:00.305289 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1deac451-e2e8-415b-8c88-6c28e7922bd8-config-volume\") pod \"collect-profiles-29566635-v9wmm\" (UID: \"1deac451-e2e8-415b-8c88-6c28e7922bd8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566635-v9wmm" Mar 20 09:15:00 crc kubenswrapper[4903]: I0320 09:15:00.321109 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1deac451-e2e8-415b-8c88-6c28e7922bd8-secret-volume\") pod \"collect-profiles-29566635-v9wmm\" (UID: \"1deac451-e2e8-415b-8c88-6c28e7922bd8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566635-v9wmm" Mar 20 09:15:00 crc kubenswrapper[4903]: I0320 09:15:00.334396 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gvzvt\" (UniqueName: \"kubernetes.io/projected/1deac451-e2e8-415b-8c88-6c28e7922bd8-kube-api-access-gvzvt\") pod \"collect-profiles-29566635-v9wmm\" (UID: \"1deac451-e2e8-415b-8c88-6c28e7922bd8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566635-v9wmm" Mar 20 09:15:00 crc kubenswrapper[4903]: I0320 09:15:00.462396 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566635-v9wmm" Mar 20 09:15:00 crc kubenswrapper[4903]: I0320 09:15:00.898260 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566635-v9wmm"] Mar 20 09:15:01 crc kubenswrapper[4903]: I0320 09:15:01.833347 4903 generic.go:334] "Generic (PLEG): container finished" podID="1deac451-e2e8-415b-8c88-6c28e7922bd8" containerID="0ca56767d9c39b5508d75edca4a4d9b00147d507e1f753a526845f74c3658e16" exitCode=0 Mar 20 09:15:01 crc kubenswrapper[4903]: I0320 09:15:01.833397 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566635-v9wmm" event={"ID":"1deac451-e2e8-415b-8c88-6c28e7922bd8","Type":"ContainerDied","Data":"0ca56767d9c39b5508d75edca4a4d9b00147d507e1f753a526845f74c3658e16"} Mar 20 09:15:01 crc kubenswrapper[4903]: I0320 09:15:01.833644 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566635-v9wmm" event={"ID":"1deac451-e2e8-415b-8c88-6c28e7922bd8","Type":"ContainerStarted","Data":"38ae7426915d54b468616c2fec3519de0a0d488b897251f109e868f23530bee0"} Mar 20 09:15:02 crc kubenswrapper[4903]: I0320 09:15:02.568210 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-t8h46"] Mar 20 09:15:02 crc kubenswrapper[4903]: I0320 09:15:02.569961 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-t8h46" Mar 20 09:15:02 crc kubenswrapper[4903]: I0320 09:15:02.588693 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-t8h46"] Mar 20 09:15:02 crc kubenswrapper[4903]: I0320 09:15:02.648212 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f31b1c28-b1bd-47c5-8911-db9b91611f3c-utilities\") pod \"redhat-marketplace-t8h46\" (UID: \"f31b1c28-b1bd-47c5-8911-db9b91611f3c\") " pod="openshift-marketplace/redhat-marketplace-t8h46" Mar 20 09:15:02 crc kubenswrapper[4903]: I0320 09:15:02.648421 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-szqg8\" (UniqueName: \"kubernetes.io/projected/f31b1c28-b1bd-47c5-8911-db9b91611f3c-kube-api-access-szqg8\") pod \"redhat-marketplace-t8h46\" (UID: \"f31b1c28-b1bd-47c5-8911-db9b91611f3c\") " pod="openshift-marketplace/redhat-marketplace-t8h46" Mar 20 09:15:02 crc kubenswrapper[4903]: I0320 09:15:02.648480 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f31b1c28-b1bd-47c5-8911-db9b91611f3c-catalog-content\") pod \"redhat-marketplace-t8h46\" (UID: \"f31b1c28-b1bd-47c5-8911-db9b91611f3c\") " pod="openshift-marketplace/redhat-marketplace-t8h46" Mar 20 09:15:02 crc kubenswrapper[4903]: I0320 09:15:02.749929 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f31b1c28-b1bd-47c5-8911-db9b91611f3c-utilities\") pod \"redhat-marketplace-t8h46\" (UID: \"f31b1c28-b1bd-47c5-8911-db9b91611f3c\") " pod="openshift-marketplace/redhat-marketplace-t8h46" Mar 20 09:15:02 crc kubenswrapper[4903]: I0320 09:15:02.750010 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-szqg8\" (UniqueName: \"kubernetes.io/projected/f31b1c28-b1bd-47c5-8911-db9b91611f3c-kube-api-access-szqg8\") pod \"redhat-marketplace-t8h46\" (UID: \"f31b1c28-b1bd-47c5-8911-db9b91611f3c\") " pod="openshift-marketplace/redhat-marketplace-t8h46" Mar 20 09:15:02 crc kubenswrapper[4903]: I0320 09:15:02.750054 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f31b1c28-b1bd-47c5-8911-db9b91611f3c-catalog-content\") pod \"redhat-marketplace-t8h46\" (UID: \"f31b1c28-b1bd-47c5-8911-db9b91611f3c\") " pod="openshift-marketplace/redhat-marketplace-t8h46" Mar 20 09:15:02 crc kubenswrapper[4903]: I0320 09:15:02.750522 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f31b1c28-b1bd-47c5-8911-db9b91611f3c-catalog-content\") pod \"redhat-marketplace-t8h46\" (UID: \"f31b1c28-b1bd-47c5-8911-db9b91611f3c\") " pod="openshift-marketplace/redhat-marketplace-t8h46" Mar 20 09:15:02 crc kubenswrapper[4903]: I0320 09:15:02.750806 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f31b1c28-b1bd-47c5-8911-db9b91611f3c-utilities\") pod \"redhat-marketplace-t8h46\" (UID: \"f31b1c28-b1bd-47c5-8911-db9b91611f3c\") " pod="openshift-marketplace/redhat-marketplace-t8h46" Mar 20 09:15:02 crc kubenswrapper[4903]: I0320 09:15:02.772556 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-szqg8\" (UniqueName: \"kubernetes.io/projected/f31b1c28-b1bd-47c5-8911-db9b91611f3c-kube-api-access-szqg8\") pod \"redhat-marketplace-t8h46\" (UID: \"f31b1c28-b1bd-47c5-8911-db9b91611f3c\") " pod="openshift-marketplace/redhat-marketplace-t8h46" Mar 20 09:15:02 crc kubenswrapper[4903]: I0320 09:15:02.871394 4903 scope.go:117] "RemoveContainer" containerID="c4c94bc6ed94a6a2562ff30fa634fca21d8ee563a312943a99b1d6594eaa28fe" Mar 20 09:15:02 crc kubenswrapper[4903]: I0320 09:15:02.890599 4903 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-t8h46" Mar 20 09:15:03 crc kubenswrapper[4903]: I0320 09:15:03.198650 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566635-v9wmm" Mar 20 09:15:03 crc kubenswrapper[4903]: I0320 09:15:03.261210 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1deac451-e2e8-415b-8c88-6c28e7922bd8-secret-volume\") pod \"1deac451-e2e8-415b-8c88-6c28e7922bd8\" (UID: \"1deac451-e2e8-415b-8c88-6c28e7922bd8\") " Mar 20 09:15:03 crc kubenswrapper[4903]: I0320 09:15:03.261250 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gvzvt\" (UniqueName: \"kubernetes.io/projected/1deac451-e2e8-415b-8c88-6c28e7922bd8-kube-api-access-gvzvt\") pod \"1deac451-e2e8-415b-8c88-6c28e7922bd8\" (UID: \"1deac451-e2e8-415b-8c88-6c28e7922bd8\") " Mar 20 09:15:03 crc kubenswrapper[4903]: I0320 09:15:03.261337 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1deac451-e2e8-415b-8c88-6c28e7922bd8-config-volume\") pod \"1deac451-e2e8-415b-8c88-6c28e7922bd8\" (UID: \"1deac451-e2e8-415b-8c88-6c28e7922bd8\") " Mar 20 09:15:03 crc kubenswrapper[4903]: I0320 09:15:03.262084 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1deac451-e2e8-415b-8c88-6c28e7922bd8-config-volume" (OuterVolumeSpecName: "config-volume") pod "1deac451-e2e8-415b-8c88-6c28e7922bd8" (UID: "1deac451-e2e8-415b-8c88-6c28e7922bd8"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 09:15:03 crc kubenswrapper[4903]: I0320 09:15:03.265911 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1deac451-e2e8-415b-8c88-6c28e7922bd8-kube-api-access-gvzvt" (OuterVolumeSpecName: "kube-api-access-gvzvt") pod "1deac451-e2e8-415b-8c88-6c28e7922bd8" (UID: "1deac451-e2e8-415b-8c88-6c28e7922bd8"). InnerVolumeSpecName "kube-api-access-gvzvt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:15:03 crc kubenswrapper[4903]: I0320 09:15:03.265838 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1deac451-e2e8-415b-8c88-6c28e7922bd8-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "1deac451-e2e8-415b-8c88-6c28e7922bd8" (UID: "1deac451-e2e8-415b-8c88-6c28e7922bd8"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:15:03 crc kubenswrapper[4903]: I0320 09:15:03.363106 4903 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1deac451-e2e8-415b-8c88-6c28e7922bd8-config-volume\") on node \"crc\" DevicePath \"\"" Mar 20 09:15:03 crc kubenswrapper[4903]: I0320 09:15:03.363138 4903 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1deac451-e2e8-415b-8c88-6c28e7922bd8-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 20 09:15:03 crc kubenswrapper[4903]: I0320 09:15:03.363148 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gvzvt\" (UniqueName: \"kubernetes.io/projected/1deac451-e2e8-415b-8c88-6c28e7922bd8-kube-api-access-gvzvt\") on node \"crc\" DevicePath \"\"" Mar 20 09:15:03 crc kubenswrapper[4903]: I0320 09:15:03.475653 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-t8h46"] Mar 20 09:15:03 crc kubenswrapper[4903]: W0320 09:15:03.482757 4903 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf31b1c28_b1bd_47c5_8911_db9b91611f3c.slice/crio-9dd11d7e99cc517105cae3c479b0e273df1140952449d5ed210f91ed607fd592 WatchSource:0}: Error finding container 9dd11d7e99cc517105cae3c479b0e273df1140952449d5ed210f91ed607fd592: Status 404 returned error can't find the container with id 9dd11d7e99cc517105cae3c479b0e273df1140952449d5ed210f91ed607fd592 Mar 20 09:15:03 crc kubenswrapper[4903]: I0320 09:15:03.849198 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566635-v9wmm" Mar 20 09:15:03 crc kubenswrapper[4903]: I0320 09:15:03.849556 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566635-v9wmm" event={"ID":"1deac451-e2e8-415b-8c88-6c28e7922bd8","Type":"ContainerDied","Data":"38ae7426915d54b468616c2fec3519de0a0d488b897251f109e868f23530bee0"} Mar 20 09:15:03 crc kubenswrapper[4903]: I0320 09:15:03.849612 4903 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="38ae7426915d54b468616c2fec3519de0a0d488b897251f109e868f23530bee0" Mar 20 09:15:03 crc kubenswrapper[4903]: I0320 09:15:03.852612 4903 generic.go:334] "Generic (PLEG): container finished" podID="f31b1c28-b1bd-47c5-8911-db9b91611f3c" containerID="3058f2ca936342f948ac0d18fd9270296470fd6ad800f4a7e102701df7eb0c40" exitCode=0 Mar 20 09:15:03 crc kubenswrapper[4903]: I0320 09:15:03.852670 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t8h46" event={"ID":"f31b1c28-b1bd-47c5-8911-db9b91611f3c","Type":"ContainerDied","Data":"3058f2ca936342f948ac0d18fd9270296470fd6ad800f4a7e102701df7eb0c40"} Mar 20 09:15:03 crc kubenswrapper[4903]: I0320 09:15:03.852705 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t8h46" event={"ID":"f31b1c28-b1bd-47c5-8911-db9b91611f3c","Type":"ContainerStarted","Data":"9dd11d7e99cc517105cae3c479b0e273df1140952449d5ed210f91ed607fd592"} Mar 20 09:15:04 crc kubenswrapper[4903]: I0320 09:15:04.294632 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566590-kcdfk"] Mar 20 09:15:04 crc kubenswrapper[4903]: I0320 09:15:04.299879 4903 kubelet.go:2431] "SyncLoop REMOVE" 
source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566590-kcdfk"] Mar 20 09:15:04 crc kubenswrapper[4903]: I0320 09:15:04.864090 4903 generic.go:334] "Generic (PLEG): container finished" podID="f31b1c28-b1bd-47c5-8911-db9b91611f3c" containerID="bc8834bd0862dc85fd4ab06c76fd35cec6cdfcaf51c0ee03fda3b7685e612ecb" exitCode=0 Mar 20 09:15:04 crc kubenswrapper[4903]: I0320 09:15:04.864210 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t8h46" event={"ID":"f31b1c28-b1bd-47c5-8911-db9b91611f3c","Type":"ContainerDied","Data":"bc8834bd0862dc85fd4ab06c76fd35cec6cdfcaf51c0ee03fda3b7685e612ecb"} Mar 20 09:15:05 crc kubenswrapper[4903]: I0320 09:15:05.509648 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3e7d925a-a144-49cd-a061-95a3041145b9" path="/var/lib/kubelet/pods/3e7d925a-a144-49cd-a061-95a3041145b9/volumes" Mar 20 09:15:05 crc kubenswrapper[4903]: I0320 09:15:05.874692 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t8h46" event={"ID":"f31b1c28-b1bd-47c5-8911-db9b91611f3c","Type":"ContainerStarted","Data":"40d7aa3fcb12a7ed0b6617d03ae55bea7641b2bdb0a761659174f37656003ea7"} Mar 20 09:15:05 crc kubenswrapper[4903]: I0320 09:15:05.897323 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-t8h46" podStartSLOduration=2.456195219 podStartE2EDuration="3.897303658s" podCreationTimestamp="2026-03-20 09:15:02 +0000 UTC" firstStartedPulling="2026-03-20 09:15:03.853953271 +0000 UTC m=+3129.070853586" lastFinishedPulling="2026-03-20 09:15:05.29506169 +0000 UTC m=+3130.511962025" observedRunningTime="2026-03-20 09:15:05.897184316 +0000 UTC m=+3131.114084661" watchObservedRunningTime="2026-03-20 09:15:05.897303658 +0000 UTC m=+3131.114203983" Mar 20 09:15:12 crc kubenswrapper[4903]: I0320 09:15:12.891074 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-t8h46" Mar 20 09:15:12 crc kubenswrapper[4903]: I0320 09:15:12.891759 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-t8h46" Mar 20 09:15:12 crc kubenswrapper[4903]: I0320 09:15:12.960078 4903 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-t8h46" Mar 20 09:15:13 crc kubenswrapper[4903]: I0320 09:15:13.032238 4903 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-t8h46" Mar 20 09:15:13 crc kubenswrapper[4903]: I0320 09:15:13.199547 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-t8h46"] Mar 20 09:15:14 crc kubenswrapper[4903]: I0320 09:15:14.937097 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-t8h46" podUID="f31b1c28-b1bd-47c5-8911-db9b91611f3c" containerName="registry-server" containerID="cri-o://40d7aa3fcb12a7ed0b6617d03ae55bea7641b2bdb0a761659174f37656003ea7" gracePeriod=2 Mar 20 09:15:15 crc kubenswrapper[4903]: I0320 09:15:15.892254 4903 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-t8h46" Mar 20 09:15:15 crc kubenswrapper[4903]: I0320 09:15:15.947453 4903 generic.go:334] "Generic (PLEG): container finished" podID="f31b1c28-b1bd-47c5-8911-db9b91611f3c" containerID="40d7aa3fcb12a7ed0b6617d03ae55bea7641b2bdb0a761659174f37656003ea7" exitCode=0 Mar 20 09:15:15 crc kubenswrapper[4903]: I0320 09:15:15.947538 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t8h46" event={"ID":"f31b1c28-b1bd-47c5-8911-db9b91611f3c","Type":"ContainerDied","Data":"40d7aa3fcb12a7ed0b6617d03ae55bea7641b2bdb0a761659174f37656003ea7"} Mar 20 09:15:15 crc kubenswrapper[4903]: I0320 09:15:15.947627 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t8h46" event={"ID":"f31b1c28-b1bd-47c5-8911-db9b91611f3c","Type":"ContainerDied","Data":"9dd11d7e99cc517105cae3c479b0e273df1140952449d5ed210f91ed607fd592"} Mar 20 09:15:15 crc kubenswrapper[4903]: I0320 09:15:15.947652 4903 scope.go:117] "RemoveContainer" containerID="40d7aa3fcb12a7ed0b6617d03ae55bea7641b2bdb0a761659174f37656003ea7" Mar 20 09:15:15 crc kubenswrapper[4903]: I0320 09:15:15.947551 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-t8h46" Mar 20 09:15:15 crc kubenswrapper[4903]: I0320 09:15:15.964900 4903 scope.go:117] "RemoveContainer" containerID="bc8834bd0862dc85fd4ab06c76fd35cec6cdfcaf51c0ee03fda3b7685e612ecb" Mar 20 09:15:15 crc kubenswrapper[4903]: I0320 09:15:15.983905 4903 scope.go:117] "RemoveContainer" containerID="3058f2ca936342f948ac0d18fd9270296470fd6ad800f4a7e102701df7eb0c40" Mar 20 09:15:16 crc kubenswrapper[4903]: I0320 09:15:16.005047 4903 scope.go:117] "RemoveContainer" containerID="40d7aa3fcb12a7ed0b6617d03ae55bea7641b2bdb0a761659174f37656003ea7" Mar 20 09:15:16 crc kubenswrapper[4903]: E0320 09:15:16.005814 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"40d7aa3fcb12a7ed0b6617d03ae55bea7641b2bdb0a761659174f37656003ea7\": container with ID starting with 40d7aa3fcb12a7ed0b6617d03ae55bea7641b2bdb0a761659174f37656003ea7 not found: ID does not exist" containerID="40d7aa3fcb12a7ed0b6617d03ae55bea7641b2bdb0a761659174f37656003ea7" Mar 20 09:15:16 crc kubenswrapper[4903]: I0320 09:15:16.005944 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"40d7aa3fcb12a7ed0b6617d03ae55bea7641b2bdb0a761659174f37656003ea7"} err="failed to get container status \"40d7aa3fcb12a7ed0b6617d03ae55bea7641b2bdb0a761659174f37656003ea7\": rpc error: code = NotFound desc = could not find container \"40d7aa3fcb12a7ed0b6617d03ae55bea7641b2bdb0a761659174f37656003ea7\": container with ID starting with 40d7aa3fcb12a7ed0b6617d03ae55bea7641b2bdb0a761659174f37656003ea7 not found: ID does not exist" Mar 20 09:15:16 crc kubenswrapper[4903]: I0320 09:15:16.006079 4903 scope.go:117] "RemoveContainer" containerID="bc8834bd0862dc85fd4ab06c76fd35cec6cdfcaf51c0ee03fda3b7685e612ecb" Mar 20 09:15:16 crc kubenswrapper[4903]: E0320 09:15:16.006551 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bc8834bd0862dc85fd4ab06c76fd35cec6cdfcaf51c0ee03fda3b7685e612ecb\": container with ID starting with bc8834bd0862dc85fd4ab06c76fd35cec6cdfcaf51c0ee03fda3b7685e612ecb not found: ID does not exist" 
containerID="bc8834bd0862dc85fd4ab06c76fd35cec6cdfcaf51c0ee03fda3b7685e612ecb" Mar 20 09:15:16 crc kubenswrapper[4903]: I0320 09:15:16.006584 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc8834bd0862dc85fd4ab06c76fd35cec6cdfcaf51c0ee03fda3b7685e612ecb"} err="failed to get container status \"bc8834bd0862dc85fd4ab06c76fd35cec6cdfcaf51c0ee03fda3b7685e612ecb\": rpc error: code = NotFound desc = could not find container \"bc8834bd0862dc85fd4ab06c76fd35cec6cdfcaf51c0ee03fda3b7685e612ecb\": container with ID starting with bc8834bd0862dc85fd4ab06c76fd35cec6cdfcaf51c0ee03fda3b7685e612ecb not found: ID does not exist" Mar 20 09:15:16 crc kubenswrapper[4903]: I0320 09:15:16.006600 4903 scope.go:117] "RemoveContainer" containerID="3058f2ca936342f948ac0d18fd9270296470fd6ad800f4a7e102701df7eb0c40" Mar 20 09:15:16 crc kubenswrapper[4903]: E0320 09:15:16.006864 4903 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3058f2ca936342f948ac0d18fd9270296470fd6ad800f4a7e102701df7eb0c40\": container with ID starting with 3058f2ca936342f948ac0d18fd9270296470fd6ad800f4a7e102701df7eb0c40 not found: ID does not exist" containerID="3058f2ca936342f948ac0d18fd9270296470fd6ad800f4a7e102701df7eb0c40" Mar 20 09:15:16 crc kubenswrapper[4903]: I0320 09:15:16.006882 4903 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3058f2ca936342f948ac0d18fd9270296470fd6ad800f4a7e102701df7eb0c40"} err="failed to get container status \"3058f2ca936342f948ac0d18fd9270296470fd6ad800f4a7e102701df7eb0c40\": rpc error: code = NotFound desc = could not find container \"3058f2ca936342f948ac0d18fd9270296470fd6ad800f4a7e102701df7eb0c40\": container with ID starting with 3058f2ca936342f948ac0d18fd9270296470fd6ad800f4a7e102701df7eb0c40 not found: ID does not exist" Mar 20 09:15:16 crc kubenswrapper[4903]: I0320 09:15:16.047526 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f31b1c28-b1bd-47c5-8911-db9b91611f3c-catalog-content\") pod \"f31b1c28-b1bd-47c5-8911-db9b91611f3c\" (UID: \"f31b1c28-b1bd-47c5-8911-db9b91611f3c\") " Mar 20 09:15:16 crc kubenswrapper[4903]: I0320 09:15:16.047899 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f31b1c28-b1bd-47c5-8911-db9b91611f3c-utilities\") pod \"f31b1c28-b1bd-47c5-8911-db9b91611f3c\" (UID: \"f31b1c28-b1bd-47c5-8911-db9b91611f3c\") " Mar 20 09:15:16 crc kubenswrapper[4903]: I0320 09:15:16.047941 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-szqg8\" (UniqueName: \"kubernetes.io/projected/f31b1c28-b1bd-47c5-8911-db9b91611f3c-kube-api-access-szqg8\") pod \"f31b1c28-b1bd-47c5-8911-db9b91611f3c\" (UID: \"f31b1c28-b1bd-47c5-8911-db9b91611f3c\") " Mar 20 09:15:16 crc kubenswrapper[4903]: I0320 09:15:16.048832 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f31b1c28-b1bd-47c5-8911-db9b91611f3c-utilities" (OuterVolumeSpecName: "utilities") pod "f31b1c28-b1bd-47c5-8911-db9b91611f3c" (UID: "f31b1c28-b1bd-47c5-8911-db9b91611f3c"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 09:15:16 crc kubenswrapper[4903]: I0320 09:15:16.050513 4903 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f31b1c28-b1bd-47c5-8911-db9b91611f3c-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 09:15:16 crc kubenswrapper[4903]: I0320 09:15:16.056184 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f31b1c28-b1bd-47c5-8911-db9b91611f3c-kube-api-access-szqg8" (OuterVolumeSpecName: "kube-api-access-szqg8") pod "f31b1c28-b1bd-47c5-8911-db9b91611f3c" (UID: "f31b1c28-b1bd-47c5-8911-db9b91611f3c"). InnerVolumeSpecName "kube-api-access-szqg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:15:16 crc kubenswrapper[4903]: I0320 09:15:16.082868 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f31b1c28-b1bd-47c5-8911-db9b91611f3c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f31b1c28-b1bd-47c5-8911-db9b91611f3c" (UID: "f31b1c28-b1bd-47c5-8911-db9b91611f3c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 09:15:16 crc kubenswrapper[4903]: I0320 09:15:16.152187 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-szqg8\" (UniqueName: \"kubernetes.io/projected/f31b1c28-b1bd-47c5-8911-db9b91611f3c-kube-api-access-szqg8\") on node \"crc\" DevicePath \"\"" Mar 20 09:15:16 crc kubenswrapper[4903]: I0320 09:15:16.152223 4903 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f31b1c28-b1bd-47c5-8911-db9b91611f3c-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 09:15:16 crc kubenswrapper[4903]: I0320 09:15:16.283361 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-t8h46"] Mar 20 09:15:16 crc kubenswrapper[4903]: I0320 09:15:16.290665 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-t8h46"] Mar 20 09:15:17 crc kubenswrapper[4903]: I0320 09:15:17.504508 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f31b1c28-b1bd-47c5-8911-db9b91611f3c" path="/var/lib/kubelet/pods/f31b1c28-b1bd-47c5-8911-db9b91611f3c/volumes" Mar 20 09:15:50 crc kubenswrapper[4903]: I0320 09:15:50.833912 4903 patch_prober.go:28] interesting pod/machine-config-daemon-2ndsj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 09:15:50 crc kubenswrapper[4903]: I0320 09:15:50.834918 4903 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2ndsj" podUID="0e67af70-4211-4077-8f2b-0a00b8069e5a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 09:16:00 crc kubenswrapper[4903]: I0320 09:16:00.196811 4903 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566636-c8s2p"] Mar 20 09:16:00 crc kubenswrapper[4903]: E0320 09:16:00.199060 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f31b1c28-b1bd-47c5-8911-db9b91611f3c" containerName="extract-content" Mar 20 09:16:00 crc kubenswrapper[4903]: I0320 09:16:00.199205 
4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="f31b1c28-b1bd-47c5-8911-db9b91611f3c" containerName="extract-content" Mar 20 09:16:00 crc kubenswrapper[4903]: E0320 09:16:00.199341 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f31b1c28-b1bd-47c5-8911-db9b91611f3c" containerName="extract-utilities" Mar 20 09:16:00 crc kubenswrapper[4903]: I0320 09:16:00.199462 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="f31b1c28-b1bd-47c5-8911-db9b91611f3c" containerName="extract-utilities" Mar 20 09:16:00 crc kubenswrapper[4903]: E0320 09:16:00.199599 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f31b1c28-b1bd-47c5-8911-db9b91611f3c" containerName="registry-server" Mar 20 09:16:00 crc kubenswrapper[4903]: I0320 09:16:00.199717 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="f31b1c28-b1bd-47c5-8911-db9b91611f3c" containerName="registry-server" Mar 20 09:16:00 crc kubenswrapper[4903]: E0320 09:16:00.199850 4903 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1deac451-e2e8-415b-8c88-6c28e7922bd8" containerName="collect-profiles" Mar 20 09:16:00 crc kubenswrapper[4903]: I0320 09:16:00.199973 4903 state_mem.go:107] "Deleted CPUSet assignment" podUID="1deac451-e2e8-415b-8c88-6c28e7922bd8" containerName="collect-profiles" Mar 20 09:16:00 crc kubenswrapper[4903]: I0320 09:16:00.200350 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="1deac451-e2e8-415b-8c88-6c28e7922bd8" containerName="collect-profiles" Mar 20 09:16:00 crc kubenswrapper[4903]: I0320 09:16:00.200509 4903 memory_manager.go:354] "RemoveStaleState removing state" podUID="f31b1c28-b1bd-47c5-8911-db9b91611f3c" containerName="registry-server" Mar 20 09:16:00 crc kubenswrapper[4903]: I0320 09:16:00.201354 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566636-c8s2p" Mar 20 09:16:00 crc kubenswrapper[4903]: I0320 09:16:00.205776 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 09:16:00 crc kubenswrapper[4903]: I0320 09:16:00.218957 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566636-c8s2p"] Mar 20 09:16:00 crc kubenswrapper[4903]: I0320 09:16:00.235094 4903 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-smc2q" Mar 20 09:16:00 crc kubenswrapper[4903]: I0320 09:16:00.235321 4903 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 09:16:00 crc kubenswrapper[4903]: I0320 09:16:00.341189 4903 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fxxjm\" (UniqueName: \"kubernetes.io/projected/b5ff2924-5ddc-410b-acc0-3b70a7ee24ad-kube-api-access-fxxjm\") pod \"auto-csr-approver-29566636-c8s2p\" (UID: \"b5ff2924-5ddc-410b-acc0-3b70a7ee24ad\") " pod="openshift-infra/auto-csr-approver-29566636-c8s2p" Mar 20 09:16:00 crc kubenswrapper[4903]: I0320 09:16:00.442548 4903 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fxxjm\" (UniqueName: \"kubernetes.io/projected/b5ff2924-5ddc-410b-acc0-3b70a7ee24ad-kube-api-access-fxxjm\") pod \"auto-csr-approver-29566636-c8s2p\" (UID: \"b5ff2924-5ddc-410b-acc0-3b70a7ee24ad\") " pod="openshift-infra/auto-csr-approver-29566636-c8s2p" Mar 20 09:16:00 crc kubenswrapper[4903]: I0320 09:16:00.473410 4903 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fxxjm\" (UniqueName: \"kubernetes.io/projected/b5ff2924-5ddc-410b-acc0-3b70a7ee24ad-kube-api-access-fxxjm\") pod \"auto-csr-approver-29566636-c8s2p\" (UID: \"b5ff2924-5ddc-410b-acc0-3b70a7ee24ad\") " pod="openshift-infra/auto-csr-approver-29566636-c8s2p" Mar 20 09:16:00 crc kubenswrapper[4903]: I0320 09:16:00.554815 4903 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566636-c8s2p" Mar 20 09:16:01 crc kubenswrapper[4903]: I0320 09:16:01.047317 4903 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566636-c8s2p"] Mar 20 09:16:01 crc kubenswrapper[4903]: I0320 09:16:01.063595 4903 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 09:16:01 crc kubenswrapper[4903]: I0320 09:16:01.343755 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566636-c8s2p" event={"ID":"b5ff2924-5ddc-410b-acc0-3b70a7ee24ad","Type":"ContainerStarted","Data":"78987214ba70ce29158487aab312489472078764712f8f630a03ffc9cd10baee"} Mar 20 09:16:02 crc kubenswrapper[4903]: I0320 09:16:02.353993 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566636-c8s2p" event={"ID":"b5ff2924-5ddc-410b-acc0-3b70a7ee24ad","Type":"ContainerStarted","Data":"ede8b97b2a5f1758999bfaf168b6a79f4649c9eff36dfe580e8807b3bcf17252"} Mar 20 09:16:02 crc kubenswrapper[4903]: I0320 09:16:02.376341 4903 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29566636-c8s2p" podStartSLOduration=1.393594243 podStartE2EDuration="2.37632062s" podCreationTimestamp="2026-03-20 09:16:00 +0000 UTC" firstStartedPulling="2026-03-20 09:16:01.063006262 +0000 UTC m=+3186.279906617" lastFinishedPulling="2026-03-20 09:16:02.045732669 +0000 UTC m=+3187.262632994" observedRunningTime="2026-03-20 09:16:02.370225923 +0000 UTC m=+3187.587126238" watchObservedRunningTime="2026-03-20 09:16:02.37632062 +0000 UTC m=+3187.593220945" Mar 20 09:16:02 crc kubenswrapper[4903]: I0320 09:16:02.965890 4903 scope.go:117] "RemoveContainer" containerID="8025abeb1d5e5fe5577466b86a2e7817694337715c2c3c53390deaa90e298ff1" Mar 20 09:16:03 crc kubenswrapper[4903]: I0320 09:16:03.366992 4903 generic.go:334] "Generic (PLEG): container finished" podID="b5ff2924-5ddc-410b-acc0-3b70a7ee24ad" containerID="ede8b97b2a5f1758999bfaf168b6a79f4649c9eff36dfe580e8807b3bcf17252" exitCode=0 Mar 20 09:16:03 crc kubenswrapper[4903]: I0320 09:16:03.367100 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566636-c8s2p" event={"ID":"b5ff2924-5ddc-410b-acc0-3b70a7ee24ad","Type":"ContainerDied","Data":"ede8b97b2a5f1758999bfaf168b6a79f4649c9eff36dfe580e8807b3bcf17252"} Mar 20 09:16:04 crc kubenswrapper[4903]: I0320 09:16:04.682390 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566636-c8s2p" Mar 20 09:16:04 crc kubenswrapper[4903]: I0320 09:16:04.814644 4903 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fxxjm\" (UniqueName: \"kubernetes.io/projected/b5ff2924-5ddc-410b-acc0-3b70a7ee24ad-kube-api-access-fxxjm\") pod \"b5ff2924-5ddc-410b-acc0-3b70a7ee24ad\" (UID: \"b5ff2924-5ddc-410b-acc0-3b70a7ee24ad\") " Mar 20 09:16:04 crc kubenswrapper[4903]: I0320 09:16:04.819819 4903 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5ff2924-5ddc-410b-acc0-3b70a7ee24ad-kube-api-access-fxxjm" (OuterVolumeSpecName: "kube-api-access-fxxjm") pod "b5ff2924-5ddc-410b-acc0-3b70a7ee24ad" (UID: "b5ff2924-5ddc-410b-acc0-3b70a7ee24ad"). InnerVolumeSpecName "kube-api-access-fxxjm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:16:04 crc kubenswrapper[4903]: I0320 09:16:04.917254 4903 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fxxjm\" (UniqueName: \"kubernetes.io/projected/b5ff2924-5ddc-410b-acc0-3b70a7ee24ad-kube-api-access-fxxjm\") on node \"crc\" DevicePath \"\"" Mar 20 09:16:05 crc kubenswrapper[4903]: I0320 09:16:05.387303 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566636-c8s2p" event={"ID":"b5ff2924-5ddc-410b-acc0-3b70a7ee24ad","Type":"ContainerDied","Data":"78987214ba70ce29158487aab312489472078764712f8f630a03ffc9cd10baee"} Mar 20 09:16:05 crc kubenswrapper[4903]: I0320 09:16:05.387932 4903 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="78987214ba70ce29158487aab312489472078764712f8f630a03ffc9cd10baee" Mar 20 09:16:05 crc kubenswrapper[4903]: I0320 09:16:05.387402 4903 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566636-c8s2p" Mar 20 09:16:05 crc kubenswrapper[4903]: I0320 09:16:05.434314 4903 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566630-7glnd"] Mar 20 09:16:05 crc kubenswrapper[4903]: I0320 09:16:05.440166 4903 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566630-7glnd"] Mar 20 09:16:05 crc kubenswrapper[4903]: I0320 09:16:05.501085 4903 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="72102004-4421-42c4-8325-cb2e927d45fd" path="/var/lib/kubelet/pods/72102004-4421-42c4-8325-cb2e927d45fd/volumes" Mar 20 09:16:20 crc kubenswrapper[4903]: I0320 09:16:20.833309 4903 patch_prober.go:28] interesting pod/machine-config-daemon-2ndsj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 09:16:20 crc kubenswrapper[4903]: I0320 09:16:20.834239 4903 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2ndsj" podUID="0e67af70-4211-4077-8f2b-0a00b8069e5a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 09:16:50 crc kubenswrapper[4903]: I0320 09:16:50.834393 4903 patch_prober.go:28] interesting pod/machine-config-daemon-2ndsj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 09:16:50 crc kubenswrapper[4903]: I0320 09:16:50.835304 4903 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2ndsj" podUID="0e67af70-4211-4077-8f2b-0a00b8069e5a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 09:16:50 crc kubenswrapper[4903]: I0320 09:16:50.835399 4903 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2ndsj" Mar 20 09:16:50 crc kubenswrapper[4903]: I0320 09:16:50.836535 4903 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"545e88d8f513e78763e5a71448fe99f0796ece0e27da70384e801502f7a18375"} pod="openshift-machine-config-operator/machine-config-daemon-2ndsj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 09:16:50 crc kubenswrapper[4903]: I0320 09:16:50.836646 4903 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2ndsj" podUID="0e67af70-4211-4077-8f2b-0a00b8069e5a" containerName="machine-config-daemon" containerID="cri-o://545e88d8f513e78763e5a71448fe99f0796ece0e27da70384e801502f7a18375" gracePeriod=600 Mar 20 09:16:50 crc kubenswrapper[4903]: E0320 09:16:50.965303 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2ndsj_openshift-machine-config-operator(0e67af70-4211-4077-8f2b-0a00b8069e5a)\"" pod="openshift-machine-config-operator/machine-config-daemon-2ndsj" podUID="0e67af70-4211-4077-8f2b-0a00b8069e5a" Mar 20 09:16:51 crc kubenswrapper[4903]: I0320 09:16:51.874170 4903 generic.go:334] "Generic (PLEG): container finished" podID="0e67af70-4211-4077-8f2b-0a00b8069e5a" containerID="545e88d8f513e78763e5a71448fe99f0796ece0e27da70384e801502f7a18375" exitCode=0 Mar 20 09:16:51 crc kubenswrapper[4903]: I0320 09:16:51.874222 4903 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2ndsj" event={"ID":"0e67af70-4211-4077-8f2b-0a00b8069e5a","Type":"ContainerDied","Data":"545e88d8f513e78763e5a71448fe99f0796ece0e27da70384e801502f7a18375"} Mar 20 09:16:51 crc kubenswrapper[4903]: I0320 09:16:51.874256 4903 scope.go:117] "RemoveContainer" containerID="ab88589d91d0147a568e8b91cd68b845b771b73e7c864440e34828f1fbe98ceb" Mar 20 09:16:51 crc kubenswrapper[4903]: I0320 09:16:51.874738 4903 scope.go:117] "RemoveContainer" containerID="545e88d8f513e78763e5a71448fe99f0796ece0e27da70384e801502f7a18375" Mar 20 09:16:51 crc kubenswrapper[4903]: E0320 09:16:51.874930 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2ndsj_openshift-machine-config-operator(0e67af70-4211-4077-8f2b-0a00b8069e5a)\"" pod="openshift-machine-config-operator/machine-config-daemon-2ndsj" podUID="0e67af70-4211-4077-8f2b-0a00b8069e5a" Mar 20 09:17:03 crc kubenswrapper[4903]: I0320 09:17:03.047184 4903 scope.go:117] "RemoveContainer" containerID="42b0a222e747a5287375a9fcf148b31d4b25740f3bb3ece1812c269d07e24ac6" Mar 20 09:17:04 crc kubenswrapper[4903]: I0320 09:17:04.491878 4903 scope.go:117] "RemoveContainer" containerID="545e88d8f513e78763e5a71448fe99f0796ece0e27da70384e801502f7a18375" Mar 20 09:17:04 crc kubenswrapper[4903]: E0320 09:17:04.492803 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2ndsj_openshift-machine-config-operator(0e67af70-4211-4077-8f2b-0a00b8069e5a)\"" pod="openshift-machine-config-operator/machine-config-daemon-2ndsj" podUID="0e67af70-4211-4077-8f2b-0a00b8069e5a" Mar 20 09:17:15 crc kubenswrapper[4903]: I0320 09:17:15.502609 4903 scope.go:117] "RemoveContainer" 
containerID="545e88d8f513e78763e5a71448fe99f0796ece0e27da70384e801502f7a18375" Mar 20 09:17:15 crc kubenswrapper[4903]: E0320 09:17:15.503963 4903 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2ndsj_openshift-machine-config-operator(0e67af70-4211-4077-8f2b-0a00b8069e5a)\"" pod="openshift-machine-config-operator/machine-config-daemon-2ndsj" podUID="0e67af70-4211-4077-8f2b-0a00b8069e5a"